WorldWideScience

Sample records for accurate svm-based gene

  1. A New SVM-Based Modeling Method of Cabin Path Loss Prediction

    Xiaonan Zhao; Chunping Hou; Qing Wang

    2013-01-01

    A new modeling method of cabin path loss prediction based on support vector machine (SVM) is proposed in this paper. The method is trained with the path loss values of measured points inside the cabin and can be used to predict the path loss values of the unmeasured points. The experimental results demonstrate that our modeling method is more accurate than the curve fitting method. This SVM-based path loss prediction method makes the prediction much easier and more accurate, which covers perf...

  2. SVM-based glioma grading. Optimization by feature reduction analysis

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. The best classification accuracy was found using PCA at 85% (sensitivity = 89%, specificity = 84%) when reducing the feature vector from 101 (100-bin rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided classification accuracy similar to literature values (≈87%) while reducing the number of features by up to 98%. (orig.)
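
The PCA-then-SVM pipeline described above can be sketched with scikit-learn on synthetic data; the 101-feature vectors stand in for the study's 100-bin rCBV histograms plus age, and every parameter choice below is illustrative rather than taken from the paper:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for 101 patients with 101 features each
# (100-bin histogram + age in the original study).
X, y = make_classification(n_samples=101, n_features=101, n_informative=5,
                           random_state=0)

# Reduce 101 features to 3 principal components, then classify with an SVM.
model = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
scores = cross_val_score(model, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

The pipeline object guarantees that scaling and PCA are fit only on each training fold, avoiding information leakage into the cross-validation estimate.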

  3. SVM based layout retargeting for fast and regularized inverse lithography

    Kai-sheng LUO; Zheng SHI; Xiao-lang YAN; Zhen GENG

    2014-01-01

    Inverse lithography technology (ILT), also known as pixel-based optical proximity correction (PB-OPC), has shown promising capability in pushing the current 193 nm lithography to its limit. By treating the mask optimization process as an inverse problem in lithography, ILT provides a more complete exploration of the solution space and better pattern fidelity than the traditional edge-based OPC. However, the existing methods of ILT are extremely time-consuming due to the slow convergence of the optimization process. To address this issue, in this paper we propose a support vector machine (SVM) based layout retargeting method for ILT, which is designed to generate a good initial input mask for the optimization process and promote the convergence speed. Supervised by optimized masks of training layouts generated by conventional ILT, SVM models are learned and used to predict the initial pixel values in the 'undefined areas' of the new layout. By this process, an initial input mask close to the final optimized mask of the new layout is generated, which reduces the iterations needed in the following optimization process. Manufacturability is another critical issue in ILT; however, the mask generated by our layout retargeting method is quite irregular due to the prediction inaccuracy of the SVM models. To compensate for this drawback, a spatial filter is employed to regularize the retargeted mask for complexity reduction. We implemented our layout retargeting method with a regularized level-set based ILT (LSB-ILT) algorithm under partially coherent illumination conditions. Experimental results show that with an initial input mask generated by our layout retargeting method, the number of iterations needed in the optimization process and the runtime of the whole process in ILT are reduced by 70.8% and 69.0%, respectively.

  4. Combinatorial Approaches to Accurate Identification of Orthologous Genes

    Shi, Guanqun

    2011-01-01

    The accurate identification of orthologous genes across different species is a critical and challenging problem in comparative genomics and has a wide spectrum of biological applications including gene function inference, evolutionary studies and systems biology. During the past several years, many methods have been proposed for ortholog assignment based on sequence similarity, phylogenetic approaches, synteny information, and genome rearrangement. Although these methods share many commonly a...

  5. PredSTP: a highly accurate SVM based model to predict sequential cystine stabilized peptides

    Islam, S. M. Ashiqul; Sajed, Tanvir; Kearney, Christopher Michel; Baker, Erich J.

    2015-01-01

    Background Numerous organisms have evolved a wide range of toxic peptides for self-defense and predation. Their effective interstitial and macro-environmental use requires energetic and structural stability. One successful group of these peptides includes a tri-disulfide domain arrangement that offers toxicity and high stability. Sequential tri-disulfide connectivity variants create highly compact disulfide folds capable of withstanding a variety of environmental stresses. Their combination o...

  6. svmPRAT: SVM-based Protein Residue Annotation Toolkit

    Rangwala, Huzefa; Kauffman, Christopher; Karypis, George

    2009-01-01

    Background Over the last decade several prediction methods have been developed for determining the structural and functional properties of individual protein residues using sequence and sequence-derived information. Most of these methods are based on support vector machines as they provide accurate and generalizable prediction models. Results We present a general purpose protein residue annotation toolkit (svmPRAT) to allow biologists to formulate residue-wise prediction problems. svmPRAT f...

  7. Advances in SVM-Based System Using GMM Super Vectors for Text-Independent Speaker Verification

    ZHAO Jian; DONG Yuan; ZHAO Xianyu; YANG Hao; LU Liang; WANG Haila

    2008-01-01

    For text-independent speaker verification, the Gaussian mixture model (GMM) using a universal background model strategy and the GMM using support vector machines are the two most commonly used methodologies. Recently, a new SVM-based speaker verification method using GMM super vectors has been proposed. This paper describes the construction of a new speaker verification system and investigates the use of nuisance attribute projection and test normalization to further enhance performance. Experiments were conducted on the core test of the 2006 NIST speaker recognition evaluation corpus. The experimental results indicate that an SVM-based speaker verification system using GMM super vectors can achieve appealing performance. With the use of nuisance attribute projection and test normalization, the system performance can be significantly improved, with improvements in the equal error rate from 7.78% to 4.92% and detection cost function from 0.0376 to 0.0251.
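
A minimal sketch of the GMM super vector idea, using scikit-learn's GaussianMixture as the universal background model and a simplified mean-only adaptation: per-utterance posterior statistics pull the UBM means toward each utterance, and the stacked adapted means form the super vector fed to an SVM. The relevance factor, feature dimensions, and synthetic "speakers" are illustrative assumptions, not the paper's setup:

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def supervector(frames, ubm, tau=10.0):
    # Simplified mean-only MAP adaptation: shift each UBM mean toward the
    # posterior-weighted mean of the utterance frames (tau = relevance
    # factor, an assumed value).
    post = ubm.predict_proba(frames)        # (n_frames, n_components)
    counts = post.sum(axis=0)[:, None]      # soft frame counts per component
    first_order = post.T @ frames           # first-order statistics
    adapted = (first_order + tau * ubm.means_) / (counts + tau)
    return adapted.ravel()                  # stacked means = super vector

# Train a UBM on pooled background "frames" (synthetic 2-D features here).
background = rng.normal(size=(500, 2))
ubm = GaussianMixture(n_components=4, random_state=0).fit(background)

# Build super vectors for two synthetic "speakers" and train an SVM on them.
X = np.array([supervector(rng.normal(loc=m, size=(50, 2)), ubm)
              for m in (0.0, 2.0) for _ in range(10)])
y = np.array([0] * 10 + [1] * 10)
clf = SVC(kernel="linear").fit(X, y)
```

Nuisance attribute projection and test normalization, which give the reported gains, are separate post-processing steps not shown here.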

  8. Diagnosis of Elevator Faults with LS-SVM Based on Optimization by K-CV

    Zhou Wan

    2015-01-01

    Full Text Available Several common elevator malfunctions were diagnosed with a least squares support vector machine (LS-SVM). After acquiring vibration signals of various elevator malfunctions, their energy characteristics and time domain indicators were extracted by theoretically analyzing the optimal wavelet packet, in order to construct a feature vector of malfunctions, used as the input of the LS-SVM for identifying the causes of the malfunctions. Meanwhile, the parameters of the LS-SVM were optimized by K-fold cross validation (K-CV). After diagnosing a deviated elevator guide rail, a deviated guide shoe, abnormal running of the tractor, an erroneous rope groove of the traction sheave, a deviated guide wheel, and tension of the wire rope, the results suggested that the LS-SVM based on K-CV optimization is an effective method for diagnosing elevator malfunctions.
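
K-fold cross validation for SVM parameter selection, as used here for the LS-SVM, can be sketched with scikit-learn's standard SVC as a stand-in for an LS-SVM; the grid values and synthetic fault data are assumptions:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for fault feature vectors (energy + time-domain indicators).
X, y = make_classification(n_samples=120, n_features=8, random_state=0)

# K-fold cross validation (K=5) over a small (C, gamma) grid: each candidate
# parameter pair is scored by its mean accuracy across the 5 folds.
grid = GridSearchCV(SVC(kernel="rbf"),
                    param_grid={"C": [0.1, 1, 10], "gamma": [0.01, 0.1, 1]},
                    cv=5)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 2))
```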

  9. Accurate and unambiguous tag-to-gene mapping in serial analysis of gene expression

    Melo Francisco

    2006-11-01

    Full Text Available Abstract Background In this study, we present a robust and reliable computational method for tag-to-gene assignment in serial analysis of gene expression (SAGE). The method relies on current genome information and annotation, incorporation of several new features, and key improvements over alternative methods, all of which are important to determine gene expression levels more accurately. The method provides a complete annotation of potential virtual SAGE tags within a genome, along with an estimation of their confidence for experimental observation that ranks tags that present multiple matches in the genome. Results We applied this method to the Saccharomyces cerevisiae genome, producing the most thorough and accurate annotation of potential virtual SAGE tags that is available today for this organism. The usefulness of this method is exemplified by the significant reduction of ambiguous cases in existing experimental SAGE data. In addition, we report new insights from the analysis of existing SAGE data. First, we found that experimental SAGE tags mapping onto introns, intron-exon boundaries, and non-coding RNA elements are observed in all available SAGE data. Second, a significant fraction of experimental SAGE tags was found to map onto genomic regions currently annotated as intergenic. Third, a significant number of existing experimental SAGE tags for yeast have been derived from truncated cDNAs, which are synthesized through oligo-d(T) priming to internal poly-(A) regions during reverse transcription. Conclusion We conclude that an accurate and unambiguous tag mapping process is essential to increase the quality and the amount of information that can be extracted from SAGE experiments. This is supported by the results obtained here and also by the large impact that the erroneous interpretation of these data could have on downstream applications.
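
The core of virtual tag extraction can be sketched in pure Python: locate each anchoring-enzyme site (NlaIII recognizes CATG in classic SAGE) and take the following bases as the tag; a tag observed at more than one locus is exactly the kind of ambiguous case the method ranks. The toy sequence below is illustrative, not yeast genome data:

```python
from collections import Counter

def virtual_sage_tags(sequence, tag_len=10, anchor="CATG"):
    """Extract virtual SAGE tags: the tag_len bases immediately following
    each occurrence of the anchoring-enzyme recognition site."""
    tags = []
    start = sequence.find(anchor)
    while start != -1:
        tag = sequence[start + len(anchor): start + len(anchor) + tag_len]
        if len(tag) == tag_len:          # skip truncated tags at the 3' end
            tags.append(tag)
        start = sequence.find(anchor, start + 1)
    return tags

genome = "TTCATGACGTACGTACGCCATGACGTACGTACGAACATGTTTTTTTTTTGG"
tags = virtual_sage_tags(genome)
# A tag extracted at more than one genomic locus maps ambiguously.
ambiguous = {t for t, n in Counter(tags).items() if n > 1}
print(tags, ambiguous)
```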

  10. Accurate prediction of secondary metabolite gene clusters in filamentous fungi

    Andersen, Mikael Rørdam; Nielsen, Jakob Blæsbjerg; Klitgaard, Andreas; Petersen, Lene Maj; Zachariasen, Mia; Hansen, Tilde J; Holberg Blicher, Lene; Gotfredsen, Charlotte Held; Larsen, Thomas Ostenfeld; Nielsen, Kristian Fog; Mortensen, Uffe Hasbro

    2013-01-01

    Biosynthetic pathways of secondary metabolites from fungi are currently subject to an intense effort to elucidate the genetic basis for these compounds due to their large potential within pharmaceutics and synthetic biochemistry. The preferred method is methodical gene deletions to identify...... used A. nidulans for our method development and validation due to the wealth of available biochemical data, but the method can be applied to any fungus with a sequenced and assembled genome, thus supporting further secondary metabolite pathway elucidation in the fungal kingdom....

  11. PCA-SVM based CAD System for Focal Liver Lesions using B-Mode Ultrasound Images

    Jitendra Virmani

    2013-09-01

    Full Text Available The contribution made by texture of regions inside and outside of the lesions in classification of focal liver lesions (FLLs) is investigated in the present work. In order to design an efficient computer-aided diagnostic (CAD) system for FLLs, a representative database consisting of images with (1) typical and atypical cases of cyst, hemangioma (HEM) and metastatic carcinoma (MET) lesions, (2) small as well as large hepatocellular carcinoma (HCC) lesions and (3) normal (NOR) liver tissue is used. Texture features are computed from regions inside and outside of the lesions. A feature set consisting of 208 texture features (i.e., 104 texture features and 104 texture ratio features) is subjected to principal component analysis (PCA) for finding the optimal number of principal components to train a support vector machine (SVM) classifier for the classification task. The proposed PCA-SVM based CAD system yielded classification accuracy of 87.2% with individual class accuracy of 85%, 96%, 90%, 87.5% and 82.2% for NOR, Cyst, HEM, HCC and MET cases, respectively. The accuracy for typical, atypical, small HCC and large HCC cases is 87.5%, 86.8%, 88.8%, and 87%, respectively. The promising results indicate usefulness of the CAD system for assisting radiologists in diagnosis of FLLs. Defence Science Journal, 2013, 63(5), pp. 478-486, DOI: http://dx.doi.org/10.14429/dsj.63.3951

  12. PCA-SVM based CAD System for Focal Liver Lesions using B-mode Ultrasound Images

    Jitendra Virmani

    2013-09-01

    Full Text Available The contribution made by texture of regions inside and outside of the lesions in classification of focal liver lesions (FLLs) is investigated in the present work. In order to design an efficient computer-aided diagnostic (CAD) system for FLLs, a representative database consisting of images with (1) typical and atypical cases of cyst, hemangioma (HEM) and metastatic carcinoma (MET) lesions, (2) small as well as large hepatocellular carcinoma (HCC) lesions and (3) normal (NOR) liver tissue is used. Texture features are computed from regions inside and outside of the lesions. A feature set consisting of 208 texture features (i.e., 104 texture features and 104 texture ratio features) is subjected to principal component analysis (PCA) for finding the optimal number of principal components to train a support vector machine (SVM) classifier for the classification task. The proposed PCA-SVM based CAD system yielded classification accuracy of 87.2% with individual class accuracy of 85%, 96%, 90%, 87.5% and 82.2% for NOR, Cyst, HEM, HCC and MET cases, respectively. The accuracy for typical, atypical, small HCC and large HCC cases is 87.5%, 86.8%, 88.8%, and 87%, respectively. The promising results indicate usefulness of the CAD system for assisting radiologists in diagnosis of FLLs.

  13. PSO-SVM-Based Online Locomotion Mode Identification for Rehabilitation Robotic Exoskeletons.

    Long, Yi; Du, Zhi-Jiang; Wang, Wei-Dong; Zhao, Guang-Yu; Xu, Guo-Qiang; He, Long; Mao, Xi-Wang; Dong, Wei

    2016-01-01

    Locomotion mode identification is essential for the control of robotic rehabilitation exoskeletons. This paper proposes an online support vector machine (SVM) optimized by particle swarm optimization (PSO) to identify different locomotion modes to realize a smooth and automatic locomotion transition. A PSO algorithm is used to obtain the optimal parameters of the SVM for a better overall performance. Signals measured by the foot pressure sensors integrated in the insoles of wearable shoes and the MEMS-based attitude and heading reference systems (AHRS) attached on the shoes and shanks of leg segments are fused together as the input information of the SVM. Based on the chosen window whose size is 200 ms (with a sampling frequency of 40 Hz), a three-layer wavelet packet analysis (WPA) is used for feature extraction, after which the kernel principal component analysis (kPCA) is utilized to reduce the dimension of the feature set and thus the computation cost of the SVM. Since the signals come from two different types of sensors, normalization is conducted to scale the input into the interval [0, 1]. Five-fold cross validation is adopted to train the classifier, which prevents the classifier from over-fitting. Based on the SVM model obtained offline in MATLAB, an online SVM algorithm is constructed for locomotion mode identification. Experiments are performed for different locomotion modes and the experimental results show the effectiveness of the proposed algorithm with an accuracy of 96.00% ± 2.45%. To improve its accuracy, a majority vote algorithm (MVA) is used for post-processing, with which the identification accuracy is better than 98.35% ± 1.65%. The proposed algorithm can be extended and employed in the field of robotic rehabilitation and assistance. PMID:27598160
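
The processing chain described above (scaling into [0, 1], kernel PCA, an SVM classifier, then majority-vote smoothing of the label stream) can be sketched with scikit-learn on synthetic data; the feature counts, window length, and smoothing window below are placeholders, and the WPA feature-extraction step is omitted:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

# Synthetic stand-in for fused pressure + AHRS feature vectors, 3 "modes".
X, y = make_classification(n_samples=200, n_features=24, n_classes=3,
                           n_informative=6, random_state=0)

# Scale to [0, 1], reduce dimensionality with kernel PCA, classify with an SVM.
model = make_pipeline(MinMaxScaler(), KernelPCA(n_components=8), SVC())
model.fit(X, y)
pred = model.predict(X)

def majority_vote(labels, window=5):
    # Post-process a label stream: replace each label with the most common
    # label in a trailing window, smoothing isolated misclassifications.
    out = []
    for i in range(len(labels)):
        lo = max(0, i - window + 1)
        vals, counts = np.unique(labels[lo:i + 1], return_counts=True)
        out.append(vals[np.argmax(counts)])
    return np.array(out)

smoothed = majority_vote(pred)
```

In an online setting the trailing (causal) window matters: the vote may only use labels already produced, which is why the window here looks backward rather than being centered.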

  14. Evaluation of new reference genes in papaya for accurate transcript normalization under different experimental conditions.

    Xiaoyang Zhu

    Full Text Available Real-time reverse transcription PCR (RT-qPCR) is a preferred method for rapid and accurate quantification in gene expression studies. Appropriate application of RT-qPCR requires accurate normalization through the use of reference genes. As no single reference gene is universally suitable for all experiments, validation of reference gene(s) under different experimental conditions is crucial for RT-qPCR analysis. To date, only a few studies on reference genes have been done in other plants, and none in papaya. In the present work, we selected 21 candidate reference genes and evaluated their expression stability in 246 papaya fruit samples using three algorithms, geNorm, NormFinder and RefFinder. The samples consisted of 13 sets collected under different experimental conditions, including various tissues, different storage temperatures, different cultivars, developmental stages, postharvest ripening, modified atmosphere packaging, 1-methylcyclopropene (1-MCP) treatment, hot water treatment, biotic stress and hormone treatment. Our results demonstrated that expression stability varied greatly between reference genes and that suitable reference gene(s), or a combination of reference genes, should be validated for normalization according to the experimental conditions. In general, the internal reference genes EIF (eukaryotic initiation factor 4A), TBP1 (TATA binding protein 1) and TBP2 (TATA binding protein 2) performed well under most experimental conditions, whereas the most widely used reference genes, ACTIN (Actin 2), 18S rRNA (18S ribosomal RNA) and GAPDH (glyceraldehyde-3-phosphate dehydrogenase), were not suitable under many experimental conditions. In addition, two commonly used programs, geNorm and NormFinder, proved sufficient for the validation. This work provides the first systematic analysis for the selection of superior reference genes for accurate transcript normalization in papaya under different experimental
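
A geNorm-style stability measure (the average pairwise variation M that geNorm computes; lower M means a more stable reference gene) can be sketched in NumPy. The expression matrix below is synthetic, and a real analysis would use the published geNorm/NormFinder implementations:

```python
import numpy as np

def genorm_m(expr):
    """geNorm-style stability measure M for each candidate reference gene.

    expr: (n_samples, n_genes) array of relative expression quantities.
    M for gene j is the mean, over all other genes k, of the standard
    deviation of the per-sample log2 expression ratios gene j / gene k.
    """
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m_values = np.empty(n_genes)
    for j in range(n_genes):
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        m_values[j] = ratios.std(axis=0, ddof=1).mean()
    return m_values

rng = np.random.default_rng(0)
base = rng.lognormal(mean=2.0, sigma=0.05, size=(29, 1))  # shared sample effect
stable = base * rng.lognormal(sigma=0.05, size=(29, 2))   # two stable genes
unstable = base * rng.lognormal(sigma=0.8, size=(29, 1))  # one unstable gene
m = genorm_m(np.hstack([stable, unstable]))
print(m)  # the unstable gene should receive the largest M
```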

  15. Fast and Accurate Large-Scale Detection of β-Lactamase Genes Conferring Antibiotic Resistance

    Lee, Jae Jin; Lee, Jung Hun; Kwon, Dae Beom; Jeon, Jeong Ho; Park, Kwang Seung; Lee, Chang-Ro; Lee, Sang Hee

    2015-01-01

    Fast detection of β-lactamase (bla) genes allows improved surveillance studies and infection control measures, which can minimize the spread of antibiotic resistance. Although several molecular diagnostic methods have been developed to detect limited bla gene types, these methods have significant limitations, such as their failure to detect almost all clinically available bla genes. We developed a fast and accurate molecular method to overcome these limitations using 62 primer pairs, which we...

  16. Modeling the milling tool wear by using an evolutionary SVM-based model from milling runs experimental data

    Nieto, Paulino José García; García-Gonzalo, Esperanza; Vilán, José Antonio Vilán; Robleda, Abraham Segade

    2015-12-01

    The main aim of this research work is to build a new practical hybrid regression model to predict the milling tool wear in a regular cut as well as entry cut and exit cut of a milling tool. The model was based on particle swarm optimization (PSO) in combination with support vector machines (SVMs). This optimization mechanism involved kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. Bearing this in mind, a PSO-SVM-based model, which is based on statistical learning theory, was successfully used here to predict the milling tool flank wear (output variable) as a function of the following input variables: the time duration of the experiment, depth of cut, feed, type of material, etc. To accomplish the objective of this study, the experimental dataset represents runs on a milling machine under various operating conditions. In this way, data sampled by three different types of sensors (acoustic emission sensor, vibration sensor and current sensor) were acquired at several positions. A second aim is to determine the factors with the greatest bearing on the milling tool flank wear with a view to proposing improvements to the milling machine. Firstly, this hybrid PSO-SVM-based regression model captures the main insight of statistical learning theory in order to obtain a good prediction of the dependence between the flank wear (output variable) and the input variables (time, depth of cut, feed, etc.). Indeed, regression with optimal hyperparameters was performed and a coefficient of determination of 0.95 was obtained. The agreement of this model with experimental data confirmed its good performance. Secondly, the main advantages of this PSO-SVM-based model are its capacity to produce a simple, easy-to-interpret model, its ability to estimate the contributions of the input variables, and its computational efficiency. Finally, the main conclusions of this study are presented.
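
The PSO-over-SVM-hyperparameters idea can be sketched with a minimal particle swarm searching log10(C) and log10(gamma) for an RBF-kernel support vector regressor, scoring each particle by cross-validated R². The swarm size, inertia and acceleration coefficients, search bounds, and synthetic data are illustrative assumptions, not the paper's configuration:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

X, y = make_regression(n_samples=80, n_features=5, noise=5.0, random_state=0)
y = (y - y.mean()) / y.std()          # standardize the target for the SVR
rng = np.random.default_rng(0)

def fitness(p):
    # Cross-validated R^2 of an RBF-kernel SVR; a particle's position
    # encodes (log10 C, log10 gamma).
    C, gamma = 10.0 ** p
    return cross_val_score(SVR(C=C, gamma=gamma), X, y, cv=3).mean()

# Minimal particle swarm over the 2-D search space [-2, 2]^2.
n_particles, n_iter = 8, 10
pos = rng.uniform(-2, 2, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_fit = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_fit.argmax()]
for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    # Inertia plus cognitive (pbest) and social (gbest) attraction terms.
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, -2, 2)
    fit = np.array([fitness(p) for p in pos])
    better = fit > pbest_fit
    pbest[better], pbest_fit[better] = pos[better], fit[better]
    gbest = pbest[pbest_fit.argmax()]
print("best CV R^2:", round(float(pbest_fit.max()), 3))
```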

  17. An SVM-Based Classifier for Estimating the State of Various Rotating Components in Agro-Industrial Machinery with a Vibration Signal Acquired from a Single Point on the Machine Chassis

    Ruben Ruiz-Gonzalez

    2014-11-01

    Full Text Available The goal of this article is to assess the feasibility of estimating the state of various rotating components in agro-industrial machinery by employing just one vibration signal acquired from a single point on the machine chassis. To do so, a support vector machine (SVM)-based system is employed. Experimental tests evaluated this system by acquiring vibration data from a single point of an agricultural harvester, while varying several of its working conditions. The whole process included two major steps. Initially, the vibration data were preprocessed through twelve feature extraction algorithms, after which the Exhaustive Search method selected the most suitable features. Secondly, the SVM-based system accuracy was evaluated by using Leave-One-Out cross-validation, with the selected features as the input data. The results of this study provide evidence that (i) accurate estimation of the status of various rotating components in agro-industrial machinery is possible by processing the vibration signal acquired from a single point on the machine structure; (ii) the vibration signal can be acquired with a uniaxial accelerometer, the orientation of which does not significantly affect the classification accuracy; and, (iii) when using an SVM classifier, an 85% mean cross-validation accuracy can be reached, which only requires a maximum of seven features as its input, and no significant improvements are noted between the use of either nonlinear or linear kernels.

  18. An SVM-based classifier for estimating the state of various rotating components in agro-industrial machinery with a vibration signal acquired from a single point on the machine chassis.

    Ruiz-Gonzalez, Ruben; Gomez-Gil, Jaime; Gomez-Gil, Francisco Javier; Martínez-Martínez, Víctor

    2014-01-01

    The goal of this article is to assess the feasibility of estimating the state of various rotating components in agro-industrial machinery by employing just one vibration signal acquired from a single point on the machine chassis. To do so, a Support Vector Machine (SVM)-based system is employed. Experimental tests evaluated this system by acquiring vibration data from a single point of an agricultural harvester, while varying several of its working conditions. The whole process included two major steps. Initially, the vibration data were preprocessed through twelve feature extraction algorithms, after which the Exhaustive Search method selected the most suitable features. Secondly, the SVM-based system accuracy was evaluated by using Leave-One-Out cross-validation, with the selected features as the input data. The results of this study provide evidence that (i) accurate estimation of the status of various rotating components in agro-industrial machinery is possible by processing the vibration signal acquired from a single point on the machine structure; (ii) the vibration signal can be acquired with a uniaxial accelerometer, the orientation of which does not significantly affect the classification accuracy; and, (iii) when using an SVM classifier, an 85% mean cross-validation accuracy can be reached, which only requires a maximum of seven features as its input, and no significant improvements are noted between the use of either nonlinear or linear kernels. PMID:25372618
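
Exhaustive feature-subset search scored by Leave-One-Out cross-validation, as described in the record above, can be sketched with scikit-learn; here only pairs of features are searched over synthetic data (the real study selected up to seven features drawn from twelve extraction algorithms):

```python
from itertools import combinations

from sklearn.datasets import make_classification
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.svm import SVC

# Synthetic stand-in for vibration-derived feature vectors.
X, y = make_classification(n_samples=40, n_features=6, n_informative=2,
                           random_state=0)

# Exhaustive search: score every feature pair with Leave-One-Out CV
# and keep the subset with the highest mean accuracy.
best_subset, best_acc = None, 0.0
for subset in combinations(range(X.shape[1]), 2):
    acc = cross_val_score(SVC(kernel="linear"), X[:, subset], y,
                          cv=LeaveOneOut()).mean()
    if acc > best_acc:
        best_subset, best_acc = subset, acc
print(best_subset, round(best_acc, 2))
```

Exhaustive search is only tractable for small feature pools; its cost grows combinatorially, which is why the study caps the subset size.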

  19. An accurate DNA marker assay for stem rust resistance gene Sr2 in wheat

    The stem rust resistance gene Sr2 has provided broad-spectrum protection against stem rust (Puccinia graminis) since its widespread deployment in wheat from the 1940s. Because Sr2 confers partial resistance which is difficult to select under field conditions, a DNA marker is desirable that accurate...

  20. Hybrid PSO–SVM-based method for forecasting of the remaining useful life for aircraft engines and evaluation of its reliability

    The present paper describes a hybrid PSO–SVM-based model for the prediction of the remaining useful life of aircraft engines. The proposed hybrid model combines support vector machines (SVMs), which have been successfully adopted for regression problems, with the particle swarm optimization (PSO) technique. This optimization technique involves kernel parameter setting in the SVM training procedure, which significantly influences the regression accuracy. However, its use in reliability applications has not yet been widely explored. Bearing this in mind, remaining useful life values have been successfully predicted here by using the hybrid PSO–SVM-based model from the remaining measured parameters (input variables) for aircraft engines. A coefficient of determination equal to 0.9034 was obtained when this hybrid PSO–RBF–SVM-based model was applied to experimental data. The agreement of this model with experimental data confirmed its good performance. One of the main advantages of this predictive model is that it does not require information about the previous operation states of the engine. Finally, the main conclusions of this study are presented. - Highlights: • A hybrid PSO–SVM-based model is built as a predictive model of the RUL values for aircraft engines. • The remaining physical–chemical variables in this process are studied in depth. • The obtained regression accuracy of our method is about 95%. • The results show that the PSO–SVM-based model can assist in the diagnosis of the RUL values with accuracy

  1. Coding exon-structure aware realigner (CESAR) utilizes genome alignments for accurate comparative gene annotation.

    Sharma, Virag; Elghafari, Anas; Hiller, Michael

    2016-06-20

    Identifying coding genes is an essential step in genome annotation. Here, we utilize existing whole genome alignments to detect conserved coding exons and then map gene annotations from one genome to many aligned genomes. We show that genome alignments contain thousands of spurious frameshifts and splice site mutations in exons that are truly conserved. To overcome these limitations, we have developed CESAR (Coding Exon-Structure Aware Realigner), which realigns coding exons while considering the reading frame and splice sites of each exon. CESAR effectively avoids spurious frameshifts in conserved genes and detects 91% of shifted splice sites. This results in the identification of thousands of additional conserved exons; 99% of the exons that lack inactivating mutations match real exons. Finally, to demonstrate the potential of using CESAR for comparative gene annotation, we applied it to 188 788 exons of 19 865 human genes to annotate human genes in 99 other vertebrates. These comparative gene annotations are available as a resource (http://bds.mpi-cbg.de/hillerlab/CESAR/). CESAR (https://github.com/hillerlab/CESAR/) can readily be applied to other alignments to accurately annotate coding genes in many other vertebrate and invertebrate genomes. PMID:27016733

  2. In-Vivo Imaging of Cell Migration Using Contrast Enhanced MRI and SVM Based Post-Processing.

    Christian Weis

    Full Text Available The migration of cells within a living organism can be observed with magnetic resonance imaging (MRI) in combination with iron oxide nanoparticles as an intracellular contrast agent. This method, however, suffers from low sensitivity and specificity. Here, we developed a quantitative non-invasive in-vivo cell localization method using contrast enhanced multiparametric MRI and support vector machine (SVM)-based post-processing. Imaging phantoms consisting of agarose with compartments containing different concentrations of cancer cells labeled with iron oxide nanoparticles were used to train and evaluate the SVM for cell localization. From the magnitude and phase data acquired with a series of T2*-weighted gradient-echo scans at different echo-times, we extracted features that are characteristic for the presence of superparamagnetic nanoparticles, in particular hyper- and hypointensities, relaxation rates, short-range phase perturbations, and perturbation dynamics. High detection quality was achieved by SVM analysis of the multiparametric feature-space. The in-vivo applicability was validated in animal studies. The SVM detected the presence of iron oxide nanoparticles in the imaging phantoms with high specificity and sensitivity, with a detection limit of 30 labeled cells per mm3, corresponding to 19 μM of iron oxide. As proof-of-concept, we applied the method to follow the migration of labeled cancer cells injected in rats. The combination of iron oxide labeled cells, multiparametric MRI and SVM-based post-processing provides high spatial resolution, specificity, and sensitivity, and is therefore suitable for non-invasive in-vivo cell detection and cell migration studies over prolonged time periods.

  3. Validation of Reference Genes for Accurate Normalization of Gene Expression in Lilium davidii var. unicolor for Real Time Quantitative PCR.

    XueYan Li

    Full Text Available Lilium is an important commercial flower bulb. qRT-PCR is an extremely important technique for tracking gene expression levels. The requirement of suitable reference genes for normalization has become increasingly significant and exigent. The expression of internal control genes in living organisms varies considerably under different experimental conditions. For the economically important Lilium, only a limited number of reference genes applied in qRT-PCR have been reported to date. In this study, the expression stability of 12 candidate genes including α-TUB, β-TUB, ACT, eIF, GAPDH, UBQ, UBC, 18S, 60S, AP4, FP, and RH2, in a diverse set of 29 samples representing different developmental processes, three stress treatments (cold, heat, and salt) and different organs, has been evaluated. For different organs, the combination of ACT, GAPDH, and UBQ is appropriate, whereas ACT together with AP4, or ACT along with GAPDH, is suitable for normalization of leaves and scales at different developmental stages, respectively. In leaves, scales and roots under stress treatments, FP, ACT and AP4, respectively, showed the most stable expression. This study provides a guide for the selection of a reference gene under different experimental conditions, and will benefit future research on more accurate gene expression studies in a wide variety of Lilium genotypes.

  4. svmPRAT: SVM-based Protein Residue Annotation Toolkit

    Kauffman Christopher; Rangwala Huzefa; Karypis George

    2009-01-01

    Abstract Background Over the last decade several prediction methods have been developed for determining the structural and functional properties of individual protein residues using sequence and sequence-derived information. Most of these methods are based on support vector machines as they provide accurate and generalizable prediction models. Results We present a general purpose protein residue annotation toolkit (svmPRAT) to allow biologists to formulate residue-wise prediction problems. sv...

  5. Gene expression signatures of radiation response are specific, durable and accurate in mice and humans.

    Sarah K Meadows

    Full Text Available BACKGROUND: Previous work has demonstrated the potential of peripheral blood (PB) gene expression profiling for the detection of disease or environmental exposures. METHODS AND FINDINGS: We have sought to determine the impact of several variables on the PB gene expression profile of an environmental exposure, ionizing radiation, and to determine the specificity of the PB signature of radiation versus other genotoxic stresses. Neither genotype differences nor the time of PB sampling caused any lessening of the accuracy of PB signatures to predict radiation exposure, but sex differences did influence the accuracy of the prediction of radiation exposure at the lowest level (50 cGy). A PB signature of sepsis was also generated, and both the PB signature of radiation and the PB signature of sepsis were found to be 100% specific at distinguishing irradiated from septic animals. We also identified human PB signatures of radiation exposure and chemotherapy treatment which distinguished irradiated patients and chemotherapy-treated individuals within a heterogeneous population with accuracies of 90% and 81%, respectively. CONCLUSIONS: We conclude that PB gene expression profiles can be identified in mice and humans that are accurate in predicting medical conditions, are specific to each condition and remain highly accurate over time.

  6. A PSO-SVM-based 24 Hours Power Load Forecasting Model

    Yu Xiaoxu

    2015-01-01

    Full Text Available In order to overcome the drawbacks of over-fitting and easily getting stuck in local extremes of the back propagation (BP) neural network, a new power load forecasting model combining wavelet transform and PSO-SVM (Particle Swarm Optimization-Support Vector Machine) is proposed. By employing wavelet transform, the authors decompose the time sequences of power load into high-frequency and low-frequency parts: the low-frequency part is forecast with this model and the high-frequency part with a weighted average method. With PSO, a heuristic bionic optimization algorithm, the authors figure out the preferable parameters of the SVM, and the model proposed in this paper is shown to forecast the 24 h power load more accurately than the BP model.
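
    The PSO step in the abstract above tunes SVM hyperparameters by minimizing a forecast-error objective. A minimal pure-Python sketch of the particle swarm update rules; the quadratic `err` surface below is a hypothetical stand-in for a real cross-validation error over (log C, log γ), not the paper's actual objective:

    ```python
    import random

    def pso(objective, bounds, n_particles=20, n_iters=100,
            w=0.7, c1=1.5, c2=1.5, seed=42):
        """Minimize `objective` over a box with a basic particle swarm."""
        rng = random.Random(seed)
        dim = len(bounds)
        pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]                 # personal best positions
        pbest_val = [objective(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_val[i])
        gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
        for _ in range(n_iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    # inertia + cognitive pull + social pull
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                        bounds[d][0]), bounds[d][1])
                val = objective(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i][:], val
                    if val < gbest_val:
                        gbest, gbest_val = pos[i][:], val
        return gbest, gbest_val

    # Hypothetical stand-in for a cross-validation error surface over (log C, log gamma):
    err = lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2 + 0.1
    best, best_err = pso(err, bounds=[(-5, 5), (-5, 5)])
    ```

    On this smooth surface the swarm settles near the minimum at (1, -2); a real run would call an SVM training/validation routine inside `objective` instead.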

  7. A Statistical Parameter Analysis and SVM Based Fault Diagnosis Strategy for Dynamically Tuned Gyroscopes

    2007-01-01

    Gyro fault diagnosis plays a critical role in inertia navigation systems for higher reliability and precision. A new fault diagnosis strategy based on statistical parameter analysis (SPA) and a support vector machine (SVM) classification model was proposed for dynamically tuned gyroscopes (DTG). SPA, a kind of time domain analysis approach, was introduced to compute a set of statistical parameters of the vibration signal as the state features of the DTG, with which the SVM model, a novel learning machine based on statistical learning theory (SLT), was applied and constructed to train and identify the working state of the DTG. The experimental results verify that the proposed diagnostic strategy can simply and effectively extract the state features of the DTG, and that it outperforms the radial-basis function (RBF) neural network based diagnostic method and can more reliably and accurately diagnose the working state of the DTG.
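
    The statistical parameter analysis stage reduces a raw vibration signal to a handful of time-domain statistics that feed the SVM. A sketch of typical features, assuming standard definitions (the abstract does not list the exact parameter set used):

    ```python
    import math

    def time_domain_features(signal):
        """Statistical parameters commonly extracted from a vibration signal."""
        n = len(signal)
        mean = sum(signal) / n
        centered = [x - mean for x in signal]
        var = sum(c * c for c in centered) / n
        std = math.sqrt(var)
        rms = math.sqrt(sum(x * x for x in signal) / n)
        peak = max(abs(x) for x in signal)
        return {
            "mean": mean,
            "std": std,
            "rms": rms,
            "peak": peak,
            "crest_factor": peak / rms,   # impulsiveness of the signal
            "skewness": sum(c ** 3 for c in centered) / (n * std ** 3),
            "kurtosis": sum(c ** 4 for c in centered) / (n * var ** 2),
        }

    # A clean sine has kurtosis ~1.5; impulsive faults push kurtosis and crest factor up.
    sine = [math.sin(2 * math.pi * k / 64) for k in range(640)]
    feats = time_domain_features(sine)
    ```

    Each signal window becomes one fixed-length feature vector, which is exactly the input shape an SVM classifier expects.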

  8. Optimal training dataset composition for SVM-based, age-independent, automated epileptic seizure detection.

    Bogaarts, J G; Gommer, E D; Hilkman, D M W; van Kranen-Mastenbroek, V H J M; Reulen, J P H

    2016-08-01

    -independent seizure detection is possible by training one classifier on EEG data from both neonatal and adult patients. Furthermore, our results indicate that for accurate age-independent seizure detection, it is important that EEG data from each age category are used for classifier training. This is particularly important for neonatal seizure detection. Our results underline the under-appreciated importance of training dataset composition with respect to accurate age-independent seizure detection. PMID:27032931

  9. l2 Multiple Kernel Fuzzy SVM-Based Data Fusion for Improving Peptide Identification.

    Jian, Ling; Xia, Zhonghang; Niu, Xinnan; Liang, Xijun; Samir, Parimal; Link, Andrew J

    2016-01-01

    SEQUEST is a database-searching engine, which calculates the correlation score between an observed spectrum and a theoretical spectrum deduced from protein sequences stored in a flat text file, even though it is not a relational and object-oriented repository. Nevertheless, the SEQUEST score functions fail to discriminate between true and false PSMs accurately. Some approaches, such as PeptideProphet and Percolator, have been proposed to address the task of distinguishing true and false PSMs. However, most of these methods employ time-consuming learning algorithms to validate peptide assignments [1]. In this paper, we propose a fast algorithm for validating peptide identification by incorporating heterogeneous information from SEQUEST scores and peptide digestion knowledge. To automate the peptide identification process and incorporate additional information, we employ l2 multiple kernel learning (MKL) to implement the current peptide identification task. Results on experimental datasets indicate that compared with state-of-the-art methods, i.e., PeptideProphet and Percolator, our data fusing strategy has comparable performance but reduces the running time significantly. PMID:26394437
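
    At the heart of l2-MKL is a weighted combination of base kernel Gram matrices, with the weight vector constrained to unit l2 norm. A toy illustration with made-up 3-sample kernels; the real method learns the weights jointly with the classifier rather than fixing them as here:

    ```python
    import math

    def l2_normalize(weights):
        """Project kernel weights onto the unit l2 sphere (the l2-MKL constraint)."""
        norm = math.sqrt(sum(w * w for w in weights))
        return [w / norm for w in weights]

    def combine_kernels(kernels, weights):
        """Weighted sum of base kernel matrices: K = sum_m w_m * K_m."""
        n = len(kernels[0])
        return [[sum(w * K[i][j] for w, K in zip(weights, kernels))
                 for j in range(n)] for i in range(n)]

    # Two hypothetical base kernels, e.g. one built from SEQUEST scores and one
    # from peptide digestion features (symmetric 3-sample Gram matrices):
    K1 = [[1.0, 0.5, 0.2], [0.5, 1.0, 0.4], [0.2, 0.4, 1.0]]
    K2 = [[1.0, 0.1, 0.7], [0.1, 1.0, 0.3], [0.7, 0.3, 1.0]]
    w = l2_normalize([3.0, 4.0])          # -> [0.6, 0.8]
    K = combine_kernels([K1, K2], w)      # fused kernel fed to the SVM
    ```

    A nonnegative weighted sum of positive semidefinite matrices is itself positive semidefinite, so the fused `K` remains a valid SVM kernel.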

  10. Online Adaptive Error Compensation SVM-Based Sliding Mode Control of an Unmanned Aerial Vehicle

    Kaijia Xue

    2016-01-01

    Full Text Available An Unmanned Aerial Vehicle (UAV) is a nonlinear dynamic system with uncertainties and noises. Therefore, an appropriate control system has an obligation to ensure the stabilization and navigation of the UAV. This paper mainly discusses the control problem of a quad-rotor UAV system, which is influenced by unknown parameters and noises. A sliding mode control based on an online adaptive error compensation support vector machine (SVM) is proposed for stabilizing the quad-rotor UAV system. The sliding mode controller is established through analyzing the quad-rotor dynamics model, in which the unknown parameters are computed by offline SVM. Since modeling errors and noises both exist in the process of flight, the offline SVM one-time mode cannot predict the uncertainties and noises accurately. The control law is therefore adjusted in real time by introducing new training sample data to the online adaptive SVM in the control process, so that the stability and robustness of flight are ensured. Simulation experiments demonstrate that the UAV with online adaptive SVM can track the changing path faster according to its dynamic model. Consequently, the proposed method is proved to have a better control effect in the UAV system.

  11. A Genetic Algorithm Optimized Decision Tree-SVM based Stock Market Trend Prediction System

    Binoy B. Nair

    2010-12-01

    Full Text Available Prediction of stock market trends has been an area of great interest both to researchers attempting to uncover the information hidden in the stock market data and to those who wish to profit by trading stocks. The extremely nonlinear nature of the stock market data makes it very difficult to design a system that can predict the future direction of the stock market with sufficient accuracy. This work presents a data mining based stock market trend prediction system, which produces highly accurate stock market forecasts. The proposed system is a genetic algorithm optimized decision tree-support vector machine (SVM) hybrid, which can predict one-day-ahead trends in stock markets. The uniqueness of the proposed system lies in the use of the hybrid system, which can adapt itself to the changing market conditions, and in the fact that while most of the attempts at stock market trend prediction have approached it as a regression problem, the present study converts the trend prediction task into a classification problem, thus improving the prediction accuracy significantly. Performance of the proposed hybrid system is validated on the historical time series data from the Bombay stock exchange sensitive index (BSE-Sensex). The system performance is then compared to that of an artificial neural network (ANN) based system and a naïve Bayes based system. It is found that the trend prediction accuracy is highest for the hybrid system, and the genetic algorithm optimized decision tree-SVM hybrid system outperforms both the artificial neural network and the naïve Bayes based trend prediction systems.
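
    Recasting trend prediction as classification, as the abstract describes, amounts to mapping the price series to discrete up/down labels before training. A minimal sketch with hypothetical closing values (the paper's actual labeling scheme may differ in details such as how unchanged prices are handled):

    ```python
    def trend_labels(closes):
        """Convert a price series into one-day-ahead trend classes:
        label[t] = +1 if the close rises the next day, -1 otherwise."""
        return [1 if nxt > cur else -1 for cur, nxt in zip(closes, closes[1:])]

    # Hypothetical daily closing values of an index:
    closes = [100.0, 101.5, 101.2, 103.0, 102.5]
    labels = trend_labels(closes)   # one label per day except the last
    ```

    The classifier then only has to separate two classes instead of regressing the exact next-day value, which is the change the authors credit for the accuracy gain.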

  12. LS-SVM Based AGC of an Asynchronous Power System with Dynamic Participation from DFIG Based Wind Turbines

    Gulshan Sharma

    2014-08-01

    Full Text Available Modern power systems are large and interconnected, with growing trends to integrate wind energy into the power system and meet the ever rising energy demand in an economical manner. The penetration of wind energy has motivated power engineers and researchers to investigate the dynamic participation of Doubly Fed Induction Generator (DFIG) based wind turbines in Automatic Generation Control (AGC) services. However, with dynamic participation of DFIG, the AGC problem becomes more complex, and under these conditions classical AGC regulators are not suitable. Therefore, a new non-linear Least Squares Support Vector Machines (LS-SVM) based regulator for solution of the AGC problem is proposed in this study. The proposed AGC regulator is trained for a wide range of operating conditions and load changes using an off-line data set generated from the robust control technique. A two-area power system connected via parallel AC/DC tie-lines with DFIG based wind turbines in each area is considered to demonstrate the effectiveness of the proposed AGC regulator, which is compared with results obtained using Multi-Layer Perceptron (MLP) neural networks and conventional PI regulators under various operating conditions and load changes.

  13. Gene expression anti-profiles as a basis for accurate universal cancer signatures

    Corrada Bravo Héctor

    2012-10-01

    Full Text Available Abstract Background Early screening for cancer is arguably one of the greatest public health advances over the last fifty years. However, many cancer screening tests are invasive (digital rectal exams), expensive (mammograms, imaging) or both (colonoscopies). This has spurred growing interest in developing genomic signatures that can be used for cancer diagnosis and prognosis. However, progress has been slowed by heterogeneity in cancer profiles and the lack of effective computational prediction tools for this type of data. Results We developed anti-profiles as a first step towards translating experimental findings, suggesting that stochastic across-sample hyper-variability in the expression of specific genes is a stable and general property of cancer, into predictive and diagnostic signatures. Using single-chip microarray normalization and quality assessment methods, we developed an anti-profile for colon cancer in tissue biopsy samples. To demonstrate the translational potential of our findings, we applied the signature developed in the tissue samples, without any further retraining or normalization, to screen patients for colon cancer based on genomic measurements from peripheral blood in an independent study (AUC of 0.89). This method achieved higher accuracy than the signature underlying commercially available peripheral blood screening tests for colon cancer (AUC of 0.81). We also confirmed the existence of hyper-variable genes across a range of cancer types and found that a significant proportion of tissue-specific genes are hyper-variable in cancer. Based on these observations, we developed a universal cancer anti-profile that accurately distinguishes cancer from normal regardless of tissue type (ten-fold cross-validation AUC > 0.92). Conclusions We have introduced anti-profiles as a new approach for developing cancer genomic signatures that specifically takes advantage of gene expression heterogeneity. We have demonstrated that anti
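
    An anti-profile scores a sample by how many genes fall outside the normal variability region rather than by mean expression shifts. A simplified sketch using median ± k·MAD bounds per gene; the cutoff rule and `k` here are illustrative assumptions, not the authors' exact procedure:

    ```python
    from statistics import median

    def antiprofile_score(normal_expr, sample_expr, k=5.0):
        """Count how many genes in a sample fall outside the normal variability
        region, defined per gene as median +/- k * MAD over the normal cohort.
        Higher scores indicate cancer-like hyper-variability."""
        score = 0
        for gene, values in normal_expr.items():
            med = median(values)
            mad = median(abs(v - med) for v in values)
            if abs(sample_expr[gene] - med) > k * mad:
                score += 1
        return score

    # Hypothetical log-expression of three genes across five normal samples:
    normals = {
        "geneA": [5.0, 5.1, 4.9, 5.0, 5.2],
        "geneB": [8.0, 8.2, 7.9, 8.1, 8.0],
        "geneC": [2.0, 2.1, 1.9, 2.0, 2.0],
    }
    tumor = {"geneA": 9.0, "geneB": 8.1, "geneC": 0.2}   # A and C deviate strongly
    score = antiprofile_score(normals, tumor)
    ```

    Because the bounds are estimated only from normal samples, the same score can be applied across tissue types, which is what makes the "universal" signature possible.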

  14. Importance of Housekeeping gene selection for accurate RT-qPCR in a wound healing model

    Turabelidze, Anna; Guo, Shujuan; DiPietro, Luisa A.

    2010-01-01

    Studies in the field of wound healing have utilized a variety of different housekeeping genes for RT-qPCR analysis. However, nearly all of these studies assume that the selected normalization gene is stably expressed throughout the course of the repair process. The purpose of our current investigation was to identify the most stable housekeeping genes for studying gene expression in mouse wound healing using RT-qPCR. To identify which housekeeping genes are optimal for studying gene expressio...

  15. A simple and accurate two-step long DNA sequences synthesis strategy to improve heterologous gene expression in pichia.

    Jiang-Ke Yang

    Full Text Available In vitro gene chemical synthesis is a powerful tool to improve the expression of a gene in a heterologous system. In this study, a two-step gene synthesis strategy that combines an assembly PCR and an overlap extension PCR (AOE) was developed. In this strategy, the chemically synthesized oligonucleotides were assembled into several 200-500 bp fragments with 20-25 bp overlap at each end by assembly PCR, and then an overlap extension PCR was conducted to assemble all these fragments into a full length DNA sequence. Using this method, we de novo designed and optimized the codon usage of the Rhizopus oryzae lipase gene ROL (810 bp) and the Aspergillus niger phytase gene phyA (1404 bp). Compared with the original ROL gene and phyA gene, the codon-optimized genes were expressed at a significantly higher level in yeasts after methanol induction. We believe this AOE method to be of special interest as it is simple, accurate and has no limitation with respect to the size of the gene to be synthesized. Combined with de novo design, this method allows the rapid synthesis of a gene optimized for expression in the system of choice and the production of sufficient biological material for molecular characterization and biotechnological application.
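
    The two-step AOE strategy can be mimicked in silico: fragments produced by assembly PCR share terminal overlaps, and the overlap extension step joins them end to end. A sketch of the joining logic with a made-up target sequence (real assembly also involves primer design and error checking, which are omitted):

    ```python
    def merge_pair(a, b, min_overlap=20):
        """Join fragment b onto a when a's 3' end matches b's 5' end."""
        max_k = min(len(a), len(b))
        for k in range(max_k, min_overlap - 1, -1):   # prefer the longest overlap
            if a.endswith(b[:k]):
                return a + b[k:]
        return None

    def assemble(fragments, min_overlap=20):
        """Greedy left-to-right assembly of ordered overlapping fragments."""
        seq = fragments[0]
        for frag in fragments[1:]:
            seq = merge_pair(seq, frag, min_overlap)
            if seq is None:
                raise ValueError("fragments do not overlap sufficiently")
        return seq

    # Demo: a made-up target split into three fragments sharing 22-nt overlaps,
    # mirroring the paper's 20-25 bp overlap design:
    full = ("ATGGCTAAGCTTGGATCCGAATTCACTGGCCGTCGTTTTACAACGTCGTGAC"
            "TGGGAAAACCCTGGCGTTACCCAACTTAATCGCCTTGCAGCACATCCCCCTT")
    frags = [full[:40], full[18:70], full[48:]]
    product = assemble(frags, min_overlap=20)
    ```

    Searching from the longest candidate overlap downward avoids accidental short matches, the in-silico analogue of designing overlaps long enough to anneal specifically.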

  16. Remotely sensed imagery classification by SVM-based Infinite Ensemble Learning method

    杨娜; 秦志远; 张俊

    2013-01-01

    The Support-vector-machines-based Infinite Ensemble Learning method (SVM-based IEL) is a recently developed ensemble learning method in the field of machine learning. In this paper, SVM-based IEL was applied to the classification of remotely sensed imagery alongside SVM and classic ensemble learning methods such as Bagging and AdaBoost, with SVM taken as the base classifier in Bagging and AdaBoost. The experiments showed that the classic ensemble learning methods perform differently from SVM: Bagging was capable of enhancing the classification accuracy of remotely sensed imagery, whereas AdaBoost decreased it. Furthermore, the experiments suggested that, compared to SVM and the classic finite ensemble learning methods, SVM-based IEL has the merit of significantly improving both the classification accuracy and the classification efficiency for remotely sensed imagery.
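
    Bagging, one of the finite ensemble baselines compared above, trains each base learner on a bootstrap resample and combines predictions by majority vote. A toy 1-D sketch in which a trivial threshold stump stands in for the SVM base classifier used in the experiments:

    ```python
    import random

    def bagging_predict(train, test_point, n_models=25, seed=7):
        """Bagging sketch: fit each base learner on a bootstrap resample of the
        training set and combine their predictions by majority vote."""
        rng = random.Random(seed)

        def fit_stump(data):
            # threshold at the midpoint between the two class means
            class0 = [x for x, y in data if y == 0]
            class1 = [x for x, y in data if y == 1]
            return (sum(class0) / len(class0) + sum(class1) / len(class1)) / 2

        votes = 0
        for _ in range(n_models):
            boot = [rng.choice(train) for _ in train]
            # a bootstrap sample may miss a class; resample until both appear
            while len({y for _, y in boot}) < 2:
                boot = [rng.choice(train) for _ in train]
            t = fit_stump(boot)
            votes += 1 if test_point > t else 0
        return 1 if votes > n_models / 2 else 0

    # Hypothetical 1-D training set: class 0 near 1.0, class 1 near 5.0
    train = [(0.8, 0), (1.1, 0), (1.3, 0), (4.7, 1), (5.0, 1), (5.4, 1)]
    pred = bagging_predict(train, 4.5)
    ```

    Averaging over resampled models reduces variance, which is why bagging can raise accuracy while the reweighting in AdaBoost may amplify label noise instead.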

  17. Normalization with genes encoding ribosomal proteins but not GAPDH provides an accurate quantification of gene expressions in neuronal differentiation of PC12 cells

    Lim Qing-En

    2010-01-01

    Full Text Available Abstract Background Gene regulation at the transcript level can provide a good indication of the complex signaling mechanisms underlying physiological and pathological processes. Transcriptomic methods such as microarray and quantitative real-time PCR require stable reference genes for accurate normalization of gene expression. Some but not all studies have shown that the housekeeping genes (HKGs) β-actin (ACTB) and glyceraldehyde-3-phosphate dehydrogenase (GAPDH), which are routinely used for normalization, may vary significantly depending on the cell/tissue type and experimental conditions. It is currently unclear if these genes are stably expressed in cells undergoing drastic morphological changes during neuronal differentiation. Recent meta-analysis of microarray datasets showed that some but not all of the ribosomal protein genes are stably expressed. To test the hypothesis that some ribosomal protein genes can serve as reference genes for neuronal differentiation, a genome-wide analysis was performed and putative reference genes were identified based on stability of expression. The stabilities of these potential reference genes were then analyzed by reverse transcription quantitative real-time PCR in six differentiation conditions. Results Twenty stably expressed genes, including thirteen ribosomal protein genes, were selected from microarray analysis of the gene expression profiles of GDNF- and NGF-induced differentiation of PC12 cells. The expression levels of these candidate genes as well as ACTB and GAPDH were further analyzed by reverse transcription quantitative real-time PCR in PC12 cells differentiated with a variety of stimuli including NGF, GDNF, Forskolin, KCl and the ROCK inhibitor Y27632. The performances of these candidate genes as stable reference genes were evaluated with two independent statistical approaches, geNorm and NormFinder. Conclusions The ribosomal protein genes, RPL19 and RPL29, were identified as suitable reference genes
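
    geNorm ranks candidate reference genes by the stability measure M: the average standard deviation of a gene's log2 expression ratios against every other candidate. A small sketch with hypothetical expression values (real geNorm also iteratively removes the least stable gene, which is omitted here):

    ```python
    import math
    from statistics import stdev

    def genorm_m_values(expr):
        """geNorm stability measure M for each gene: the average standard
        deviation of its log2 expression ratios against every other gene.
        `expr` maps gene name -> linear-scale expression values, one per
        sample. Lower M = more stable."""
        genes = list(expr)
        m = {}
        for g in genes:
            pair_sds = []
            for h in genes:
                if h == g:
                    continue
                ratios = [math.log2(a / b) for a, b in zip(expr[g], expr[h])]
                pair_sds.append(stdev(ratios))
            m[g] = sum(pair_sds) / len(pair_sds)
        return m

    # Hypothetical expression of three candidates across four samples:
    expr = {
        "RPL19": [10.0, 20.0, 15.0, 12.0],
        "RPL29": [ 5.0, 10.0,  7.5,  6.0],   # perfectly proportional to RPL19
        "GAPDH": [10.0, 80.0,  5.0, 40.0],   # wildly varying ratios
    }
    m_values = genorm_m_values(expr)
    ```

    Two genes whose ratio is constant across samples get a pairwise variation of zero, which is why co-stable candidates like the two ribosomal genes rank best.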

  18. Comprehensive and accurate mutation scanning of the CFTR gene by two-dimensional DNA electrophoresis

    Wu, Y; Hofstra, RMW; Scheffer, H; Uitterlinden, AG; Mullaart, E; Buys, CHCM; Vijg, J

    1996-01-01

    The large number of possible disease causing mutations in the 27 exons of the cystic fibrosis transmembrane conductance regulator (CFTR) gene has severely limited direct diagnosis of cystic fibrosis (CF) patients and carriers by mutation detection. Here we show that in principle testing for mutation

  19. Robust TLR4-induced gene expression patterns are not an accurate indicator of human immunity

    Davidson Donald J

    2010-01-01

    Full Text Available Abstract Background Activation of Toll-like receptors (TLRs) is widely accepted as an essential event for defence against infection. Many TLRs utilize a common signalling pathway that relies on activation of the kinase IRAK4 and the transcription factor NFκB for the rapid expression of immunity genes. Methods 21 K DNA microarray technology was used to evaluate LPS-induced (TLR4) gene responses in blood monocytes from a child with an IRAK4-deficiency. In vitro responsiveness to LPS was confirmed by real-time PCR and ELISA and compared to the clinical predisposition of the child and IRAK4-deficient mice to Gram negative infection. Results We demonstrated that the vast majority of LPS-responsive genes in IRAK4-deficient monocytes were greatly suppressed, an observation that is consistent with the described role of IRAK4 as an essential component of TLR4 signalling. The severely impaired response to LPS, however, is inconsistent with the remarkably low incidence of Gram negative infections observed in this child and other children with IRAK4-deficiency. This unpredicted clinical phenotype was validated by demonstrating that IRAK4-deficient mice had a similar resistance to infection with Gram negative S. typhimurium as wildtype mice. A number of immunity genes, such as chemokines, were expressed at normal levels in human IRAK4-deficient monocytes, indicating that particular IRAK4-independent elements within the repertoire of TLR4-induced responses are expressed. Conclusions Sufficient defence against Gram negative infection does not require IRAK4 or a robust, 'classic' inflammatory and immune response.

  20. Validation of reference genes for accurate normalization of gene expression for real time-quantitative PCR in strawberry fruits using different cultivars and osmotic stresses.

    Galli, Vanessa; Borowski, Joyce Moura; Perin, Ellen Cristina; Messias, Rafael da Silva; Labonde, Julia; Pereira, Ivan dos Santos; Silva, Sérgio Delmar Dos Anjos; Rombaldi, Cesar Valmor

    2015-01-10

    The increasing demand for strawberry (Fragaria×ananassa Duch) fruits is associated mainly with their sensorial characteristics and the content of antioxidant compounds. Nevertheless, strawberry production has been hampered due to its sensitivity to abiotic stresses. Therefore, understanding the molecular mechanisms underlying the stress response is of great importance to enable genetic engineering approaches aiming to improve strawberry tolerance. However, the study of the expression of genes in strawberry requires the use of suitable reference genes. In the present study, seven traditional and novel candidate reference genes were evaluated for transcript normalization in fruits of ten strawberry cultivars and two abiotic stresses, using RefFinder, which integrates the four major currently available software programs: geNorm, NormFinder, BestKeeper and the comparative delta-Ct method. The results indicate that expression stability is dependent on the experimental conditions. The candidate reference gene DBP (DNA binding protein) was considered the most suitable to normalize expression data in samples of strawberry cultivars and under drought stress conditions, and the candidate reference gene HISTH4 (histone H4) was the most stable under osmotic stresses and salt stress. The traditional genes GAPDH (glyceraldehyde-3-phosphate dehydrogenase) and 18S (18S ribosomal RNA) were considered the most unstable genes in all conditions. The expression of the phenylalanine ammonia lyase (PAL) and 9-cis epoxycarotenoid dioxygenase (NCED1) genes was used to further confirm the validated candidate reference genes, showing that the use of an inappropriate reference gene may induce erroneous results. This study is the first survey on the stability of reference genes in strawberry cultivars and osmotic stresses and provides guidelines to obtain more accurate RT-qPCR results for future breeding efforts. PMID:25445290

  1. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    Alves-Ferreira Marcio; Grossi-de-Sa Maria; Brilhante Osmundo; Nardeli Sarah M; Artico Sinara

    2010-01-01

    Abstract Background Normalizing through reference genes, or housekeeping genes, can yield more accurate and reliable results from reverse transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, selecting suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out on plants. Therefore qPCR studies on i...

  2. Selection of suitable reference genes for accurate normalization of gene expression profile studies in non-small cell lung cancer

    In real-time RT quantitative PCR (qPCR) the accuracy of normalized data is highly dependent on the reliability of the reference genes (RGs). Failure to use an appropriate control gene for normalization of qPCR data may result in biased gene expression profiles, as well as low precision, so that only gross changes in expression level are declared statistically significant or patterns of expression are erroneously characterized. Therefore, it is essential to determine whether potential RGs are appropriate for specific experimental purposes. The aim of this study was to identify and validate RGs for use in the differentiation of normal and tumor lung expression profiles. A meta-analysis of lung cancer transcription profiles generated with the GeneChip technology was used to identify five putative RGs. Their consistency and that of seven commonly used RGs was tested by using Taqman probes on 18 paired normal-tumor lung snap-frozen specimens obtained from non-small-cell lung cancer (NSCLC) patients during primary curative resection. The 12 RGs displayed a wide range of Ct values: except for rRNA18S (mean 9.8), the mean values of all the commercial RGs and ESD ranged from 19 to 26, whereas those of the microarray-selected RGs (BTF-3, YAP1, HIST1H2BC, RPL30) exceeded 26. RG expression stability within sample populations and under the experimental conditions (tumour versus normal lung specimens) was evaluated by: (1) descriptive statistics; (2) equivalence test; (3) GeNorm applet. All these approaches indicated that the most stable RGs were POLR2A, rRNA18S, YAP1 and ESD. These data suggest that POLR2A, rRNA18S, YAP1 and ESD are the most suitable RGs for gene expression profile studies in NSCLC. Furthermore, they highlight the limitations of commercial RGs and indicate that meta-data analysis of genome-wide transcription profiling studies may identify new RGs

  3. Selection of suitable reference genes for accurate normalization of gene expression profile studies in non-small cell lung cancer

    Novello Silvia

    2006-07-01

    Full Text Available Abstract Background In real-time RT quantitative PCR (qPCR) the accuracy of normalized data is highly dependent on the reliability of the reference genes (RGs). Failure to use an appropriate control gene for normalization of qPCR data may result in biased gene expression profiles, as well as low precision, so that only gross changes in expression level are declared statistically significant or patterns of expression are erroneously characterized. Therefore, it is essential to determine whether potential RGs are appropriate for specific experimental purposes. The aim of this study was to identify and validate RGs for use in the differentiation of normal and tumor lung expression profiles. Methods A meta-analysis of lung cancer transcription profiles generated with the GeneChip technology was used to identify five putative RGs. Their consistency and that of seven commonly used RGs was tested by using Taqman probes on 18 paired normal-tumor lung snap-frozen specimens obtained from non-small-cell lung cancer (NSCLC) patients during primary curative resection. Results The 12 RGs displayed a wide range of Ct values: except for rRNA18S (mean 9.8), the mean values of all the commercial RGs and ESD ranged from 19 to 26, whereas those of the microarray-selected RGs (BTF-3, YAP1, HIST1H2BC, RPL30) exceeded 26. RG expression stability within sample populations and under the experimental conditions (tumour versus normal lung specimens) was evaluated by: (1) descriptive statistics; (2) equivalence test; (3) GeNorm applet. All these approaches indicated that the most stable RGs were POLR2A, rRNA18S, YAP1 and ESD. Conclusion These data suggest that POLR2A, rRNA18S, YAP1 and ESD are the most suitable RGs for gene expression profile studies in NSCLC. Furthermore, they highlight the limitations of commercial RGs and indicate that meta-data analysis of genome-wide transcription profiling studies may identify new RGs.

  4. VICMpred: An SVM-based Method for the Prediction of Functional Proteins of Gram-negative Bacteria Using Amino Acid Patterns and Composition

    Sudipto Saha; G.P.S. Raghava

    2006-01-01

    In this study, an attempt has been made to predict the major functions of gram-negative bacterial proteins from their amino acid sequences. The dataset used for training and testing consists of 670 non-redundant gram-negative bacterial proteins (255 of cellular process, 60 of information molecules, 285 of metabolism, and 70 of virulence factors). First we developed an SVM-based method using amino acid and dipeptide composition and achieved overall accuracies of 52.39% and 47.01%, respectively. We introduced a new concept for the classification of proteins based on tetrapeptides, in which we identified the unique tetrapeptides significantly found in a class of proteins. These tetrapeptides were used as the input feature for predicting the function of a protein and achieved an overall accuracy of 68.66%. We also developed a hybrid method in which the tetrapeptide information was used with amino acid composition and achieved an overall accuracy of 70.75%. A five-fold cross validation was used to evaluate the performance of these methods. The web server VICMpred has been developed for predicting the function of gram-negative bacterial proteins (http://www.imtech.res.in/raghava/vicmpred/).
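
    The amino acid and dipeptide composition features described above are simply normalized counts over the sequence, giving fixed 20- and 400-dimensional input vectors for the SVM. A minimal sketch:

    ```python
    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def aa_composition(seq):
        """Fraction of each of the 20 amino acids in the sequence (20 features)."""
        n = len(seq)
        return [seq.count(a) / n for a in AMINO_ACIDS]

    def dipeptide_composition(seq):
        """Fraction of each of the 400 ordered residue pairs (20x20 features)."""
        pairs = [seq[i:i + 2] for i in range(len(seq) - 1)]
        total = len(pairs)
        return [pairs.count(a + b) / total
                for a in AMINO_ACIDS for b in AMINO_ACIDS]

    # Toy input containing each residue once; a real input is a full protein:
    seq = "ACDEFGHIKLMNPQRSTVWY"
    comp = aa_composition(seq)
    dipep = dipeptide_composition(seq)
    ```

    Because the vectors have fixed length regardless of sequence length, proteins of any size map to the same feature space, which is what the SVM requires.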

  5. An SVM-Based Classifier for Estimating the State of Various Rotating Components in Agro-Industrial Machinery with a Vibration Signal Acquired from a Single Point on the Machine Chassis

    Ruben Ruiz-Gonzalez; Jaime Gomez-Gil; Francisco Javier Gomez-Gil; Víctor Martínez-Martínez

    2014-01-01

    The goal of this article is to assess the feasibility of estimating the state of various rotating components in agro-industrial machinery by employing just one vibration signal acquired from a single point on the machine chassis. To do so, a Support Vector Machine (SVM)-based system is employed. Experimental tests evaluated this system by acquiring vibration data from a single point of an agricultural harvester, while varying several of its working conditions. The whole process included two m...

  6. Selection of accurate reference genes in mouse trophoblast stem cells for reverse transcription-quantitative polymerase chain reaction.

    Motomura, Kaori; Inoue, Kimiko; Ogura, Atsuo

    2016-06-17

    Mouse trophoblast stem cells (TSCs) form colonies of different sizes and morphologies, which might reflect their degrees of differentiation. Therefore, each colony type can have a characteristic gene expression profile; however, the expression levels of internal reference genes may also change, causing fluctuations in their estimated gene expression levels. In this study, we validated seven housekeeping genes by using a geometric averaging method and identified Gapdh as the most stable gene across different colony types. Indeed, when Gapdh was used as the reference, expression levels of Elf5, a TSC marker gene, stringently classified TSC colonies into two groups: a high expression group consisting of type 1 and 2 colonies, and a lower expression group consisting of type 3 and 4 colonies. This clustering was consistent with our putative classification of undifferentiated/differentiated colonies based on their time-dependent colony transitions. By contrast, use of an unstable reference gene (Rn18s) allowed no such clear classification. Cdx2, another TSC marker, did not show any significant colony type-specific expression pattern irrespective of the reference gene. Selection of stable reference genes for quantitative gene expression analysis might be critical, especially when cell lines consisting of heterogeneous cell populations are used. PMID:26853688
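
    With a stable reference such as Gapdh chosen, relative expression of a marker like Elf5 is typically computed with the Livak 2^-ΔΔCt method. A sketch with hypothetical Ct values; the ~100% amplification-efficiency assumption is built into the formula:

    ```python
    def relative_expression(ct_target_sample, ct_ref_sample,
                            ct_target_calib, ct_ref_calib):
        """Livak 2^-ddCt relative quantification: expression of a target gene
        in a sample, normalized to a reference gene (e.g. Gapdh) and expressed
        relative to a calibrator sample. Assumes ~100% PCR efficiency."""
        d_ct_sample = ct_target_sample - ct_ref_sample   # normalize to reference
        d_ct_calib = ct_target_calib - ct_ref_calib      # same for calibrator
        return 2 ** -(d_ct_sample - d_ct_calib)

    # Hypothetical Ct values: Elf5 vs Gapdh in a type-1 colony,
    # relative to a type-3 colony used as calibrator:
    fold = relative_expression(22.0, 18.0, 26.0, 19.0)
    ```

    An unstable reference shifts `d_ct_sample` by its own fluctuation, which is exactly how a poor reference gene blurs the colony-type classification described above.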

  7. Accurate breast cancer diagnosis through real-time PCR her-2 gene quantification using immunohistochemically-identified biopsies

    MENDOZA, GRETEL; PORTILLO, AMELIA; Olmos-Soto, Jorge

    2012-01-01

    her-2 gene amplification and its overexpression in breast cancer cells is directly associated with aggressive clinical behavior. The her-2 gene and its Her-2 protein have been utilized for disease diagnosis and as a predictive marker for treatment response to the antibody herceptin. Fluorescent in situ hybridization (FISH) and immunohistochemistry (IHC) are the most common FDA-approved methodologies involving gene and protein quantification, respectively. False positive or negative her-2/Her-...

  8. Identification and evaluation of new reference genes in Gossypium hirsutum for accurate normalization of real-time quantitative RT-PCR data

    Alves-Ferreira Marcio

    2010-03-01

    Full Text Available Abstract Background Normalization against reference genes, or housekeeping genes, can produce more accurate and reliable results from reverse transcription real-time quantitative polymerase chain reaction (qPCR). Recent studies have shown that no single housekeeping gene is universal for all experiments. Thus, selecting suitable reference genes should be the first step of any qPCR analysis. Only a few studies on the identification of housekeeping genes have been carried out in plants. Therefore, qPCR studies on important crops such as cotton have been hampered by the lack of suitable reference genes. Results By the use of two distinct algorithms, implemented by geNorm and NormFinder, we have assessed the gene expression of nine candidate reference genes in cotton: GhACT4, GhEF1α5, GhFBX6, GhPP2A1, GhMZA, GhPTB, GhGAPC2, GhβTUB3 and GhUBQ14. The candidate reference genes were evaluated in 23 experimental samples consisting of six distinct plant organs, eight stages of flower development, four stages of fruit development and in flower verticils. The expression of the GhPP2A1 and GhUBQ14 genes was the most stable across all samples and also when distinct plant organs were examined. GhACT4 and GhUBQ14 presented more stable expression during flower development, GhACT4 and GhFBX6 in the floral verticils, and GhMZA and GhPTB during fruit development. Our analysis provided the most suitable combination of reference genes for each experimental set tested as internal control for reliable qPCR data normalization. In addition, to illustrate the use of cotton reference genes, we checked the expression of two cotton MADS-box genes in distinct plant and floral organs and also during flower development. Conclusion We have tested the expression stabilities of nine candidate genes in a set of 23 tissue samples from cotton plants divided into five different experimental sets. As a result of this evaluation, we recommend the use of the GhUBQ14 and GhPP2A1 housekeeping genes as superior references.

  9. Identification of a 251-gene expression signature that can accurately detect M. tuberculosis in patients with and without HIV co-infection.

    Noor Dawany

    Full Text Available BACKGROUND: Co-infection with tuberculosis (TB) is the leading cause of death in HIV-infected individuals. However, diagnosis of TB, especially in the presence of an HIV co-infection, can be difficult owing to the high inaccuracy associated with conventional diagnostic methods. Here we report a gene signature that can identify a tuberculosis infection in patients co-infected with HIV as well as in the absence of HIV. METHODS: We analyzed global gene expression data from peripheral blood mononuclear cell (PBMC) samples of patients that were either mono-infected with HIV or co-infected with HIV/TB and used support vector machines to identify a gene signature that can distinguish between the two classes. We then validated our results using publicly available gene expression data from patients mono-infected with TB. RESULTS: Our analysis successfully identified a 251-gene signature that accurately distinguishes patients co-infected with HIV/TB from those infected with HIV only, with an overall accuracy of 81.4% (sensitivity = 76.2%, specificity = 86.4%). Furthermore, we show that our 251-gene signature can also accurately distinguish patients with active TB in the absence of an HIV infection from both patients with a latent TB infection and healthy controls (88.9-94.7% accuracy; 69.2-90% sensitivity and 90.3-100% specificity). We also demonstrate that the expression levels of the 251-gene signature diminish as a correlate of the length of TB treatment. CONCLUSIONS: A 251-gene signature is described to (a) detect TB in the presence or absence of an HIV co-infection, and (b) assess response to treatment following anti-TB therapy.
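As a rough illustration of the SVM step (not the authors' pipeline), the sketch below trains a bare-bones linear SVM with the Pegasos subgradient method on synthetic "expression profiles" and reports the same three metrics the record quotes. All dimensions and data are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for expression profiles: 200 samples x 50 "genes",
# with five genes carrying the class signal (symmetric about the origin,
# so the SVM needs no bias term).
n, g = 200, 50
y = np.where(rng.random(n) < 0.5, -1, 1)
X = rng.normal(size=(n, g))
X[:, :5] += 1.0 * y[:, None]          # the five discriminative genes

def train_linear_svm(X, y, lam=0.01, epochs=30):
    """Bare-bones linear SVM trained with the Pegasos subgradient method."""
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(len(y)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w)
            w *= 1.0 - eta * lam       # regularization shrink
            if margin < 1:             # hinge-loss subgradient step
                w += eta * y[i] * X[i]
    return w

w = train_linear_svm(X[:150], y[:150])
pred = np.sign(X[150:] @ w)
accuracy = np.mean(pred == y[150:])
sensitivity = np.mean(pred[y[150:] == 1] == 1)
specificity = np.mean(pred[y[150:] == -1] == -1)
```

In practice one would use a tuned SVM implementation and cross-validation; the point here is only the shape of the computation behind "accuracy / sensitivity / specificity" figures.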

  10. A scalable and accurate targeted gene assembly tool (SAT-Assembler) for next-generation sequencing data.

    Yuan Zhang

    2014-08-01

    Full Text Available Gene assembly, which recovers gene segments from short reads, is an important step in functional analysis of next-generation sequencing data. In the absence of quality reference genomes, de novo assembly is commonly used for RNA-Seq data of non-model organisms and for metagenomic data. However, heterogeneous sequence coverage caused by heterogeneous expression or species abundance, similarity between isoforms or homologous genes, and large data size all pose challenges to de novo assembly. As a result, existing assembly tools tend to output fragmented contigs or chimeric contigs, or have high memory footprint. In this work, we introduce a targeted gene assembly program SAT-Assembler, which aims to recover gene families of particular interest to biologists. It addresses the above challenges by conducting family-specific homology search, homology-guided overlap graph construction, and careful graph traversal. It can be applied to both RNA-Seq and metagenomic data. Our experimental results on an Arabidopsis RNA-Seq data set and two metagenomic data sets show that SAT-Assembler has smaller memory usage, comparable or better gene coverage, and lower chimera rate for assembling a set of genes from one or multiple pathways compared with other assembly tools. Moreover, the family-specific design and rapid homology search allow SAT-Assembler to be naturally compatible with parallel computing platforms. The source code of SAT-Assembler is available at https://sourceforge.net/projects/sat-assembler/. The data sets and experimental settings can be found in supplementary material.
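The overlap graph at the core of such assemblers can be illustrated with a toy suffix-prefix overlap computation. The reads below are invented and error-free; real assemblers use indexed, error-tolerant overlap detection:

```python
# Toy overlap-graph construction over three invented, error-free reads.
reads = ["ATGGCGT", "GCGTACC", "TACCGGA"]

def overlap(a, b, min_len=3):
    """Length of the longest suffix of a that equals a prefix of b
    (at least min_len), or 0 if there is none."""
    for L in range(min(len(a), len(b)), min_len - 1, -1):
        if a.endswith(b[:L]):
            return L
    return 0

# Edge (i, j) with weight L means: read i overlaps read j by L bases.
edges = {}
for i, a in enumerate(reads):
    for j, b in enumerate(reads):
        if i != j:
            L = overlap(a, b)
            if L:
                edges[(i, j)] = L

# Traversing the single path 0 -> 1 -> 2 merges the reads into one contig:
contig = reads[0] + reads[1][edges[(0, 1)]:] + reads[2][edges[(1, 2)]:]
```

Careful traversal matters precisely because real graphs contain branches from repeats and homologous genes; this toy graph happens to be a single path.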

  11. Integrative structural annotation of de novo RNA-Seq provides an accurate reference gene set of the enormous genome of the onion (Allium cepa L.)

    Kim, Seungill; Kim, Myung-Shin; Kim, Yong-Min; Yeom, Seon-In; Cheong, Kyeongchae; Kim, Ki-Tae; Jeon, Jongbum; Kim, Sunggil; Kim, Do-Sun; Sohn, Seong-Han; Lee, Yong-Hwan; Choi, Doil

    2014-01-01

    The onion (Allium cepa L.) is one of the most widely cultivated and consumed vegetable crops in the world. Although a considerable amount of onion transcriptome data has been deposited into public databases, the sequences of the protein-coding genes are not accurate enough to be used, owing to non-coding sequences intermixed with the coding sequences. We generated a high-quality, annotated onion transcriptome from de novo sequence assembly and intensive structural annotation using the integra...

  12. MiRPara: a SVM-based software tool for prediction of most probable microRNA coding regions in genome scale sequences

    2011-01-01

    Background MicroRNAs are a family of ~22 nt small RNAs that can regulate gene expression at the post-transcriptional level. Identification of these molecules and their targets can aid understanding of regulatory processes. Recently, HTS has become a common identification method but there are two major limitations associated with the technique. Firstly, the method has low efficiency, with typically less than 1 in 10,000 sequences representing miRNA reads and secondly the method preferentially targets highly expressed miRNAs. If sequences are available, computational methods can provide a screening step to investigate the value of an HTS study and aid interpretation of results. However, current methods can only predict miRNAs for short fragments and have usually been trained against small datasets which don't always reflect the diversity of these molecules. Results We have developed a software tool, miRPara, that predicts most probable mature miRNA coding regions from genome scale sequences in a species specific manner. We classified sequences from miRBase into animal, plant and overall categories and used a support vector machine to train three models based on an initial set of 77 parameters related to the physical properties of the pre-miRNA and its miRNAs. By applying parameter filtering we found a subset of ~25 parameters produced higher prediction ability compared to the full set. Our software achieves an accuracy of up to 80% against experimentally verified mature miRNAs, making it one of the most accurate methods available. Conclusions miRPara is an effective tool for locating miRNAs coding regions in genome sequences and can be used as a screening step prior to HTS experiments. It is available at http://www.whiov.ac.cn/bioinformatics/mirpara PMID:21504621

  13. Identification and validation of reference genes for accurate normalization of real-time quantitative PCR data in kiwifruit.

    Ferradás, Yolanda; Rey, Laura; Martínez, Óscar; Rey, Manuel; González, M Victoria

    2016-05-01

    Identification and validation of reference genes are required for the normalization of qPCR data. We studied the expression stability produced by eight primer pairs amplifying four common genes used as references for normalization. Samples representing different tissues, organs and developmental stages in kiwifruit (Actinidia chinensis var. deliciosa (A. Chev.) A. Chev.) were used. A total of 117 kiwifruit samples were divided into five sample sets (mature leaves, axillary buds, stigmatic arms, fruit flesh and seeds). All samples were also analysed as a single set. The expression stability of the candidate primer pairs was tested using three algorithms (geNorm, NormFinder and BestKeeper). The minimum number of reference genes necessary for normalization was also determined. A unique primer pair was selected for amplifying the 18S rRNA gene. The primer pair selected for amplifying the ACTIN gene was different depending on the sample set. 18S 2 and ACT 2 were the candidate primer pairs selected for normalization in the three sample sets (mature leaves, fruit flesh and stigmatic arms). 18S 2 and ACT 3 were the primer pairs selected for normalization in axillary buds. No primer pair could be selected for use as the reference for the seed sample set. The analysis of all samples in a single set did not result in the selection of any stably expressed primer pair. Considering data previously reported in the literature, we validated the selected primer pairs amplifying the FLOWERING LOCUS T gene for use in the normalization of gene expression in kiwifruit. PMID:26897117
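Once a reference gene (or primer pair) is validated, normalization itself is typically the Livak 2^-ΔΔCt calculation. A minimal sketch with invented Ct values and placeholder sample/gene names:

```python
# Livak 2^-ddCt relative quantification. Ct values are invented and the
# "control"/"treated"/"target"/"reference" names are placeholders.
ct = {
    "control": {"target": 24.0, "reference": 18.0},
    "treated": {"target": 22.5, "reference": 18.1},
}

def ddct_fold_change(ct):
    """Fold change of 'target' in 'treated' vs 'control', normalized to the
    reference gene (assumes ~100% PCR efficiency for both assays)."""
    dct_control = ct["control"]["target"] - ct["control"]["reference"]
    dct_treated = ct["treated"]["target"] - ct["treated"]["reference"]
    return 2 ** -(dct_treated - dct_control)

fold = ddct_fold_change(ct)   # about 3-fold up-regulation with these numbers
```

This makes concrete why an unstable reference is dangerous: a drift in the reference Ct moves ΔCt one-for-one and hence the fold change exponentially.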

  14. Single-Copy Green Fluorescent Protein Gene Fusions Allow Accurate Measurement of Salmonella Gene Expression In Vitro and during Infection of Mammalian Cells

    Hautefort, Isabelle; Proença, Maria José; Hinton, Jay C. D.

    2003-01-01

    We developed a reliable and flexible green fluorescent protein (GFP)-based system for measuring gene expression in individual bacterial cells. Until now, most systems have relied upon plasmid-borne gfp gene fusions, risking problems associated with plasmid instability. We show that a recently developed GFP variant, GFP+, is suitable for assessing bacterial gene expression. Various gfp+ transcriptional fusions were constructed and integrated as single copies into the chromosome of Salmonella e...

  15. ICGA-PSO-ELM approach for accurate multiclass cancer classification resulting in reduced gene sets in which genes encoding secreted proteins are highly represented.

    Saraswathi, Saras; Sundaram, Suresh; Sundararajan, Narasimhan; Zimmermann, Michael; Nilsen-Hamilton, Marit

    2011-01-01

    A combination of Integer-Coded Genetic Algorithm (ICGA) and Particle Swarm Optimization (PSO), coupled with the neural-network-based Extreme Learning Machine (ELM), is used for gene selection and cancer classification. ICGA is used with PSO-ELM to select an optimal set of genes, which is then used to build a classifier to develop an algorithm (ICGA_PSO_ELM) that can handle sparse data and sample imbalance. We evaluate the performance of ICGA-PSO-ELM and compare our results with existing methods in the literature. An investigation into the functions of the selected genes, using a systems biology approach, revealed that many of the identified genes are involved in cell signaling and proliferation. An analysis of these gene sets shows a larger representation of genes that encode secreted proteins than found in randomly selected gene sets. Secreted proteins constitute a major means by which cells interact with their surroundings. Mounting biological evidence has identified the tumor microenvironment as a critical factor that determines tumor survival and growth. Thus, the genes identified by this study that encode secreted proteins might provide important insights to the nature of the critical biological features in the microenvironment of each tumor type that allow these cells to thrive and proliferate. PMID:21233525
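A stripped-down sketch of the integer-coded GA component only (chromosome = vector of gene indices): the PSO coupling and the ELM classifier of the actual method are omitted, and the centroid-distance fitness below is a crude stand-in for classifier accuracy. All data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy gene selection: find a 3-gene subset that separates two classes.
n_genes, subset_size, pop_size, generations = 30, 3, 20, 40
X = rng.normal(size=(100, n_genes))
y = rng.integers(0, 2, 100)
X[y == 1, :3] += 2.0                               # genes 0-2 carry the signal

def fitness(subset):
    """Distance between class centroids over the selected genes."""
    a, b = X[y == 0][:, subset], X[y == 1][:, subset]
    return np.linalg.norm(a.mean(axis=0) - b.mean(axis=0))

def mutate(subset):
    """Integer-coded mutation: overwrite one gene index with a random one."""
    child = subset.copy()
    child[rng.integers(subset_size)] = rng.integers(n_genes)
    return child

pop = [rng.choice(n_genes, subset_size, replace=False) for _ in range(pop_size)]
for _ in range(generations):
    pop.sort(key=fitness, reverse=True)            # elitist selection
    pop = pop[: pop_size // 2] + [mutate(s) for s in pop[: pop_size // 2]]
best = max(pop, key=fitness)                       # indices of selected genes
```

The real method replaces both the fitness (ELM classification accuracy) and the search dynamics (GA hybridized with PSO), but the chromosome encoding and select/mutate loop have this shape.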

  16. Global cDNA Amplification Combined with Real-Time RT–PCR: Accurate Quantification of Multiple Human Potassium Channel Genes at the Single Cell Level

    Al-Taher, A.; Bashein, A.; Nolan, T.; Hollingsworth, M.; Brady, G

    2000-01-01

    We have developed a sensitive quantitative RT–PCR procedure suitable for the analysis of small samples, including single cells, and have used it to measure levels of potassium channel mRNAs in a panel of human tissues and small numbers of cells grown in culture. The method involves an initial global amplification of cDNA derived from all added polyadenylated mRNA followed by quantitative RT–PCR of individual genes using specific primers. In order to facilitate rapid and accurate processing of...

  17. Gene expression profiling of whole blood: Comparison of target preparation methods for accurate and reproducible microarray analysis

    Choi Dongseok

    2009-01-01

    Full Text Available Abstract Background Peripheral blood is an accessible and informative source of transcriptomal information for many human disease and pharmacogenomic studies. While there can be significant advantages to analyzing RNA isolated from whole blood, particularly in clinical studies, the preparation of samples for microarray analysis is complicated by the need to minimize artifacts associated with highly abundant globin RNA transcripts. The impact of globin RNA transcripts on expression profiling data can potentially be reduced by using RNA preparation and labeling methods that remove or block globin RNA during the microarray assay. We compared four different methods for preparing microarray hybridization targets from human whole blood collected in PAXGene tubes. Three of the methods utilized the Affymetrix one-cycle cDNA synthesis/in vitro transcription protocol but varied treatment of input RNA as follows: i. no treatment; ii. treatment with GLOBINclear; or iii. treatment with globin PNA oligos. In the fourth method cDNA targets were prepared with the Ovation amplification and labeling system. Results We find that microarray targets generated with labeling methods that reduce globin mRNA levels or minimize the impact of globin transcripts during hybridization detect more transcripts in the microarray assay compared with the standard Affymetrix method. Comparison of microarray results with quantitative PCR analysis of a panel of genes from the NF-kappa B pathway shows good correlation of transcript measurements produced with all four target preparation methods, although method-specific differences in overall correlation were observed. The impact of freezing blood collected in PAXGene tubes on data reproducibility was also examined. Expression profiles show little or no difference when RNA is extracted from either fresh or frozen blood samples. 
Conclusion RNA preparation and labeling methods designed to reduce the impact of globin mRNA transcripts can

  18. Accurate Profiling of Gene Expression and Alternative Polyadenylation with Whole Transcriptome Termini Site Sequencing (WTTS-Seq)

    Zhou, Xiang; Li, Rui; Michal, Jennifer J.; Wu, Xiao-Lin; Liu, Zhongzhen; Zhao, Hui; Xia, Yin; Du, Weiwei; Wildung, Mark R.; Pouchnik, Derek J.; Harland, Richard M.; Jiang, Zhihua

    2016-01-01

    Construction of next-generation sequencing (NGS) libraries involves RNA manipulation, which often creates noisy, biased, and artifactual data that contribute to errors in transcriptome analysis. In this study, a total of 19 whole transcriptome termini site sequencing (WTTS-seq) and seven RNA sequencing (RNA-seq) libraries were prepared from Xenopus tropicalis adult and embryo samples to determine the most effective library preparation method to maximize transcriptomics investigation. We strongly suggest that appropriate primers/adaptors are designed to inhibit amplification detours and that PCR overamplification is minimized to maximize transcriptome coverage. Furthermore, genome annotation must be improved so that missing data can be recovered. In addition, a complete understanding of sequencing platforms is critical to limit the formation of false-positive results. Technically, the WTTS-seq method enriches both poly(A)+ RNA and complementary DNA, adds 5′- and 3′-adaptors in one step, pursues strand sequencing and mapping, and profiles both gene expression and alternative polyadenylation (APA). Although RNA-seq is cost prohibitive, tends to produce false-positive results, and fails to detect APA diversity and dynamics, its combination with WTTS-seq is necessary to validate transcriptome-wide APA. PMID:27098915

  19. Accurate Profiling of Gene Expression and Alternative Polyadenylation with Whole Transcriptome Termini Site Sequencing (WTTS-Seq).

    Zhou, Xiang; Li, Rui; Michal, Jennifer J; Wu, Xiao-Lin; Liu, Zhongzhen; Zhao, Hui; Xia, Yin; Du, Weiwei; Wildung, Mark R; Pouchnik, Derek J; Harland, Richard M; Jiang, Zhihua

    2016-06-01

    Construction of next-generation sequencing (NGS) libraries involves RNA manipulation, which often creates noisy, biased, and artifactual data that contribute to errors in transcriptome analysis. In this study, a total of 19 whole transcriptome termini site sequencing (WTTS-seq) and seven RNA sequencing (RNA-seq) libraries were prepared from Xenopus tropicalis adult and embryo samples to determine the most effective library preparation method to maximize transcriptomics investigation. We strongly suggest that appropriate primers/adaptors are designed to inhibit amplification detours and that PCR overamplification is minimized to maximize transcriptome coverage. Furthermore, genome annotation must be improved so that missing data can be recovered. In addition, a complete understanding of sequencing platforms is critical to limit the formation of false-positive results. Technically, the WTTS-seq method enriches both poly(A)+ RNA and complementary DNA, adds 5'- and 3'-adaptors in one step, pursues strand sequencing and mapping, and profiles both gene expression and alternative polyadenylation (APA). Although RNA-seq is cost prohibitive, tends to produce false-positive results, and fails to detect APA diversity and dynamics, its combination with WTTS-seq is necessary to validate transcriptome-wide APA. PMID:27098915

  20. Accurate binning of metagenomic contigs via automated clustering sequences using information of genomic signatures and marker genes.

    Lin, Hsin-Hung; Liao, Yu-Chieh

    2016-01-01

    Metagenomics, the application of shotgun sequencing, facilitates the reconstruction of the genomes of individual species from natural environments. A major challenge in the genome recovery domain is to agglomerate or 'bin' sequences assembled from metagenomic reads into individual groups. Metagenomic binning without consideration of reference sequences enables the comprehensive discovery of new microbial organisms and aids in the microbial genome reconstruction process. Here we present MyCC, an automated binning tool that combines genomic signatures, marker genes and optional contig coverages within one or multiple samples, in order to visualize the metagenomes and to identify the reconstructed genomic fragments. We demonstrate the superior performance of MyCC compared to other binning tools including CONCOCT, GroopM, MaxBin and MetaBAT on both synthetic and real human gut communities with a small sample size (one to 11 samples), as well as on a large metagenome dataset (over 250 samples). Moreover, we demonstrate the visualization of metagenomes in MyCC to aid in the reconstruction of genomes from distinct bins. MyCC is freely available at http://sourceforge.net/projects/sb2nhri/files/MyCC/. PMID:27067514
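The "genomic signature" such composition-based binners rely on is typically a normalized k-mer frequency vector: contigs from the same genome have similar vectors. A toy sketch with two synthetic genomes of deliberately different base composition (MyCC itself combines these signatures with marker genes and coverage):

```python
import numpy as np
from itertools import product

def kmer_profile(seq, k=4):
    """Normalized k-mer frequency vector -- the 'genomic signature' that
    composition-based binners cluster on."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: n for n, km in enumerate(kmers)}
    v = np.zeros(len(kmers))
    for s in range(len(seq) - k + 1):
        v[index[seq[s:s + k]]] += 1
    return v / v.sum()

rng = np.random.default_rng(1)
# Two synthetic "genomes" with different base composition, cut into contigs.
genome_a = "".join(rng.choice(list("ACGT"), size=20000, p=[0.35, 0.15, 0.15, 0.35]))
genome_b = "".join(rng.choice(list("ACGT"), size=20000, p=[0.15, 0.35, 0.35, 0.15]))
contigs = [genome_a[s:s + 5000] for s in range(0, 20000, 5000)] + \
          [genome_b[s:s + 5000] for s in range(0, 20000, 5000)]

profiles = np.array([kmer_profile(c) for c in contigs])
# Assign each contig to the nearer of two seed signatures (contigs 0 and 4):
d_a = np.linalg.norm(profiles - profiles[0], axis=1)
d_b = np.linalg.norm(profiles - profiles[4], axis=1)
bins = (d_b < d_a).astype(int)   # 0 = genome A bin, 1 = genome B bin
```

Real binners cluster in a dimensionality-reduced signature space rather than against hand-picked seeds, but the signal they exploit is exactly this composition difference.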

  1. Active learning algorithm for SVM based on QBC

    徐海龙; 别晓峰; 冯卉; 吴天爱

    2015-01-01

    To address the problems that large-scale labeled samples are not easy to acquire and that datasets are often class-unbalanced during support vector machine (SVM) training, an active learning algorithm for SVM based on query by committee (QBC-ASVM) is proposed, which efficiently combines improved QBC active learning with a weighted SVM. In this method, QBC active learning is used to select the samples that are most valuable to the current SVM classifier, and the weighted SVM is used to reduce the impact of the unbalanced dataset on SVM active learning. The experimental results show that the proposed approach considerably reduces the number of labeled samples and the labeling cost compared with passive SVM while maintaining the same classification accuracy, and that it improves generalization performance and also speeds up SVM training.
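The committee-based sample selection at the heart of QBC can be sketched in a few lines. For brevity the committee members below are tiny perceptrons rather than the weighted SVMs of the paper, and all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(0)

def perceptron(X, y, epochs=20):
    """Tiny perceptron with a bias term; stands in for the committee
    members (the paper's committee would be weighted SVMs)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        for k in range(len(y)):
            if y[k] * (Xb[k] @ w) <= 0:
                w += y[k] * Xb[k]
    return w

# Toy 2-D data: a small labeled seed pool plus a larger unlabeled pool.
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
labeled = np.arange(10)
unlabeled = np.arange(10, 200)

# Committee: members trained on bootstrap resamples of the labeled pool.
committee = []
for _ in range(5):
    idx = rng.choice(labeled, size=len(labeled))   # sample with replacement
    committee.append(perceptron(X[idx], y[idx]))

Xb = np.hstack([X, np.ones((len(X), 1))])
votes = np.array([np.sign(Xb[unlabeled] @ w) for w in committee])
disagreement = np.mean(votes != votes[0], axis=0)  # fraction disagreeing with member 0
query = unlabeled[np.argmax(disagreement)]         # most informative point to label next
```

The queried point is the one the committee disagrees on most, i.e. the one whose label would most constrain the current hypothesis space.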

  2. The detection of the methylated Wif-1 gene is more accurate than a fecal occult blood test for colorectal cancer screening

    Amiot, Aurelien

    2014-07-15

    Background: The clinical benefit of guaiac fecal occult blood tests (FOBT) is now well established for colorectal cancer screening. Growing evidence has demonstrated that epigenetic modifications and fecal microbiota changes, also known as dysbiosis, are associated with CRC pathogenesis and might be used as surrogate markers of CRC. Patients and Methods: We performed a cross-sectional study that included all consecutive subjects that were referred (from 2003 to 2007) for screening colonoscopies. Prior to colonoscopy, effluents (fresh stools, sera-S and urine-U) were harvested and FOBTs performed. Methylation levels were measured in stools, S and U for 3 genes (Wif1, ALX-4, and Vimentin) selected from a panel of 63 genes; Kras mutations and seven dominant and subdominant bacterial populations in stools were quantified. Calibration was assessed with the Hosmer-Lemeshow chi-square, and discrimination was determined by calculating the C-statistic (Area Under Curve) and Net Reclassification Improvement index. Results: There were 247 individuals (mean age 60.8±12.4 years, 52% of males) in the study group, and 90 (36%) of these individuals were patients with advanced polyps or invasive adenocarcinomas. A multivariate model adjusted for age and FOBT led to a C-statistic of 0.83 [0.77-0.88]. After supplementary sequential (one-by-one) adjustment, Wif-1 methylation (S or U) and fecal microbiota dysbiosis led to increases of the C-statistic to 0.90 [0.84-0.94] (p = 0.02) and 0.81 [0.74-0.86] (p = 0.49), respectively. When adjusted jointly for FOBT and Wif-1 methylation or fecal microbiota dysbiosis, the increase of the C-statistic was even more significant (0.91 and 0.85, p<0.001 and p = 0.10, respectively). Conclusion: The detection of methylated Wif-1 in either S or U has a higher performance accuracy compared to guaiac FOBT for advanced colorectal neoplasia screening. Conversely, fecal microbiota dysbiosis detection was not more accurate. 
Blood and urine testing could be
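The C-statistic this record reports is the probability that a randomly chosen case receives a higher model score than a randomly chosen control. With invented risk scores it can be computed directly:

```python
import numpy as np

# C-statistic (AUC) as pairwise concordance; the scores are invented
# for illustration, not data from the study.
scores_cases = np.array([0.9, 0.8, 0.6, 0.75])     # advanced neoplasia
scores_controls = np.array([0.2, 0.5, 0.65, 0.3])  # normal colonoscopy

# All case-vs-control score differences; ties count half.
diff = scores_cases[:, None] - scores_controls[None, :]
c_statistic = (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size
```

A value of 0.5 means the marker is uninformative and 1.0 means perfect ranking, which is why the jump from 0.83 to 0.90-0.91 after adding Wif-1 methylation is meaningful.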

  3. The detection of the methylated Wif-1 gene is more accurate than a fecal occult blood test for colorectal cancer screening.

    Aurelien Amiot

    Full Text Available The clinical benefit of guaiac fecal occult blood tests (FOBT) is now well established for colorectal cancer screening. Growing evidence has demonstrated that epigenetic modifications and fecal microbiota changes, also known as dysbiosis, are associated with CRC pathogenesis and might be used as surrogate markers of CRC. We performed a cross-sectional study that included all consecutive subjects that were referred (from 2003 to 2007) for screening colonoscopies. Prior to colonoscopy, effluents (fresh stools, sera-S and urine-U) were harvested and FOBTs performed. Methylation levels were measured in stools, S and U for 3 genes (Wif1, ALX-4, and Vimentin) selected from a panel of 63 genes; Kras mutations and seven dominant and subdominant bacterial populations in stools were quantified. Calibration was assessed with the Hosmer-Lemeshow chi-square, and discrimination was determined by calculating the C-statistic (Area Under Curve) and Net Reclassification Improvement index. There were 247 individuals (mean age 60.8±12.4 years, 52% of males) in the study group, and 90 (36%) of these individuals were patients with advanced polyps or invasive adenocarcinomas. A multivariate model adjusted for age and FOBT led to a C-statistic of 0.83 [0.77-0.88]. After supplementary sequential (one-by-one) adjustment, Wif-1 methylation (S or U) and fecal microbiota dysbiosis led to increases of the C-statistic to 0.90 [0.84-0.94] (p = 0.02) and 0.81 [0.74-0.86] (p = 0.49), respectively. When adjusted jointly for FOBT and Wif-1 methylation or fecal microbiota dysbiosis, the increase of the C-statistic was even more significant (0.91 and 0.85, p<0.001 and p = 0.10, respectively). The detection of methylated Wif-1 in either S or U has a higher performance accuracy compared to guaiac FOBT for advanced colorectal neoplasia screening. Conversely, fecal microbiota dysbiosis detection was not more accurate. Blood and urine testing could be used in those individuals reluctant to

  4. The Rice Enhancer of Zeste [E(z)] Genes SDG711 and SDG718 Are Respectively Involved in Long Day and Short Day Signaling to Mediate the Accurate Photoperiod Control of Flowering Time

    Xiaoyun Liu; Chao Zhou; Yu Zhao; Shaoli Zhou; Wentao Wang; Dao-Xiu Zhou

    2014-01-01

    Recent advances in rice flowering studies have shown that the accurate control of flowering by photoperiod is regulated by key mechanisms that involve the regulation of flowering genes including Hd1, Ehd1, Hd3a, and RFT1. The chromatin mechanism involved in the regulation of rice flowering genes is presently not well known. Here we show that the rice E(z) genes SDG711 and SDG718, which encode the Polycomb Repressive Complex2 (PRC2) key subunit that is required for trimethylation of histone H3...

  5. The rice enhancer of zeste [E(z)] genes SDG711 and SDG718 are respectively involved in long day and short day signaling to mediate the accurate photoperiod control of flowering time

    Liu, Xiaoyun; Zhou, Chao; Zhao, Yu; Zhou, Shaoli; Wang, Wentao; Zhou, Dao-Xiu

    2014-01-01

    Recent advances in rice flowering studies have shown that the accurate control of flowering by photoperiod is regulated by key mechanisms that involve the regulation of flowering genes including Heading date1 (Hd1), Early hd1 (Ehd1), Hd3a, and RFT1. The chromatin mechanism involved in the regulation of rice flowering genes is presently not well known. Here we show that the rice enhancer of zeste [E(z)] genes SDG711 and SDG718, which encode the polycomb repressive complex2 (PRC2) key subunit t...

  6. The effect of multiple primer-template mismatches on quantitative PCR accuracy and development of a multi-primer set assay for accurate quantification of pcrA gene sequence variants.

    Ledeker, Brett M; De Long, Susan K

    2013-09-01

    Quantitative PCR (qPCR) is a critical tool for quantifying the abundance of specific organisms and the level or expression of target genes in medically and environmentally relevant systems. However, often the power of this tool has been limited because primer-template mismatches, due to sequence variations of targeted genes, can lead to inaccuracies in measured gene quantities, detection failures, and spurious conclusions. Currently available primer design guidelines for qPCR were developed for pure culture applications, and available primer design strategies for mixed cultures were developed for detection rather than accurate quantification. Furthermore, past studies examining the impact of mismatches have focused only on single mismatches while instances of multiple mismatches are common. There are currently no appropriate solutions to overcome the challenges posed by sequence variations. Here, we report results that provide a comprehensive, quantitative understanding of the impact of multiple primer-template mismatches on qPCR accuracy and demonstrate a multi-primer set approach to accurately quantify a model gene pcrA (encoding perchlorate reductase) that has substantial sequence variation. Results showed that for multiple mismatches (up to 3 mismatches) in primer regions where mismatches were previously considered tolerable (middle and 5' end), quantification accuracies could be as low as ~0.1%. Furthermore, tests were run using a published pcrA primer set with mixtures of genomic DNA from strains known to harbor the target gene, and for some mixtures quantification accuracy was as low as ~0.8% or was non-detect. To overcome these limitations, a multiple primer set assay including minimal degeneracies was developed for pcrA genes. This assay resulted in nearly 100% accurate detection for all mixed microbial communities tested. The multi-primer set approach demonstrated herein can be broadly applied to other genes with known sequences. PMID:23806694
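A multi-primer set like the one described is judged by whether every known sequence variant is matched by at least one primer. A toy sketch of mismatch counting with IUPAC degeneracy codes (all sequences below are invented; they are not the pcrA primers):

```python
# Primer-template mismatch counting with IUPAC degenerate base codes.
IUPAC = {
    "A": "A", "C": "C", "G": "G", "T": "T",
    "R": "AG", "Y": "CT", "S": "CG", "W": "AT",
    "K": "GT", "M": "AC", "N": "ACGT",
}

def mismatches(primer, template):
    """Number of positions where the template base is not covered by the
    (possibly degenerate) primer base; equal-length sequences, 5'->3'."""
    return sum(t not in IUPAC[p] for p, t in zip(primer, template))

primers = ["ATGCRTTC", "ATGCGTAC"]            # R covers A/G; 2nd primer adds a variant
variants = ["ATGCATTC", "ATGCGTTC", "ATGCGTAC"]

# A variant is covered if at least one primer matches it perfectly.
coverage = {v: min(mismatches(p, v) for p in primers) for v in variants}
```

This is the design check behind the record's conclusion: a single primer pair can silently drop to ~0.1% quantification accuracy against divergent templates, while a minimally degenerate multi-primer set drives every variant's best mismatch count to zero.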

  7. MotorPlex provides accurate variant detection across large muscle genes both in single myopathic patients and in pools of DNA samples

    Savarese, Marco; Di Fruscio, Giuseppina; Mutarelli, Margherita; Torella, Annalaura; Magri, Francesca; Santorelli, Filippo Maria; Comi, Giacomo Pietro; Bruno, Claudio; Nigro, Vincenzo

    2014-01-01

    Mutations in ~100 genes cause muscle diseases with complex and often unexplained genotype/phenotype correlations. Next-generation sequencing studies identify a greater-than-expected number of genetic variations in the human genome. This suggests that existing clinical monogenic testing systematically miss very relevant information. We have created a core panel of genes that cause all known forms of nonsyndromic muscle disorders (MotorPlex). It comprises 93 loci, among which are the largest an...

  8. Immediate-Early Gene Transcriptional Activation in Hippocampus CA1 and CA3 Does Not Accurately Reflect Rapid, Pattern Completion-Based Retrieval of Context Memory

    Pevzner, Aleksandr; Guzowski, John F.

    2015-01-01

    No studies to date have examined whether immediate-early gene (IEG) activation is driven by context memory recall. To address this question, we utilized the context preexposure facilitation effect (CPFE) paradigm. In CPFE, animals acquire contextual fear conditioning through hippocampus-dependent rapid retrieval of a previously formed contextual…

  9. Genes optimized by evolution for accurate and fast translation encode in Archaea and Bacteria a broad and characteristic spectrum of protein functions

    Merkl Rainer

    2010-11-01

    Full Text Available Abstract Background In many microbial genomes, a strong preference for a small number of codons can be observed in genes whose products are needed by the cell in large quantities. This codon usage bias (CUB) improves translational accuracy and speed and is one of several factors optimizing cell growth. Whereas CUB and the overrepresentation of individual proteins have been studied in detail, it is still unclear which high-level metabolic categories are subject to translational optimization in different habitats. Results In a systematic study of 388 microbial species, we have identified for each genome a specific subset of genes characterized by a marked CUB, which we named the effectome. As expected, gene products related to protein synthesis are abundant in both archaeal and bacterial effectomes. In addition, enzymes contributing to energy production and gene products involved in protein folding and stabilization are overrepresented. The comparison of genomes from eleven habitats shows that the environment has only a minor effect on the composition of the effectomes. As a paradigmatic example, we detailed the effectome content of 37 bacterial genomes that are most likely exposed to strongest selective pressure towards translational optimization. These effectomes accommodate a broad range of protein functions like enzymes related to glycolysis/gluconeogenesis and the TCA cycle, ATP synthases, aminoacyl-tRNA synthetases, chaperones, proteases that degrade misfolded proteins, protectants against oxidative damage, as well as cold shock and outer membrane proteins. Conclusions We made clear that effectomes consist of specific subsets of the proteome being involved in several cellular functions. As expected, some functions are related to cell growth and affect speed and quality of protein synthesis. Additionally, the effectomes contain enzymes of central metabolic pathways and cellular functions sustaining microbial life under stress situations.

  10. Accurate Finite Difference Algorithms

    Goodrich, John W.

    1996-01-01

    Two families of finite difference algorithms for computational aeroacoustics are presented and compared. All of the algorithms are single-step explicit methods; they have the same order of accuracy in both space and time, with examples up to eleventh order, and they have multidimensional extensions. One of the algorithm families has spectral-like high resolution. Propagation with high-order, high-resolution algorithms can produce accurate results after O(10^6) periods of propagation with eight grid points per wavelength.
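The accuracy gain from higher-order stencils can be seen in a minimal sketch (not the paper's algorithms): comparing 2nd- and 4th-order central differences for d/dx sin(x) at the same grid spacing.

```python
import math

# Approximate d/dx sin(x) at x0 with two central-difference stencils.
f, x0, h = math.sin, 1.0, 0.1
exact = math.cos(x0)

# 2nd-order central difference: error ~ h^2.
d2 = (f(x0 + h) - f(x0 - h)) / (2 * h)
# 4th-order central difference: error ~ h^4.
d4 = (-f(x0 + 2 * h) + 8 * f(x0 + h) - 8 * f(x0 - h) + f(x0 - 2 * h)) / (12 * h)

err2 = abs(d2 - exact)
err4 = abs(d4 - exact)
```

At h = 0.1 the 4th-order error is already orders of magnitude below the 2nd-order one, which is why high-order schemes can propagate waves accurately for many periods on coarse grids.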

  11. A NEW SVM BASED EMOTIONAL CLASSIFICATION OF IMAGE

    Wang Weining; Yu Yinglin; Zhang Jianchao

    2005-01-01

    This paper presents how a high-level emotional representation of art paintings can be inferred from perceptual-level features suited to the particular classes (dynamic vs. static classification). The key points are feature selection and classification. Based on the strong relationship between the notable lines of an image and human sensations, a novel feature vector, WLDLV (Weighted Line Direction-Length Vector), is proposed, which includes both orientation and length information of lines in an image. Classification is performed by SVM (Support Vector Machine), and images can be classified into dynamic and static. Experimental results demonstrate the effectiveness and superiority of the algorithm.

  12. Oil spill detection from SAR image using SVM based classification

    A. A. Matkan

    2013-09-01

    Full Text Available In this paper, the potential of fully polarimetric L-band SAR data for detecting sea oil spills is investigated using polarimetric decompositions and texture analysis with an SVM classifier. First, power and magnitude measurements of the HH and VV polarization modes and the Pauli, Freeman and Krogager decompositions are computed and fed to the SVM classifier. Texture features, i.e. mean, variance, contrast and dissimilarity, are then extracted from these measurements and used in a second SVM classification. Experiments are conducted on fully polarimetric SAR data acquired by the PALSAR sensor of the ALOS satellite on August 25, 2006. In the first step, an accuracy assessment indicated overall accuracies of 78.92% for the power measurement of the VV polarization and 96.46% for the Krogager decomposition. With texture analysis, the results improved to 96.44% for the mean of the power and magnitude measurements of the HH and VV polarizations and 96.65% for the Krogager decomposition. The results show that the Krogager polarimetric decomposition performs well for detecting oil spills on the sea surface and that texture analysis improves the results.
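Texture features of the kind named above (mean, variance, contrast, dissimilarity) are conventionally derived from a grey-level co-occurrence matrix (GLCM). A minimal sketch on a made-up 4x4 patch, assuming a horizontal offset of one pixel; real inputs would be quantised SAR backscatter images.

```python
# Hypothetical 4-level grayscale patch.
img = [
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 2, 2, 2],
    [2, 2, 3, 3],
]
levels = 4

# Co-occurrence counts for horizontal neighbours (offset dx = 1).
glcm = [[0.0] * levels for _ in range(levels)]
for row in img:
    for a, b in zip(row, row[1:]):
        glcm[a][b] += 1
total = sum(sum(r) for r in glcm)
p = [[v / total for v in r] for r in glcm]  # normalised joint probabilities

# Haralick-style texture features over all (i, j) level pairs.
contrast = sum(p[i][j] * (i - j) ** 2 for i in range(levels) for j in range(levels))
dissimilarity = sum(p[i][j] * abs(i - j) for i in range(levels) for j in range(levels))
mean = sum(p[i][j] * i for i in range(levels) for j in range(levels))
variance = sum(p[i][j] * (i - mean) ** 2 for i in range(levels) for j in range(levels))
```

Each feature summarises how often level pairs co-occur: contrast and dissimilarity grow with local intensity jumps, which is what distinguishes slick-dampened from wind-roughened sea surface texture.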

  13. Automatic Parameters Selection for SVM Based on PSO

    ZHANG Mingfeng; ZHU Yinghua; ZHENG Xu; LIU Yu

    2007-01-01

    Motivated by the fact that automatic parameter selection for Support Vector Machines (SVM) is an important issue in making SVM practically useful, and that the commonly used Leave-One-Out (LOO) method is computationally complex and time-consuming, an effective strategy for automatic parameter selection for SVM using Particle Swarm Optimization (PSO) is proposed in this paper. Simulation results on a practical data model demonstrate the effectiveness and high efficiency of the proposed approach.
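A minimal PSO sketch in the spirit of this approach: particles search a 2-D space standing in for (log2 C, log2 gamma). The quadratic surrogate objective below replaces the actual cross-validated SVM error, which would be far more expensive to evaluate; all parameter values are illustrative assumptions.

```python
import random

random.seed(0)

# Surrogate for cross-validation error as a function of (log2 C, log2 gamma);
# a real run would train an SVM and measure held-out error here instead.
def cv_error(c, g):
    return (c - 3.0) ** 2 + (g + 5.0) ** 2  # minimum at log2 C = 3, log2 gamma = -5

n_particles, n_iter = 20, 60
w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social coefficients
pos = [[random.uniform(-10, 10), random.uniform(-10, 10)] for _ in range(n_particles)]
vel = [[0.0, 0.0] for _ in range(n_particles)]
pbest = [p[:] for p in pos]
pbest_err = [cv_error(*p) for p in pos]
g_idx = min(range(n_particles), key=lambda i: pbest_err[i])
gbest, gbest_err = pbest[g_idx][:], pbest_err[g_idx]

for _ in range(n_iter):
    for i in range(n_particles):
        for d in range(2):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + c1 * r1 * (pbest[i][d] - pos[i][d])
                         + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        err = cv_error(*pos[i])
        if err < pbest_err[i]:
            pbest[i], pbest_err[i] = pos[i][:], err
            if err < gbest_err:
                gbest, gbest_err = pos[i][:], err
```

The swarm converges to the surrogate's minimum in a few dozen iterations; the appeal over grid search or LOO-driven search is that each particle evaluation reuses the same objective while the swarm shares information about promising regions.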

  14. Incremental Training for SVM-Based Classification with Keyword Adjusting

    SUN Jin-wen; YANG Jian-wu; LU Bin; XIAO Jian-guo

    2004-01-01

    This paper analyzed the theory of incremental learning of SVM (support vector machine) and pointed out a shortcoming of present research on SVM incremental learning: only support vector optimization is considered. Based on the significance of keywords in training, a new incremental training method considering keyword adjusting was proposed, which eliminates the difference between incremental learning and batch learning through the keyword adjusting. The experimental results show that the improved method outperforms the method without keyword adjusting and achieves the same precision as the batch method.

  15. SVM-based prediction of caspase substrate cleavage sites

    Wee, Lawrence JK; Tan, Tin Wee; Ranganathan, Shoba

    2006-01-01

    Background Caspases belong to a class of cysteine proteases which function as critical effectors in apoptosis and inflammation by cleaving substrates immediately after unique sites. Prediction of such cleavage sites will complement structural and functional studies on substrates cleavage as well as discovery of new substrates. Recently, different computational methods have been developed to predict the cleavage sites of caspase substrates with varying degrees of success. As the support vector...

  16. Question Categorization Using SVM Based on Different Term Weighting Methods

    Priyanka G Pillai

    2012-05-01

    Full Text Available This paper deals with the performance of Question Categorization based on four different term weighting methods. Term weighting methods such as tf*idf, qf*icf, iqf*qf*icf and vrf together with SVM classifier were used for categorization. From the experiments conducted using both linear and nonlinear SVM, term weighting method iqf*qf*icf showed better performance in question categorization than other methods.
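Of the four weighting schemes, tf*idf is the standard one; a minimal sketch on a hypothetical three-question corpus (the qf*icf-style variants from the paper are not reproduced here).

```python
import math

# Hypothetical question corpus; each document is one question.
docs = [
    "what city is the capital of france",
    "what is the capital of italy",
    "who wrote the iliad",
]
N = len(docs)
tokenized = [d.split() for d in docs]

def tf_idf(term, doc):
    tf = doc.count(term)                              # term frequency in this question
    df = sum(1 for d in tokenized if term in d)       # document frequency in corpus
    idf = math.log(N / df)                            # rare terms weigh more
    return tf * idf

# Weight vector (sparse, as a dict) for each question.
weights = [{t: tf_idf(t, doc) for t in doc} for doc in tokenized]
```

Terms present in every question (such as "the") get weight zero, while discriminative terms ("france", "iliad") dominate; these weighted vectors are what the SVM is trained on.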

  18. An SVM Based Approach for the Analysis Of Mammography Images

    Mammography is among the most popular imaging techniques used in the diagnosis of breast cancer. Nevertheless, distinguishing between healthy and ill images is hard even for an experienced radiologist, because a single image usually includes several regions of interest (ROIs). The difficulty of this classification problem, along with the substantial amount of data gathered from patients' medical history, motivates the use of a machine learning approach as part of a CAD (Computer Aided Detection) tool, aiming to assist radiologists in the characterization of mammography images. Specifically, our approach involves: i) ROI extraction, ii) feature vector extraction, iii) Support Vector Machine (SVM) classification of ROIs and iv) characterization of the whole image. We evaluate the performance of our approach in terms of the SVM's training and testing error and in terms of ROI specificity and sensitivity. The results show a relation between the number of features used and the SVM's performance.

  19. Accurate molecular classification of cancer using simple rules

    Gotoh Osamu; Wang Xiaosheng

    2009-01-01

    Abstract Background One intractable problem with using microarray data analysis for cancer classification is how to reduce the extremely high-dimensionality gene feature data to remove the effects of noise. Feature selection is often used to address this problem by selecting informative genes from among thousands or tens of thousands of genes. However, most of the existing methods of microarray-based cancer classification utilize too many genes to achieve accurate classification, which often ...

  20. Gene

    U.S. Department of Health & Human Services — Gene integrates information from a wide range of species. A record may include nomenclature, Reference Sequences (RefSeqs), maps, pathways, variations, phenotypes,...

  1. Towards accurate emergency response behavior

    Nuclear reactor operator emergency response behavior has persisted as a training problem through a lack of information. The industry needs an accurate definition of operator behavior in adverse stress conditions, and training methods which will produce the desired behavior. Newly assembled information from fifty years of research into human behavior in both high and low stress provides a more accurate definition of appropriate operator response, and supports training methods which will produce the needed control room behavior. The research indicates that operator response in emergencies is divided into two modes: conditioned behavior and knowledge-based behavior. Methods which assure accurate conditioned behavior, and provide for the recovery of knowledge-based behavior, are described in detail.

  2. Accurate determination of antenna directivity

    Dich, Mikael

    1997-01-01

    The derivation of a formula for accurate estimation of the total radiated power from a transmitting antenna for which the radiated power density is known in a finite number of points on the far-field sphere is presented. The main application of the formula is determination of directivity from power...

  3. Circadian transitions in radiation dose-dependent augmentation of mRNA levels for DNA damage-induced genes elicited by accurate real-time RT-PCR quantification

    Molecular mechanisms of the intracellular response to DNA damage from exposure to ionizing radiation have been studied. For cells isolated from the living bodies of humans and experimental animals, alteration of responsiveness by physiological oscillations such as the circadian rhythm must be considered. To examine the circadian variation in the response of the p53-responsive genes p21, mdm2, bax, and puma, we established a method to quantify their mRNA levels with high reproducibility and accuracy based on real-time reverse transcription polymerase chain reaction (RT-PCR), and compared the levels of responsiveness in mouse hemocytes after diurnal irradiation to those after nocturnal irradiation. Augmentation of p21 and mdm2 mRNA levels with growth arrest, and of puma mRNA before apoptosis, was confirmed by a time-course experiment in RAW264.7 cells, and dose-dependent increases in the peak levels of all the RNAs were shown. Similarly, the relative RNA levels of p21, mdm2, bax, and puma per glyceraldehyde-3-phosphate dehydrogenase (GAPDH) also increased dose-dependently in peripheral blood and bone marrow cells isolated from whole-body-irradiated mice. Induction levels of all messages were reduced by half after nighttime irradiation as compared with daytime irradiation in blood cells. In marrow cells, nighttime irradiation enhanced the p21 and mdm2 mRNA levels more than daytime irradiation did. No significant difference in bax or puma mRNA levels was observed between nighttime and daytime irradiation in marrow cells. This suggests that early-stage cellular responsiveness of DNA damage-induced genes is modulated between diurnal and nocturnal irradiation. (author)
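Relative mRNA levels per GAPDH of the kind reported here are commonly computed with the 2^-ΔΔCt (Livak) method; a sketch with hypothetical Ct values (the study's own normalization details may differ).

```python
# Hypothetical cycle-threshold (Ct) values; lower Ct = more transcript.
ct = {
    "irradiated": {"p21": 24.0, "gapdh": 18.0},
    "control":    {"p21": 27.0, "gapdh": 18.5},
}

def fold_change(sample, control, target, ref="gapdh"):
    # Delta-Ct normalises the target to the reference gene within each sample;
    # delta-delta-Ct compares treated vs. control, and 2**-ddCt gives fold change.
    d_sample = sample[target] - sample[ref]
    d_control = control[target] - control[ref]
    return 2.0 ** -(d_sample - d_control)

fc = fold_change(ct["irradiated"], ct["control"], "p21")
```

With these toy numbers the p21 transcript is about 5.7-fold induced relative to the unirradiated control; dose- or time-of-day-dependent changes would appear as changes in this fold value.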

  4. Accurate molecular classification of cancer using simple rules

    Gotoh Osamu

    2009-10-01

    Full Text Available Abstract Background One intractable problem with using microarray data analysis for cancer classification is how to reduce the extremely high-dimensionality gene feature data to remove the effects of noise. Feature selection is often used to address this problem by selecting informative genes from among thousands or tens of thousands of genes. However, most of the existing methods of microarray-based cancer classification utilize too many genes to achieve accurate classification, which often hampers the interpretability of the models. For a better understanding of the classification results, it is desirable to develop simpler rule-based models with as few marker genes as possible. Methods We screened a small number of informative single genes and gene pairs on the basis of their depended degrees proposed in rough sets. Applying the decision rules induced by the selected genes or gene pairs, we constructed cancer classifiers. We tested the efficacy of the classifiers by leave-one-out cross-validation (LOOCV) of training sets and classification of independent test sets. Results We applied our methods to five cancerous gene expression datasets: leukemia (acute lymphoblastic leukemia [ALL] vs. acute myeloid leukemia [AML]), lung cancer, prostate cancer, breast cancer, and leukemia (ALL vs. mixed-lineage leukemia [MLL] vs. AML). Accurate classification outcomes were obtained by utilizing just one or two genes. Some genes that correlated closely with the pathogenesis of relevant cancers were identified. In terms of both classification performance and algorithm simplicity, our approach outperformed or at least matched existing methods. Conclusion In cancerous gene expression datasets, a small number of genes, even one or two if selected correctly, is capable of achieving an ideal cancer classification effect. This finding also means that very simple rules may perform well for cancerous class prediction.
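The idea of classifying with a single informative gene, validated by LOOCV, can be sketched as follows; the expression values and the midpoint-threshold rule are illustrative assumptions, not the paper's rough-set depended-degree method.

```python
# Hypothetical expression values of one informative gene, with class labels.
samples = [(2.1, "ALL"), (1.8, "ALL"), (2.4, "ALL"),
           (5.9, "AML"), (6.3, "AML"), (5.2, "AML")]

def train_threshold(train):
    # Toy decision rule: cut at the midpoint between the two class means.
    all_vals = [x for x, y in train if y == "ALL"]
    aml_vals = [x for x, y in train if y == "AML"]
    return (sum(all_vals) / len(all_vals) + sum(aml_vals) / len(aml_vals)) / 2

# Leave-one-out cross-validation: hold out each sample, train on the rest.
correct = 0
for i, (x, y) in enumerate(samples):
    train = samples[:i] + samples[i + 1:]
    cut = train_threshold(train)
    pred = "ALL" if x < cut else "AML"
    correct += (pred == y)
loocv_accuracy = correct / len(samples)
```

When a single gene separates the classes this cleanly, even the simplest one-threshold rule achieves perfect LOOCV accuracy, which is the paper's central observation.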

  5. Accurate Modeling of Advanced Reflectarrays

    Zhou, Min

    … of the incident field, the choice of basis functions, and the technique to calculate the far-field. Based on accurate reference measurements of two offset reflectarrays carried out at the DTU-ESA Spherical Near-Field Antenna Test Facility, it was concluded that the three latter factors are particularly important … to the conventional phase-only optimization technique (POT), the geometrical parameters of the array elements are directly optimized to fulfill the far-field requirements, thus maintaining a direct relation between optimization goals and optimization variables. As a result, better designs can be obtained compared … using the GDOT to demonstrate its capabilities. To verify the accuracy of the GDOT, two offset contoured beam reflectarrays that radiate a high-gain beam on a European coverage have been designed and manufactured, and subsequently measured at the DTU-ESA Spherical Near-Field Antenna Test Facility …

  6. Accurate ab initio spin densities

    Boguslawski, Katharina; Legeza, Örs; Reiher, Markus

    2012-01-01

    We present an approach for the calculation of spin density distributions for molecules that require very large active spaces for a qualitatively correct description of their electronic structure. Our approach is based on the density-matrix renormalization group (DMRG) algorithm to calculate the spin density matrix elements as basic quantity for the spatially resolved spin density distribution. The spin density matrix elements are directly determined from the second-quantized elementary operators optimized by the DMRG algorithm. As an analytic convergence criterion for the spin density distribution, we employ our recently developed sampling-reconstruction scheme [J. Chem. Phys. 2011, 134, 224101] to build an accurate complete-active-space configuration-interaction (CASCI) wave function from the optimized matrix product states. The spin density matrix elements can then also be determined as an expectation value employing the reconstructed wave function expansion. Furthermore, the explicit reconstruction of a CA...

  7. Accurate thickness measurement of graphene

    Shearer, Cameron J.; Slattery, Ashley D.; Stapleton, Andrew J.; Shapter, Joseph G.; Gibson, Christopher T.

    2016-03-01

    Graphene has emerged as a material with a vast variety of applications. The electronic, optical and mechanical properties of graphene are strongly influenced by the number of layers present in a sample. As a result, the dimensional characterization of graphene films is crucial, especially with the continued development of new synthesis methods and applications. A number of techniques exist to determine the thickness of graphene films including optical contrast, Raman scattering and scanning probe microscopy techniques. Atomic force microscopy (AFM), in particular, is used extensively since it provides three-dimensional images that enable the measurement of the lateral dimensions of graphene films as well as the thickness, and by extension the number of layers present. However, in the literature AFM has proven to be inaccurate with a wide range of measured values for single layer graphene thickness reported (between 0.4 and 1.7 nm). This discrepancy has been attributed to tip-surface interactions, image feedback settings and surface chemistry. In this work, we use standard and carbon nanotube modified AFM probes and a relatively new AFM imaging mode known as PeakForce tapping mode to establish a protocol that will allow users to accurately determine the thickness of graphene films. In particular, the error in measuring the first layer is reduced from 0.1-1.3 nm to 0.1-0.3 nm. Furthermore, in the process we establish that the graphene-substrate adsorbate layer and imaging force, in particular the pressure the tip exerts on the surface, are crucial components in the accurate measurement of graphene using AFM. These findings can be applied to other 2D materials.

  8. A More Accurate Fourier Transform

    Courtney, Elya

    2015-01-01

    Fourier transform methods are used to analyze functions and data sets to provide frequencies, amplitudes, and phases of underlying oscillatory components. Fast Fourier transform (FFT) methods offer speed advantages over evaluation of explicit integrals (EI) that define Fourier transforms. This paper compares frequency, amplitude, and phase accuracy of the two methods for well resolved peaks over a wide array of data sets including cosine series with and without random noise and a variety of physical data sets, including atmospheric $\\mathrm{CO_2}$ concentrations, tides, temperatures, sound waveforms, and atomic spectra. The FFT uses MIT's FFTW3 library. The EI method uses the rectangle method to compute the areas under the curve via complex math. Results support the hypothesis that EI methods are more accurate than FFT methods. Errors range from 5 to 10 times higher when determining peak frequency by FFT, 1.4 to 60 times higher for peak amplitude, and 6 to 10 times higher for phase under a peak. The ability t...
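The explicit-integral (EI) approach with the rectangle method can be sketched as follows: evaluating the Fourier integral of a sampled cosine recovers its amplitude at the true frequency. Signal parameters are illustrative; no FFT library is used.

```python
import math

# Sampled cosine with known amplitude and frequency (no noise).
A, f0, N, T = 2.0, 5.0, 400, 1.0   # amplitude, Hz, sample count, seconds
dt = T / N
x = [A * math.cos(2 * math.pi * f0 * n * dt) for n in range(N)]

def fourier_amplitude(f):
    # Rectangle-rule evaluation of the explicit Fourier integral at frequency f.
    re = sum(x[n] * math.cos(2 * math.pi * f * n * dt) for n in range(N)) * dt
    im = -sum(x[n] * math.sin(2 * math.pi * f * n * dt) for n in range(N)) * dt
    return 2.0 / T * math.sqrt(re * re + im * im)

amp = fourier_amplitude(f0)        # amplitude at the true peak frequency
off_peak = fourier_amplitude(7.0)  # amplitude away from the peak
```

Because the integral can be evaluated at any frequency f, not just the FFT's fixed bins, the EI method can probe a peak's exact location, which is the source of the accuracy advantage the paper reports.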

  9. 38 CFR 4.46 - Accurate measurement.

    2010-07-01

    38 CFR, Pensions, Bonuses, and Veterans' Relief, Schedule for Rating Disabilities, The Musculoskeletal System, § 4.46 Accurate measurement: Accurate measurement of the length of stumps, excursion of joints, dimensions and location of scars with respect...

  10. Identification of Gene-Expression Signatures and Protein Markers for Breast Cancer Grading and Staging.

    Fang Yao

    Full Text Available The grade of a cancer is a measure of the cancer's malignancy level, and the stage of a cancer refers to the size and the extent that the cancer has spread. Here we present a computational method for prediction of gene signatures and blood/urine protein markers for breast cancer grades and stages based on RNA-seq data, which are retrieved from the TCGA breast cancer dataset and cover 111 pairs of disease and matching adjacent noncancerous tissues with pathologist-assigned stages and grades. By applying a differential expression and an SVM-based classification approach, we found that 324 and 227 genes in cancer have their expression levels consistently up-regulated vs. their matching controls in a grade- and stage-dependent manner, respectively. By using these genes, we predicted a 9-gene panel as a gene signature for distinguishing poorly differentiated from moderately and well differentiated breast cancers, and a 19-gene panel as a gene signature for discriminating between the moderately and well differentiated breast cancers. Similarly, a 30-gene panel and a 21-gene panel are predicted as gene signatures for distinguishing advanced stage (stages III-IV) from early stage (stages I-II) cancer samples and for distinguishing stage II from stage I samples, respectively. We expect these gene panels can be used as gene-expression signatures for cancer grade and stage classification. In addition, of the 324 grade-dependent genes, 188 and 66 encode proteins that are predicted to be blood-secretory and urine-excretory, respectively; and of the 227 stage-dependent genes, 123 and 51 encode proteins predicted to be blood-secretory and urine-excretory, respectively. We anticipate that some combinations of these blood and urine proteins could serve as markers for monitoring breast cancer at specific grades and stages through blood and urine tests.
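The differential-expression screening step, selecting genes consistently up-regulated in tumors vs. matched controls, can be sketched with a toy fold-change filter; gene names and values are hypothetical, and the subsequent SVM classification stage is omitted.

```python
# Hypothetical expression means: rows = genes, columns = paired samples.
tumor  = {"GENE_A": [8.0, 7.5, 8.2], "GENE_B": [3.1, 3.0, 2.9], "GENE_C": [6.0, 6.4, 5.8]}
normal = {"GENE_A": [2.0, 2.2, 1.9], "GENE_B": [3.0, 3.2, 2.8], "GENE_C": [2.1, 2.0, 2.3]}

def mean(v):
    return sum(v) / len(v)

# Keep genes whose mean tumor expression is at least 2-fold above matched normals.
up_regulated = sorted(g for g in tumor if mean(tumor[g]) >= 2.0 * mean(normal[g]))
```

Genes passing such a filter across many pairs form the candidate pool from which small signature panels (like the 9- and 19-gene panels above) are then selected by the classifier.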

  11. Laboratory Building for Accurate Determination of Plutonium

    2008-01-01

    The accurate determination of plutonium is one of the most important assay techniques for nuclear fuel; it is also the key to chemical measurement transfer and the basis of nuclear material balance.

  12. Invariant Image Watermarking Using Accurate Zernike Moments

    Ismail A. Ismail

    2010-01-01

    Full Text Available Problem statement: Digital image watermarking is the most popular method for image authentication, copyright protection and content description. Zernike moments are the most widely used moments in image processing and pattern recognition. The magnitudes of Zernike moments are rotation invariant, so they can be used just as a watermark signal or be further modified to carry embedded data. Zernike moments computed in Cartesian coordinates are not accurate due to geometrical and numerical errors. Approach: In this study, we employed a robust image-watermarking algorithm using accurate Zernike moments. These moments are computed in polar coordinates, where both approximation and geometric errors are removed. Accurate Zernike moments are used in image watermarking and proved to be robust against different kinds of geometric attacks. The performance of the proposed algorithm is evaluated using standard images. Results: Experimental results show that accurate Zernike moments achieve a higher degree of robustness than approximated ones against rotation, scaling, flipping, shearing and affine transformation. Conclusion: By computing accurate Zernike moments, the embedded watermark bits can be extracted at a low error rate.

  13. Accurate atomic data for industrial plasma applications

    Griesmann, U.; Bridges, J.M.; Roberts, J.R.; Wiese, W.L.; Fuhr, J.R. [National Inst. of Standards and Technology, Gaithersburg, MD (United States)

    1997-12-31

    Reliable branching fraction, transition probability and transition wavelength data for radiative dipole transitions of atoms and ions in plasma are important in many industrial applications. Optical plasma diagnostics and modeling of the radiation transport in electrical discharge plasmas (e.g. in electrical lighting) depend on accurate basic atomic data. NIST has an ongoing experimental research program to provide accurate atomic data for radiative transitions. The new NIST UV-vis-IR high resolution Fourier transform spectrometer has become an excellent tool for accurate and efficient measurements of numerous transition wavelengths and branching fractions in a wide wavelength range. Recently, the authors have also begun to employ photon counting techniques for very accurate measurements of branching fractions of weaker spectral lines with the intent to improve the overall accuracy for experimental branching fractions to better than 5%. They have now completed their studies of transition probabilities of Ne I and Ne II. The results agree well with recent calculations and for the first time provide reliable transition probabilities for many weak intercombination lines.

  14. More accurate picture of human body organs

    Computerized tomography and nuclear magnetic resonance tomography (NMRT) are revolutionary contributions to radiodiagnosis because they allow a more accurate image of human body organs to be obtained. The principles of both methods are described. Attention is mainly devoted to NMRT, which has been in clinical use for only three years. It does not burden the organism with ionizing radiation. (Ha)

  15. Feedback about more accurate versus less accurate trials: differential effects on self-confidence and activation.

    Badami, Rokhsareh; VaezMousavi, Mohammad; Wulf, Gabriele; Namazizadeh, Mahdi

    2012-06-01

    One purpose of the present study was to examine whether self-confidence or anxiety would be differentially affected by feedback from more accurate rather than less accurate trials. The second purpose was to determine whether arousal variations (activation) would predict performance. On day 1, participants performed a golf putting task under one of two conditions: one group received feedback on the most accurate trials, whereas another group received feedback on the least accurate trials. On day 2, participants completed an anxiety questionnaire and performed a retention test. Skin conductance level, as a measure of arousal, was determined. The results indicated that feedback about more accurate trials resulted in more effective learning as well as increased self-confidence. Also, activation was a predictor of performance. PMID:22808705

  16. A gene expression biomarker accurately predicts estrogen receptor α modulation in a human gene expression compendium

    The EPA’s vision for the Endocrine Disruptor Screening Program (EDSP) in the 21st Century (EDSP21) includes utilization of high-throughput screening (HTS) assays coupled with computational modeling to prioritize chemicals with the goal of eventually replacing current Tier 1...

  17. Accurate guitar tuning by cochlear implant musicians.

    Thomas Lu

    Full Text Available Modern cochlear implant (CI) users understand speech but find difficulty in music appreciation due to poor pitch perception. Still, some deaf musicians continue to perform with their CI. Here we show unexpected results that CI musicians can reliably tune a guitar by CI alone and, under controlled conditions, match simultaneously presented tones to <0.5 Hz. One subject had normal contralateral hearing and produced more accurate tuning with CI than his normal ear. To understand these counterintuitive findings, we presented tones sequentially and found that tuning error was larger at ∼30 Hz for both subjects. A third subject, a non-musician CI user with normal contralateral hearing, showed similar trends in performance between CI and normal hearing ears but with less precision. This difference, along with electric analysis, showed that accurate tuning was achieved by listening to beats rather than discriminating pitch, effectively turning a spectral task into a temporal discrimination task.

  19. How Accurate is inv(A)*b?

    Druinsky, Alex

    2012-01-01

    Several widely-used textbooks lead the reader to believe that solving a linear system of equations Ax = b by multiplying the vector b by a computed inverse inv(A) is inaccurate. Virtually all other textbooks on numerical analysis and numerical linear algebra advise against using computed inverses without stating whether this is accurate or not. In fact, under reasonable assumptions on how the inverse is computed, x = inv(A)*b is as accurate as the solution computed by the best backward-stable solvers. This fact is not new, but obviously obscure. We review the literature on the accuracy of this computation and present a self-contained numerical analysis of it.
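A minimal numerical check of the claim, on a small well-conditioned system with values chosen purely for illustration: solving Ax = b via an explicitly computed 2x2 inverse yields an accurate solution with a tiny residual.

```python
# Solve A x = b via the explicit 2x2 adjugate-formula inverse.
A = [[4.0, 7.0], [2.0, 6.0]]
b = [18.0, 14.0]

det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
inv_A = [[ A[1][1] / det, -A[0][1] / det],
         [-A[1][0] / det,  A[0][0] / det]]

# x = inv(A) * b
x = [inv_A[0][0] * b[0] + inv_A[0][1] * b[1],
     inv_A[1][0] * b[0] + inv_A[1][1] * b[1]]

# Residual b - A x; a small residual indicates an accurate solve.
residual = [b[i] - (A[i][0] * x[0] + A[i][1] * x[1]) for i in range(2)]
```

Here the exact solution is x = (1, 2), and the inverse-based computation reproduces it to machine precision; the paper's point is that, under reasonable assumptions on how the inverse is computed, this remains competitive with backward-stable solvers far more generally than textbooks suggest.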

  20. Accurate, reproducible measurement of blood pressure.

    Campbell, N. R.; Chockalingam, A; Fodor, J. G.; McKay, D. W.

    1990-01-01

    The diagnosis of mild hypertension and the treatment of hypertension require accurate measurement of blood pressure. Blood pressure readings are altered by various factors that influence the patient, the techniques used and the accuracy of the sphygmomanometer. The variability of readings can be reduced if informed patients prepare in advance by emptying their bladder and bowel, by avoiding over-the-counter vasoactive drugs the day of measurement and by avoiding exposure to cold, caffeine con...

  1. Accurate Finite Difference Methods for Option Pricing

    Persson, Jonas

    2006-01-01

    Stock options are priced numerically using space- and time-adaptive finite difference methods. European options on one and several underlying assets are considered. These are priced with adaptive numerical algorithms including a second order method and a more accurate method. For American options we use the adaptive technique to price options on one stock with and without stochastic volatility. In all these methods emphasis is put on the control of errors to fulfill predefined tolerance level...

  2. Towards accurate modeling of moving contact lines

    Holmgren, Hanna

    2015-01-01

    The present thesis treats the numerical simulation of immiscible incompressible two-phase flows with moving contact lines. The conventional Navier–Stokes equations combined with a no-slip boundary condition lead to a non-integrable stress singularity at the contact line. The singularity in the model can be avoided by allowing the contact line to slip. Implementing slip conditions in an accurate way is not straightforward and different regularization techniques exist where ad-hoc procedures ...

  3. Accurate variational forms for multiskyrmion configurations

    Jackson, A.D.; Weiss, C.; Wirzba, A.; Lande, A.

    1989-04-17

    Simple variational forms are suggested for the fields of a single skyrmion on a hypersphere, S_3(L), and of a face-centered cubic array of skyrmions in flat space, R_3. The resulting energies are accurate at the level of 0.2%. These approximate field configurations provide a useful alternative to brute-force solutions of the corresponding Euler equations.

  4. Efficient Accurate Context-Sensitive Anomaly Detection

    2007-01-01

    For program behavior-based anomaly detection, the only way to ensure accurate monitoring is to construct an efficient and precise program behavior model. A new program behavior-based anomaly detection model, called the combined pushdown automaton (CPDA) model, was proposed, which is based on static binary executable analysis. The CPDA model incorporates the optimized call stack walk and code instrumentation technique to gain complete context information. Thereby the proposed method can detect more attacks, while retaining good performance.

  5. Accurate phase-shift velocimetry in rock

    Shukla, Matsyendra Nath; Vallatos, Antoine; Phoenix, Vernon R.; Holmes, William M.

    2016-06-01

    Spatially resolved Pulsed Field Gradient (PFG) velocimetry techniques can provide precious information concerning flow through opaque systems, including rocks. This velocimetry data is used to enhance flow models in a wide range of systems, from oil behaviour in reservoir rocks to contaminant transport in aquifers. Phase-shift velocimetry is the fastest way to produce velocity maps but critical issues have been reported when studying flow through rocks and porous media, leading to inaccurate results. Combining PFG measurements for flow through Bentheimer sandstone with simulations, we demonstrate that asymmetries in the molecular displacement distributions within each voxel are the main source of phase-shift velocimetry errors. We show that when flow-related average molecular displacements are negligible compared to self-diffusion ones, symmetric displacement distributions can be obtained while phase measurement noise is minimised. We elaborate a complete method for the production of accurate phase-shift velocimetry maps in rocks and low porosity media and demonstrate its validity for a range of flow rates. This development of accurate phase-shift velocimetry now enables more rapid and accurate velocity analysis, potentially helping to inform both industrial applications and theoretical models.

  6. Genes and Gene Therapy

    ... correctly, a child can have a genetic disorder. Gene therapy is an experimental technique that uses genes to ... or prevent disease. The most common form of gene therapy involves inserting a normal gene to replace an ...

  7. High Frequency QRS ECG Accurately Detects Cardiomyopathy

    Schlegel, Todd T.; Arenare, Brian; Poulin, Gregory; Moser, Daniel R.; Delgado, Reynolds

    2005-01-01

    High frequency (HF, 150-250 Hz) analysis over the entire QRS interval of the ECG is more sensitive than conventional ECG for detecting myocardial ischemia. However, the accuracy of HF QRS ECG for detecting cardiomyopathy is unknown. We obtained simultaneous resting conventional and HF QRS 12-lead ECGs in 66 patients with cardiomyopathy (EF = 23.2 ± 6.1%, mean ± SD) and in 66 age- and gender-matched healthy controls using PC-based ECG software recently developed at NASA. The single most accurate ECG parameter for detecting cardiomyopathy was an HF QRS morphological score that takes into consideration the total number and severity of reduced amplitude zones (RAZs) present plus the clustering of RAZs together in contiguous leads. This RAZ score had an area under the receiver operating characteristic (ROC) curve of 0.91, and was 88% sensitive, 82% specific and 85% accurate for identifying cardiomyopathy at an optimum score cut-off of 140 points. Although conventional ECG parameters such as the QRS and QTc intervals were also significantly longer in patients than controls (P < 0.001, BBBs excluded), these conventional parameters were less accurate (area under the ROC = 0.77 and 0.77, respectively) than HF QRS morphological parameters for identifying underlying cardiomyopathy. The total amplitude of the HF QRS complexes, as measured by summed root mean square voltages (RMSVs), also differed between patients and controls (33.8 ± 11.5 vs. 41.5 ± 13.6 mV, respectively, P < 0.003), but this parameter was even less accurate in distinguishing the two groups (area under ROC = 0.67) than the HF QRS morphologic and conventional ECG parameters. Diagnostic accuracy was optimal (86%) when the RAZ score from the HF QRS ECG and the QTc interval from the conventional ECG were used simultaneously with cut-offs of ≥40 points and ≥445 ms, respectively. In conclusion 12-lead HF QRS ECG employing

  8. Performance Analysis of a DTC and SVM Based Field-Orientation Control Induction Motor Drive

    Md. Rashedul Islam; Pintu Kumar Sadhu; Md. Maruful Islam; Md. Kamal Hossain

    2015-01-01

    This study presents a performance analysis of the two most popular control strategies for induction motor (IM) drives: direct torque control (DTC) and space vector modulation (SVM). The performance analysis is done by applying the field-orientation control (FOC) technique because of its good dynamic response. The theoretical principles and simulation results are discussed to study the dynamic performance of the drive system for the individual control strategies using actual parameters of inductio...

  9. Performance Analysis of a DTC and SVM Based Field-Orientation Control Induction Motor Drive

    Md. Rashedul Islam

    2015-02-01

    Full Text Available This study presents a performance analysis of the two most popular control strategies for induction motor (IM) drives: direct torque control (DTC) and space vector modulation (SVM). The performance analysis is done by applying the field-orientation control (FOC) technique because of its good dynamic response. The theoretical principles and simulation results are discussed to study the dynamic performance of the drive system for the individual control strategies, using actual parameters of the induction motor. A closed-loop PI controller scheme has been used. The main purpose of this study is to minimize the ripple in the torque response curve and to achieve a quick speed response, as well as to investigate the conditions for optimum performance of the induction motor drive. Based on the simulation results, this study also presents a detailed comparison between direct torque control and space vector modulation based field-orientation control for the induction motor drive.

  10. SVM-based spectrum mobility prediction scheme in mobile cognitive radio networks.

    Wang, Yao; Zhang, Zhongzhao; Ma, Lin; Chen, Jiamei

    2014-01-01

    Spectrum mobility, as an essential issue, has not been fully investigated in mobile cognitive radio networks (CRNs). In this paper, a novel support vector machine based spectrum mobility prediction (SVM-SMP) scheme is presented that considers time-varying and space-varying characteristics simultaneously in mobile CRNs. The mobility of cognitive users (CUs) and the working activities of primary users (PUs) are analyzed in theory, and a joint feature vector extraction (JFVE) method is proposed based on the theoretical analysis. Spectrum mobility prediction is then executed through SVM classification with a fast convergence speed. Numerical results validate that SVM-SMP achieves better short-time prediction accuracy and miss-prediction rate than two algorithms that depend only on location and speed information. Additionally, a rational parameter design can remedy the prediction performance degradation caused by high-speed SUs with strongly random movements. PMID:25143975

  11. SVM-based classification of LV wall motion in cardiac MRI with the assessment of STE

    Mantilla, Juan; Garreau, Mireille; Bellanger, Jean-Jacques; Paredes, José Luis

    2015-01-01

    In this paper, we propose an automated method to classify normal/abnormal wall motion in Left Ventricle (LV) function in cardiac cine-Magnetic Resonance Imaging (MRI), taking as reference strain information obtained from 2D Speckle Tracking Echocardiography (STE). Without the need for pre-processing, and by exploiting all the images acquired during a cardiac cycle, spatio-temporal profiles are extracted from a subset of radial lines from the ventricle centroid to points outside the epicardial border. Classical Support Vector Machines (SVM) are used to classify features extracted from the gray levels of the spatio-temporal profiles, as well as their representations in the wavelet domain, under the assumption that the data may be sparse in that domain. Based on information obtained from radial strain curves in 2D-STE studies, we label all the spatio-temporal profiles that belong to a particular segment as normal if the peak systolic radial strain curve of this segment presents normal kinesis, or abnormal if the peak systolic radial strain curve presents hypokinesis or akinesis. For this study, short-axis cine-MR images are collected from 9 patients with cardiac dyssynchrony, for whom we have the radial strain tracings at the mid-papillary muscle obtained by 2D STE, and from a control group of 9 healthy subjects. The best classification performance is obtained with the gray-level information of the spatio-temporal profiles using an RBF kernel, with 91.88% accuracy, 92.75% sensitivity and 91.52% specificity.

  12. Settlement Prediction of Road Soft Foundation Using a Support Vector Machine (SVM Based on Measured Data

    Yu Huiling

    2016-01-01

    Full Text Available The support vector machine (SVM) is a relatively new artificial intelligence technique, based on statistical learning theory, that is increasingly being applied to geotechnical problems with encouraging results. A case study based on a road foundation engineering project shows that the forecast results are in good agreement with the measured data. The SVM model is also compared with a BP artificial neural network model and the traditional hyperbola method. The prediction results indicate that the SVM model has better prediction ability than the BP neural network model and the hyperbola method; therefore, settlement prediction based on the SVM model can reflect the actual settlement process more correctly. The results indicate that the method is effective and feasible, and that the nonlinear mapping relation between foundation settlement and its influencing factors can be expressed well. It provides a new method to predict foundation settlement.
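As a rough illustration of the settlement-prediction idea in this record (not the paper's actual model or measured data), support vector regression can be fitted to a hyperbola-like settlement curve and then queried beyond the last measurement; the time values, curve shape and SVR parameters below are all hypothetical:

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical settlement-vs-time data following a hyperbola-like curve,
# standing in for measured road-foundation settlements.
t = np.linspace(0.0, 10.0, 50).reshape(-1, 1)   # time (months)
s = 30.0 * t.ravel() / (5.0 + t.ravel())        # settlement (mm)

# RBF-kernel support vector regression; C and gamma are illustrative.
model = SVR(kernel="rbf", C=100.0, gamma=0.1).fit(t, s)
future = model.predict(np.array([[12.0]]))      # predict beyond the data
```

The same pattern applies when the inputs are multi-dimensional influence factors rather than time alone.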

  13. SVM-based posture identification with a single waist-located triaxial accelerometer

    Rodríguez Martín, Daniel Manuel; Samà Monsonís, Albert; Pérez López, Carlos; Català Mallofré, Andreu; Cabestany Moncusí, Joan; Rodríguez Molinero, Alejandro

    2013-01-01

    Analysis of human body movement is an important research area, especially for health applications. In order to assess the quality of life of people with mobility problems, such as Parkinson's disease or stroke patients, it is crucial to monitor and assess their daily life activities. The main goal of this work is the characterization of basic activities using a single triaxial accelerometer located at the waist. This paper presents a novel postural detection algorithm based on SVM methods which is ...

  14. Diagnosis of Elevator Faults with LS-SVM Based on Optimization by K-CV

    Zhou Wan; Shilin Yi; Kun Li; Ran Tao; Min Gou; Xinshi Li; Shu Guo

    2015-01-01

    Several common elevator malfunctions were diagnosed with a least squares support vector machine (LS-SVM). After acquiring vibration signals of various elevator functions, their energy characteristics and time-domain indicators were extracted by optimal wavelet packet analysis in order to construct a feature vector of malfunctions, used as the input of the LS-SVM for identifying the causes of the malfunctions. Meanwhile, the parameters of the LS-SVM were optimized by K-fold cross validation (K-CV)...
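The K-CV parameter-tuning idea in this record can be sketched with a standard cross-validated grid search; this uses scikit-learn's ordinary SVC rather than a true LS-SVM, and the synthetic data and parameter grid are placeholders, not the paper's:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic stand-in for the elevator vibration feature vectors.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# K-fold cross validation (here K=5) over a small (C, gamma) grid,
# mirroring the K-CV tuning of the SVM hyperparameters.
search = GridSearchCV(
    SVC(kernel="rbf"),
    param_grid={"C": [0.1, 1.0, 10.0], "gamma": [0.01, 0.1, 1.0]},
    cv=5,
)
search.fit(X, y)
best_params, best_acc = search.best_params_, search.best_score_
```

Each candidate parameter pair is scored by its mean accuracy over the K held-out folds, and the best pair is refitted on the full training set.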

  15. SVM-based Multiview Face Recognition by Generalization of Discriminant Analysis

    Kisku, Dakshina Ranjan; Sing, Jamuna Kanta; Gupta, Phalguni

    2010-01-01

    Identity verification of authentic persons by their multiview faces is a real-valued problem in machine vision. Multiview face recognition is difficult due to the non-linear representation of faces in the feature space. This paper illustrates the usability of the generalization of LDA, in the form of the canonical covariate, for recognition of multiview faces. In the proposed work, a Gabor filter bank is used to extract facial features characterized by spatial frequency, spatial locality and orientation. The Gabor face representation captures a substantial amount of the variation in face instances that often occurs due to illumination, pose and facial expression changes. Convolution of the Gabor filter bank with face images of rotated profile views produces Gabor faces with high-dimensional feature vectors. The canonical covariate is then applied to the Gabor faces to reduce the high-dimensional feature spaces into low-dimensional subspaces. Finally, support vector machines are trained with canonical sub-spaces that contain reduced set o...

  16. SVM-based Multiview Face Recognition by Generalization of Discriminant Analysis

    Kisku, Dakshina Ranjan; Mehrotra, Hunny; Sing, Jamuna Kanta; Gupta, Phalguni

    2010-01-01

    Identity verification of authentic persons by their multiview faces is a real valued problem in machine vision. Multiview faces are having difficulties due to non-linear representation in the feature space. This paper illustrates the usability of the generalization of LDA in the form of canonical covariate for face recognition to multiview faces. In the proposed work, the Gabor filter bank is used to extract facial features that characterized by spatial frequency, spatial locality and orienta...

  17. SVM-based base-metal prospectivity modeling of the Aravalli Orogen, Northwestern India

    Porwal, Alok; Yu, Le; Gessner, Klaus

    2010-05-01

    The Proterozoic Aravalli orogen in the state of Rajasthan, northwestern India, constitutes the most important metallogenic province for base-metal deposits in India and hosts the entire economically viable lead-zinc resource-base of the country. The orogen evolved through near-orderly Wilson cycles of repeated extensional and compressional tectonics resulting in sequential opening and closing of intracratonic rifts and amalgamation of crustal domains during a circa 1.0-Ga geological history from 2.2 Ga to 1.0 Ga. This study develops a conceptual tectonostratigraphic model of the orogen based on a synthesis of the available geological, geophysical and geochronological data followed by deep-seismic-reflectivity-constrained 2-D forward gravity modeling, and links it to the Proterozoic base-metal metallogeny in the orogen in order to identify key geological controls on the base-metal mineralization. These controls are translated into exploration criteria for base-metal deposits, validated using empirical spatial analysis, and used to derive input spatial variables for model-based base-metal prospectivity mapping of the orogen. A support vector machine (SVM) algorithm augmented by incorporating a feature selection procedure is used in a GIS environment to implement the prospectivity mapping. A comparison of the SVM-derived prospectivity map with the ones derived using other established models such as neural-networks, logistic regression, and Bayesian weights-of-evidence indicates that the SVM outperforms other models, which is attributed to the capability of the SVM to return robust classification based on small training datasets.

  18. SVM Based Recognition of Facial Expressions Used In Indian Sign Language

    Daleesha M Viswanathan; Sumam Mary Idicula

    2015-01-01

    In sign language systems, facial expressions are an intrinsic component that usually accompanies hand gestures. The facial expressions would modify or change the meaning of hand gesture into a statement, a question or improve the meaning and understanding of hand gestures. The scientific literature available in Indian Sign Language (ISL) on facial expression recognition is scanty. Contrary to American Sign Language (ASL), head movements are less conspicuous in ISL and the answers to questions...

  19. SVM-Based CAC System for B-Mode Kidney Ultrasound Images.

    Subramanya, M B; Kumar, Vinod; Mukherjee, Shaktidev; Saini, Manju

    2015-08-01

    The present study proposes a computer-aided classification (CAC) system for three kidney classes, viz. normal, medical renal disease (MRD) and cyst, using B-mode ultrasound images. Thirty-five B-mode kidney ultrasound images, consisting of 11 normal images, 8 MRD images and 16 cyst images, have been used. Regions of interest (ROIs) have been marked by the radiologist from the parenchyma region of the kidney in the normal and MRD cases and from regions inside lesions in the cyst cases. To evaluate the contribution of texture features extracted from de-speckled images to the classification task, the original images have been pre-processed by eight de-speckling methods. Six categories of texture features are extracted. A one-against-one multi-class support vector machine (SVM) classifier has been used in the present work. Based on overall classification accuracy (OCA), features from ROIs of original images are concatenated with the features from ROIs of pre-processed images. On the basis of OCA, a few feature sets are considered for feature selection. Differential evolution feature selection (DEFS) has been used to select optimal features for the classification task. The DEFS process is repeated 30 times to obtain 30 subsets. Run-length matrix features from ROIs of images pre-processed by Lee's sigma method, concatenated with those of the enhanced Lee method, resulted in an average accuracy (in %) and standard deviation of 86.3 ± 1.6. The results obtained in the study indicate that the performance of the proposed CAC system is promising, and it can be used by radiologists in routine clinical practice for the classification of renal diseases. PMID:25537457
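The one-against-one multi-class strategy used in this record trains one binary SVM per pair of classes (three classifiers for three classes) and lets the pairwise votes decide. A minimal sketch on a stand-in three-class dataset (Iris substitutes for the kidney image features, which are not available here):

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Iris stands in for the three kidney classes (normal / MRD / cyst).
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# SVC trains one-against-one internally: one binary SVM per pair of
# classes, i.e. 3 pairwise classifiers for 3 classes.
clf = SVC(decision_function_shape="ovo").fit(X_tr, y_tr)
pairwise_scores = clf.decision_function(X_te[:1])   # 3 pairwise scores
oca = clf.score(X_te, y_te)   # overall classification accuracy (OCA)
```

With `decision_function_shape="ovo"` the decision function exposes one score per class pair, which makes the pairwise-voting structure explicit.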

  20. Component Content Soft-Sensor of SVM Based on Ions Color Characteristics

    Zhang Kunpeng

    2012-10-01

    Full Text Available In consideration of the different characteristic colors of ions in the P507-HCL Pr/Nd extraction separation system, the ion color image features H, S and I, which are closely related to the element component contents, are extracted using image processing methods. A Principal Component Analysis algorithm is employed to determine the statistical means of H, S and I that have the strongest correlation with element component content, and the auxiliary variables are obtained. With a support vector machine algorithm, a component-content soft-sensor model of the Pr/Nd extraction process is established. Finally, simulations and tests verify the rationality and feasibility of the proposed method. The research results provide a theoretical foundation for online measurement of the component content in the Pr/Nd countercurrent extraction separation process.

  1. Component Content Soft-Sensor of SVM Based on Ions Color Characteristics

    Zhang Kunpeng; Yang Hui; Lu Rongxiu

    2012-01-01

    In consideration of different characteristic colors of Ions in the P507-HCL Pr/Nd extraction separation system, ions color image feature H, S, I that closely related to the element component contents are extracted by using image processing method. Principal Component Analysis algorithm is employed to determine statistics mean of H, S, I which has the stronger correlation with element component content and the auxiliary variables are obtained. With the algorithm of support vector machine, a co...

  2. Restoring the Generalizability of SVM Based Decoding in High Dimensional Neuroimage Data

    Abrahamsen, Trine Julie; Hansen, Lars Kai

    2011-01-01

    Variance inflation is caused by a mismatch between linear projections of test and training data when projections are estimated on training sets smaller than the dimensionality of the feature space. We demonstrate that variance inflation can lead to an increased neuroimage decoding error rate for ...

  3. SVM-based feature extraction and classification of aflatoxin contaminated corn using fluorescence hyperspectral data

    Support Vector Machine (SVM) was used in the Genetic Algorithms (GA) process to select and classify a subset of hyperspectral image bands. The method was applied to fluorescence hyperspectral data for the detection of aflatoxin contamination in Aspergillus flavus infected single corn kernels. In the...

  4. SVM Based Recognition of Facial Expressions Used In Indian Sign Language

    Daleesha M Viswanathan

    2015-02-01

    Full Text Available In sign language systems, facial expressions are an intrinsic component that usually accompanies hand gestures. The facial expressions modify or change the meaning of a hand gesture into a statement or a question, or improve the meaning and understanding of hand gestures. The scientific literature available on facial expression recognition in Indian Sign Language (ISL) is scanty. Contrary to American Sign Language (ASL), head movements are less conspicuous in ISL, and the answers to questions such as yes or no are signed by hand. The purpose of this paper is to present our work in recognizing facial expression changes in isolated ISL sentences. A facial gesture pattern results in changes of skin texture by forming wrinkles and furrows. The Gabor wavelet method is well known for capturing subtle textural changes on surfaces. Therefore, a unique approach was developed to model facial expression changes with Gabor wavelet parameters chosen from partitioned face areas. These parameters were incorporated with a Euclidean distance measure. A multi-class SVM classifier was used in this recognition system to identify facial expressions in isolated facial expression sequences in ISL. An accuracy of 92.12% was achieved by our proposed system.

  5. SVM-Based Spectrum Mobility Prediction Scheme in Mobile Cognitive Radio Networks

    Yao Wang; Zhongzhao Zhang; Lin Ma; Jiamei Chen

    2014-01-01

    Spectrum mobility as an essential issue has not been fully investigated in mobile cognitive radio networks (CRNs). In this paper, a novel support vector machine based spectrum mobility prediction (SVM-SMP) scheme is presented considering time-varying and space-varying characteristics simultaneously in mobile CRNs. The mobility of cognitive users (CUs) and the working activities of primary users (PUs) are analyzed in theory. And a joint feature vector extraction (JFVE) method is proposed based...

  6. A SVM-based method for sentiment analysis in Persian language

    Hajmohammadi, Mohammad Sadegh; Ibrahim, Roliana

    2013-03-01

    Persian is the official language of Iran, Tajikistan and Afghanistan. Local online users often express their opinions and experiences on the web in written Persian. Although the information in those reviews is valuable to potential consumers and sellers, the huge number of web reviews makes it difficult to give an unbiased evaluation of a product. In this paper, the standard machine learning techniques SVM and naive Bayes are applied to the domain of online Persian movie reviews to automatically classify user reviews as positive or negative, and the performance of the two classifiers in this language is compared. The effects of feature presentations on classification performance are discussed. We find that accuracy is influenced by the interaction between the classification models and the feature options. The SVM classifier achieves accuracy as good as or better than naive Bayes on Persian movie reviews. Unigrams prove to be better features than bigrams and trigrams in capturing Persian sentiment orientation.
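The SVM-versus-naive-Bayes comparison on unigram features described in this record can be sketched in a few lines; the tiny English corpus below is a hypothetical stand-in for the Persian movie reviews, which are not reproduced here:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy stand-in for the Persian movie-review corpus.
docs = ["great film", "loved it", "wonderful acting",
        "terrible plot", "hated it", "boring film"]
labels = [1, 1, 1, 0, 0, 0]   # 1 = positive, 0 = negative

train_scores = {}
for model in (LinearSVC(), MultinomialNB()):
    # ngram_range=(1, 1) restricts the features to unigrams.
    clf = make_pipeline(CountVectorizer(ngram_range=(1, 1)), model)
    clf.fit(docs, labels)
    train_scores[type(model).__name__] = clf.score(docs, labels)
```

Switching `ngram_range` to `(2, 2)` or `(3, 3)` reproduces the bigram/trigram feature options the paper compares against.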

  7. Linear SVM-Based Android Malware Detection for Reliable IoT Services

    Hyo-Sik Ham; Hwan-Hee Kim; Myung-Sup Kim; Mi-Jung Choi

    2014-01-01

    Currently, many Internet of Things (IoT) services are monitored and controlled through smartphone applications. By combining IoT with smartphones, many convenient IoT services have been provided to users. However, there are adverse underlying effects in such services, including invasion of privacy and information leakage. In most cases, mobile devices have become cluttered with important personal user information as various services and contents are provided through them. Accordingly, attackers a...

  8. An SVM-Based Solution for Fault Detection in Wind Turbines

    Santos, Pedro; Villa, Luisa F.; Reñones, Aníbal; Bustillo, Andres; Maudes, Jesús

    2015-01-01

    Research into fault diagnosis in machines with a wide range of variable loads and speeds, such as wind turbines, is of great industrial interest. Analysis of the power signals emitted by wind turbines for the diagnosis of mechanical faults in their mechanical transmission chain is insufficient. A successful diagnosis requires the inclusion of accelerometers to evaluate vibrations. This work presents a multi-sensory system for fault diagnosis in wind turbines, combined with a data-mining solution for the classification of the operational state of the turbine. The selected sensors are accelerometers, in which vibration signals are processed using angular resampling techniques and electrical, torque and speed measurements. Support vector machines (SVMs) are selected for the classification task, including two traditional and two promising new kernels. This multi-sensory system has been validated on a test-bed that simulates the real conditions of wind turbines with two fault typologies: misalignment and imbalance. Comparison of SVM performance with the results of artificial neural networks (ANNs) shows that linear kernel SVM outperforms other kernels and ANNs in terms of accuracy, training and tuning times. The suitability and superior performance of linear SVM is also experimentally analyzed, to conclude that this data acquisition technique generates linearly separable datasets. PMID:25760051
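The record's conclusion that a linear kernel suffices because the acquired dataset is linearly separable can be illustrated on synthetic data; the blobs below are a hypothetical stand-in for the turbine features, not the paper's measurements:

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two well-separated Gaussian blobs stand in for a linearly
# separable two-class fault dataset (e.g. misalignment vs. imbalance).
X, y = make_blobs(n_samples=200, centers=2, cluster_std=1.0,
                  random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# On linearly separable data the linear kernel matches the RBF kernel
# while being cheaper to train and tune.
kernel_acc = {k: SVC(kernel=k).fit(X_tr, y_tr).score(X_te, y_te)
              for k in ("linear", "rbf")}
```

When the classes overlap non-linearly, the same comparison would instead favor the RBF kernel, which is why the separability check matters.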

  9. [Application of optimized parameters SVM based on photoacoustic spectroscopy method in fault diagnosis of power transformer].

    Zhang, Yu-xin; Cheng, Zhi-feng; Xu, Zheng-ping; Bai, Jing

    2015-01-01

    In order to solve problems of the traditional power transformer fault diagnosis approach based on dissolved gas analysis (DGA), such as complex operation, consumption of carrier gas and a long test period, this paper proposes a new method that detects the content of five characteristic gases in transformer oil (CH4, C2H2, C2H4, C2H6 and H2) by photoacoustic spectroscopy, from which the three ratios C2H2/C2H4, CH4/H2 and C2H4/C2H6 are calculated. Support vector machine models were constructed using cross validation under five support vector machine formulations and four kernel functions, with heuristic algorithms used to optimize the penalty factor c and kernel parameter g, in order to establish the SVM model with the highest fault diagnosis accuracy and the fastest computing speed. Two heuristic algorithms, particle swarm optimization and a genetic algorithm, were comparatively studied for optimization accuracy and speed. The simulation results show that the SVM model composed of C-SVC, the RBF kernel and the genetic algorithm obtains 97.5% accuracy on the test sample set and 98.3333% accuracy on the training sample set, and the genetic algorithm was about twice as fast as particle swarm optimization. The method described in this paper has many advantages, such as simple operation, non-contact measurement, no consumption of carrier gas, a short test period, and high stability and sensitivity. The results show that it can replace traditional transformer fault diagnosis by gas chromatography and meets actual project needs in transformer fault diagnosis. PMID:25993810
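The heuristic optimization of (c, g) described in this record can be illustrated with a simple stochastic search whose fitness is the cross-validated accuracy; this is a stand-in for the paper's genetic-algorithm and particle-swarm optimizers, and the synthetic data replaces the (unavailable) gas-ratio features:

```python
import random

from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

random.seed(0)
# Synthetic stand-in for the three-ratio dissolved-gas feature vectors.
X, y = make_classification(n_samples=150, n_features=6, random_state=1)

# Random search over (C, gamma) on log scales; fitness is the 5-fold
# cross-validated accuracy of a C-SVC with an RBF kernel.
best_params, best_acc = None, 0.0
for _ in range(20):
    C = 10.0 ** random.uniform(-2, 2)
    gamma = 10.0 ** random.uniform(-3, 1)
    acc = cross_val_score(SVC(C=C, gamma=gamma), X, y, cv=5).mean()
    if acc > best_acc:
        best_params, best_acc = (C, gamma), acc
```

A genetic algorithm or particle swarm replaces the independent random draws with population-based proposals, but evaluates candidates with the same cross-validated fitness.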

  10. SVM Based Navigation Patterns for Efficient Relevance Feedback in Content Based Image Retrieval

    Vaishali Meshram

    2012-07-01

    Full Text Available Image retrieval based on image content has become a hot topic in the fields of image processing and computer vision. Content-based image retrieval (CBIR) is the basis of image retrieval systems. Unlike traditional database queries, content-based multimedia retrieval queries are imprecise in nature, which makes it difficult for users to express their exact information need as a precise query. To be more profitable, relevance feedback techniques were incorporated into CBIR so that more precise results can be obtained by taking the user's feedback into account. However, existing relevance-feedback-based CBIR methods usually require a number of iterative feedbacks to produce refined search results, especially in a large-scale image database, which is impractical and inefficient in real applications. Because research on ways to extend and improve query methods for image databases is widespread, we have developed the QBIC (Query by Image Content) system to explore content-based retrieval methods. To achieve high efficiency and effectiveness in CBIR, we use two methods for feature extraction: SVM (support vector machine) and NPRF (navigation-pattern-based relevance feedback). By using an SVM classifier as a category predictor of query and database images, irrelevant images are first filtered out by their different low-level, concept and key-point-based features, reducing the size of the query search in the database, after which the NPRF algorithm and refinement strategies are applied for further extraction. In terms of efficiency, the iterations of feedback are reduced by using the navigation patterns discovered from the user query log. In terms of effectiveness, the proposed search algorithm NPRF-Search makes use of the discovered navigation patterns and three kinds of query refinement strategies, Query Point Movement (QPM), Query Reweighting (QR) and Query Expansion (QEX), to converge the search space toward the user's intention effectively. By using the NPRF method, high-quality image retrieval on RF can be achieved in a small number of feedbacks.

  11. SVM-Based Synthetic Fingerprint Discrimination Algorithm and Quantitative Optimization Strategy

    Chen, Suhang; Chang, Sheng; Huang, Qijun; He, Jin; Wang, Hao; Huang, Qiangui

    2014-01-01

    Synthetic fingerprints are a potential threat to automatic fingerprint identification systems (AFISs). In this paper, we propose an algorithm to discriminate synthetic fingerprints from real ones. First, four typical characteristic factors—the ridge distance features, global gray features, frequency feature and Harris Corner feature—are extracted. Then, a support vector machine (SVM) is used to distinguish synthetic fingerprints from real fingerprints. The experiments demonstrate that this method can achieve a recognition accuracy rate of over 98% for two discrete synthetic fingerprint databases as well as a mixed database. Furthermore, a performance factor that can evaluate the SVM's accuracy and efficiency is presented, and a quantitative optimization strategy is established for the first time. After the optimization of our synthetic fingerprint discrimination task, the polynomial kernel with a training sample proportion of 5% is the optimized value when the minimum accuracy requirement is 95%. The radial basis function (RBF) kernel with a training sample proportion of 15% is a more suitable choice when the minimum accuracy requirement is 98%. PMID:25347063

  12. Linear SVM-Based Android Malware Detection for Reliable IoT Services

    Hyo-Sik Ham

    2014-01-01

    Full Text Available Currently, many Internet of Things (IoT) services are monitored and controlled through smartphone applications. By combining IoT with smartphones, many convenient IoT services have been provided to users. However, there are adverse underlying effects in such services, including invasion of privacy and information leakage. In most cases, mobile devices have become cluttered with important personal user information as various services and contents are provided through them. Accordingly, attackers are expanding the scope of their attacks beyond the existing PC and Internet environment into mobile devices. In this paper, we apply a linear support vector machine (SVM) to detect Android malware and compare the malware detection performance of the SVM with that of other machine learning classifiers. Through experimental validation, we show that the SVM outperforms other machine learning classifiers.

  13. Fast SVM-based Feature Elimination Utilizing Data Radius, Hard-Margin, Soft-Margin

    Aksu, Yaman

    2012-01-01

    Margin maximization in the hard-margin sense, proposed as feature elimination criterion by the MFE-LO method, is combined here with data radius utilization to further aim to lower generalization error, as several published bounds and bound-related formulations pertaining to lowering misclassification risk (or error) pertain to radius e.g. product of squared radius and weight vector squared norm. Additionally, we propose additional novel feature elimination criteria that, while instead being i...

  14. Fast and Accurate Construction of Confidence Intervals for Heritability.

    Schweiger, Regev; Kaufman, Shachar; Laaksonen, Reijo; Kleber, Marcus E; März, Winfried; Eskin, Eleazar; Rosset, Saharon; Halperin, Eran

    2016-06-01

    Estimation of heritability is fundamental in genetic studies. Recently, heritability estimation using linear mixed models (LMMs) has gained popularity because these estimates can be obtained from unrelated individuals collected in genome-wide association studies. Typically, heritability estimation under LMMs uses the restricted maximum likelihood (REML) approach. Existing methods for the construction of confidence intervals and estimators of SEs for REML rely on asymptotic properties. However, these assumptions are often violated because of the bounded parameter space, statistical dependencies, and limited sample size, leading to biased estimates and inflated or deflated confidence intervals. Here, we show that the estimation of confidence intervals by state-of-the-art methods is inaccurate, especially when the true heritability is relatively low or relatively high. We further show that these inaccuracies occur in datasets including thousands of individuals. Such biases are present, for example, in estimates of heritability of gene expression in the Genotype-Tissue Expression project and of lipid profiles in the Ludwigshafen Risk and Cardiovascular Health study. We also show that often the probability that the genetic component is estimated as 0 is high even when the true heritability is bounded away from 0, emphasizing the need for accurate confidence intervals. We propose a computationally efficient method, ALBI (accurate LMM-based heritability bootstrap confidence intervals), for estimating the distribution of the heritability estimator and for constructing accurate confidence intervals. Our method can be used as an add-on to existing methods for estimating heritability and variance components, such as GCTA, FaST-LMM, GEMMA, or EMMAX. PMID:27259052
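The bootstrap construction of confidence intervals that underlies ALBI can be illustrated in miniature; this is only the generic percentile-bootstrap idea on made-up data, not ALBI's actual LMM-specific algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical measurements standing in for per-individual estimates.
data = rng.normal(loc=0.4, scale=0.1, size=200)

# Approximate the sampling distribution of the estimator (here the
# mean) by resampling with replacement, then read off the 2.5th and
# 97.5th percentiles as a 95% confidence interval.
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(2000)
])
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
```

Unlike asymptotic SE-based intervals, the percentile interval respects a bounded parameter space, which is the failure mode the record highlights for heritability near 0 or 1.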

  15. Accurate phylogenetic classification of DNA fragments based on sequence composition

    McHardy, Alice C.; Garcia Martin, Hector; Tsirigos, Aristotelis; Hugenholtz, Philip; Rigoutsos, Isidore

    2006-05-01

    Metagenome studies have retrieved vast amounts of sequence out of a variety of environments, leading to novel discoveries and great insights into the uncultured microbial world. Except for very simple communities, diversity makes sequence assembly and analysis a very challenging problem. To understand the structure and function of microbial communities, a taxonomic characterization of the obtained sequence fragments is highly desirable, yet currently limited mostly to those sequences that contain phylogenetic marker genes. We show that for clades at the rank of domain down to genus, sequence composition allows the very accurate phylogenetic characterization of genomic sequence. We developed a composition-based classifier, PhyloPythia, for de novo phylogenetic sequence characterization and have trained it on a data set of 340 genomes. By extensive evaluation experiments we show that the method is accurate across all taxonomic ranks considered, even for sequences that originate from novel organisms and are as short as 1 kb. Application to two metagenome datasets obtained from samples of phosphorus-removing sludge showed that the method allows the accurate classification at genus level of most sequence fragments from the dominant populations, while at the same time correctly characterizing even larger parts of the samples at higher taxonomic levels.
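The composition signal such classifiers exploit can be illustrated with a toy sketch: k-mer frequency profiles compared against clade centroids. Nearest-centroid assignment here is a deliberately simple stand-in for PhyloPythia's trained multi-class SVM, and the sequences and clade names are invented.

```python
from collections import Counter
from itertools import product
from math import sqrt

def kmer_profile(seq, k=4):
    """Normalized k-mer frequency vector: the 'sequence composition'
    signal that composition-based classifiers rely on."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = sum(counts[km] for km in kmers) or 1
    return [counts[km] / total for km in kmers]

def nearest_centroid(fragment, centroids, k=4):
    """Assign a fragment to the closest clade centroid by Euclidean
    distance (a stand-in for a trained multi-class SVM)."""
    v = kmer_profile(fragment, k)
    def dist(name):
        return sqrt(sum((a - b) ** 2 for a, b in zip(v, centroids[name])))
    return min(centroids, key=dist)

# Toy 'genomes' with opposite GC content stand in for training clades.
cents = {"cladeA": kmer_profile("GC" * 600), "cladeB": kmer_profile("AT" * 600)}
label = nearest_centroid("GCGCGCGCATGCGCGC", cents)  # → "cladeA"
```

Real classifiers use several k-mer lengths, much larger training sets, and a discriminative model, but the feature construction follows this pattern.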

  16. Accurate diagnosis is essential for amebiasis

    2004-01-01

    Amebiasis is one of the three most common causes of death from parasitic disease, and Entamoeba histolytica is the most widely distributed parasite in the world. Entamoeba histolytica infection is a particular health problem in amebiasis-endemic areas of developing countries, with a significant impact on infant mortality[1]. In recent years a worldwide increase in the number of patients with amebiasis has refocused attention on this important infection. At the same time, improvements in the quality of parasitological methods and the widespread use of accurate techniques have advanced our knowledge of the disease.

  17. Investigations on Accurate Analysis of Microstrip Reflectarrays

    Zhou, Min; Sørensen, S. B.; Kim, Oleksiy S.;

    2011-01-01

    An investigation on accurate analysis of microstrip reflectarrays is presented. Sources of error in reflectarray analysis are examined and solutions to these issues are proposed. The focus is on two sources of error, namely the determination of the equivalent currents to calculate the radiation pattern, and the inaccurate mutual coupling between array elements due to the lack of periodicity. To serve as reference, two offset reflectarray antennas have been designed, manufactured and measured at the DTU-ESA Spherical Near-Field Antenna Test Facility. Comparisons of simulated and measured data are...

  18. Niche Genetic Algorithm with Accurate Optimization Performance

    LIU Jian-hua; YAN De-kun

    2005-01-01

    Based on the crowding mechanism, a novel niche genetic algorithm was proposed that records the evolutionary direction dynamically during evolution. After evolution, the precision of the solutions can be greatly improved by local searching along the recorded direction. Simulation shows that this algorithm not only keeps population diversity but also finds accurate solutions. Although this method takes more time than the standard GA, it is well worth applying in cases that demand high solution precision.

  19. How accurately can we calculate thermal systems?

    The objective was to determine how accurately simple reactor lattice integral parameters can be calculated, considering user input, differences in the methods, source data, and the data-processing procedures and assumptions. Three simple square-lattice test cases with different fuel-to-moderator ratios were defined. The effect of the thermal scattering models was shown to be important and much larger than the spread in the results. Nevertheless, differences of up to 0.4% in the K-eff calculated by continuous-energy Monte Carlo codes were observed even when the same source data were used. (author)

  20. Evidence Based Selection of Housekeeping Genes

    de Jonge, Hendrik J.M.; Fehrmann, Rudolf S. N.; Eveline S. J. M. de Bont; Hofstra, Robert M. W.; Gerbens, Frans; Kamps, Willem A.; Vries, Elisabeth G. E.; van der Zee, Ate G.J.; te Meerman, Gerard J.; ter Elst, Arja

    2007-01-01

    For accurate and reliable gene expression analysis, normalization of gene expression data against housekeeping genes (reference or internal control genes) is required. It is known that commonly used housekeeping genes (e.g. ACTB, GAPDH, HPRT1, and B2M) vary considerably under different experimental conditions and therefore their use for normalization is limited. We performed a meta-analysis of 13,629 human gene array samples in order to identify the most stable expressed genes. Here we show n...

  1. Accurate determination of characteristic relative permeability curves

    Krause, Michael H.; Benson, Sally M.

    2015-09-01

    A recently developed technique to accurately characterize sub-core scale heterogeneity is applied to investigate the factors responsible for flowrate-dependent effective relative permeability curves measured on core samples in the laboratory. The dependency of laboratory measured relative permeability on flowrate has long been both supported and challenged by a number of investigators. Studies have shown that this apparent flowrate dependency is a result of both sub-core scale heterogeneity and outlet boundary effects. However, this has only been demonstrated numerically for highly simplified models of porous media. In this paper, flowrate dependency of effective relative permeability is demonstrated using two rock cores, a Berea Sandstone and a heterogeneous sandstone from the Otway Basin Pilot Project in Australia. Numerical simulations of steady-state coreflooding experiments are conducted at a number of injection rates using a single set of input characteristic relative permeability curves. Effective relative permeability is then calculated from the simulation data using standard interpretation methods for calculating relative permeability from steady-state tests. Results show that simplified approaches may be used to determine flowrate-independent characteristic relative permeability provided the flow rate is sufficiently high and the core heterogeneity is relatively low. It is also shown that characteristic relative permeability can be determined at any typical flowrate, and even for geologically complex models, when using accurate three-dimensional models.

  2. Accurate radiative transfer calculations for layered media.

    Selden, Adrian C

    2016-07-01

    Simple yet accurate results for radiative transfer in layered media with discontinuous refractive index are obtained by the method of K-integrals. These are certain weighted integrals applied to the angular intensity distribution at the refracting boundaries. The radiative intensity is expressed as the sum of the asymptotic angular intensity distribution valid in the depth of the scattering medium and a transient term valid near the boundary. Integrated boundary equations are obtained, yielding simple linear equations for the intensity coefficients, enabling the angular emission intensity and the diffuse reflectance (albedo) and transmittance of the scattering layer to be calculated without solving the radiative transfer equation directly. Examples are given of half-space, slab, interface, and double-layer calculations, and extensions to multilayer systems are indicated. The K-integral method is orders of magnitude more accurate than diffusion theory and can be applied to layered scattering media with a wide range of scattering albedos, with potential applications to biomedical and ocean optics. PMID:27409700

  3. Accurate basis set truncation for wavefunction embedding

    Barnes, Taylor A.; Goodpaster, Jason D.; Manby, Frederick R.; Miller, Thomas F.

    2013-07-01

    Density functional theory (DFT) provides a formally exact framework for performing embedded subsystem electronic structure calculations, including DFT-in-DFT and wavefunction theory-in-DFT descriptions. In the interest of efficiency, it is desirable to truncate the atomic orbital basis set in which the subsystem calculation is performed, thus avoiding high-order scaling with respect to the size of the MO virtual space. In this study, we extend a recently introduced projection-based embedding method [F. R. Manby, M. Stella, J. D. Goodpaster, and T. F. Miller III, J. Chem. Theory Comput. 8, 2564 (2012)], 10.1021/ct300544e to allow for the systematic and accurate truncation of the embedded subsystem basis set. The approach is applied to both covalently and non-covalently bound test cases, including water clusters and polypeptide chains, and it is demonstrated that errors associated with basis set truncation are controllable to well within chemical accuracy. Furthermore, we show that this approach allows for switching between accurate projection-based embedding and DFT embedding with approximate kinetic energy (KE) functionals; in this sense, the approach provides a means of systematically improving upon the use of approximate KE functionals in DFT embedding.

  4. Accurate pose estimation for forensic identification

    Merckx, Gert; Hermans, Jeroen; Vandermeulen, Dirk

    2010-04-01

    In forensic authentication, one aims to identify the perpetrator among a series of suspects or distractors. A fundamental problem in any recognition system that aims for identification of subjects in a natural scene is the lack of constraints on viewing and imaging conditions. In forensic applications, identification proves even more challenging, since most surveillance footage is of abysmal quality. In this context, robust methods for pose estimation are paramount. In this paper we therefore present a new pose estimation strategy for very low quality footage. Our approach uses 3D-2D registration of a textured 3D face model with the surveillance image to obtain accurate far-field pose alignment. Starting from an inaccurate initial estimate, the technique uses novel similarity measures based on the monogenic signal to guide a pose optimization process. We illustrate the descriptive strength of the introduced similarity measures by using them directly as a recognition metric. Through validation, using both real and synthetic surveillance footage, our pose estimation method is shown to be accurate, and robust to lighting changes and image degradation.

  5. Accurate shear measurement with faint sources

    Zhang, Jun; Foucaud, Sebastien [Center for Astronomy and Astrophysics, Department of Physics and Astronomy, Shanghai Jiao Tong University, 955 Jianchuan road, Shanghai, 200240 (China); Luo, Wentao, E-mail: betajzhang@sjtu.edu.cn, E-mail: walt@shao.ac.cn, E-mail: foucaud@sjtu.edu.cn [Key Laboratory for Research in Galaxies and Cosmology, Shanghai Astronomical Observatory, Nandan Road 80, Shanghai, 200030 (China)

    2015-01-01

    For cosmic shear to become an accurate cosmological probe, systematic errors in the shear measurement method must be unambiguously identified and corrected for. Previous work of this series has demonstrated that cosmic shears can be measured accurately in Fourier space in the presence of background noise and finite pixel size, without assumptions on the morphologies of galaxy and PSF. The remaining major source of error is source Poisson noise, due to the finiteness of source photon number. This problem is particularly important for faint galaxies in space-based weak lensing measurements, and for ground-based images of short exposure times. In this work, we propose a simple and rigorous way of removing the shear bias from the source Poisson noise. Our noise treatment can be generalized for images made of multiple exposures through MultiDrizzle. This is demonstrated with the SDSS and COSMOS/ACS data. With a large ensemble of mock galaxy images of unrestricted morphologies, we show that our shear measurement method can achieve sub-percent level accuracy even for images of signal-to-noise ratio less than 5 in general, making it the most promising technique for cosmic shear measurement in the ongoing and upcoming large scale galaxy surveys.

  6. Microarray results: how accurate are they?

    Mane Shrikant

    2002-08-01

    Full Text Available Abstract Background DNA microarray technology is a powerful technique that was recently developed in order to analyze thousands of genes in a short time. Presently, microarrays, or chips, of the cDNA type and oligonucleotide type are available from several sources. The number of publications in this area is increasing exponentially. Results In this study, microarray data obtained from two different commercially available systems were critically evaluated. Our analysis revealed several inconsistencies in the data obtained from the two different microarrays. Problems encountered included inconsistent sequence fidelity of the spotted microarrays, variability of differential expression, low specificity of cDNA microarray probes, discrepancy in fold-change calculation and lack of probe specificity for different isoforms of a gene. Conclusions In view of these pitfalls, data from microarray analysis need to be interpreted cautiously.
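One of the discrepancies noted above, fold-change calculation, is easy to reproduce: the answer depends on whether replicate intensities are averaged before or after taking ratios. A minimal illustration with invented values:

```python
from math import log2

def log2_fold_change(a, b):
    """Fold change of condition b over a on a log2 scale, so 2-fold up
    (+1) and 2-fold down (-1) are symmetric."""
    return log2(b / a)

# Two replicate intensity pairs (invented values) for one gene.
a = [100.0, 200.0]     # condition A replicates
b = [400.0, 400.0]     # condition B replicates

# Averaging before vs after taking ratios gives different answers:
ratio_of_means = log2_fold_change(sum(a) / 2, sum(b) / 2)   # log2(400/150)
mean_of_ratios = (log2_fold_change(a[0], b[0])
                  + log2_fold_change(a[1], b[1])) / 2       # (2 + 1) / 2
```

Here the two conventions report roughly 2.7-fold versus 2.8-fold on the linear scale; with noisier replicates the gap widens, so results from platforms using different conventions are not directly comparable.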

  7. Accurate valence band width of diamond

    An accurate width is determined for the valence band of diamond by imaging photoelectron momentum distributions for a variety of initial- and final-state energies. The experimental result of 23.0±0.2 eV agrees well with first-principles quasiparticle calculations (23.0 and 22.88 eV) and significantly exceeds the local-density-functional width, 21.5±0.2 eV. This difference quantifies effects of creating an excited hole state (with associated many-body effects) in a band measurement vs studying ground-state properties treated by local-density-functional calculations. copyright 1997 The American Physical Society

  8. Toward Accurate and Quantitative Comparative Metagenomics.

    Nayfach, Stephen; Pollard, Katherine S

    2016-08-25

    Shotgun metagenomics and computational analysis are used to compare the taxonomic and functional profiles of microbial communities. Leveraging this approach to understand roles of microbes in human biology and other environments requires quantitative data summaries whose values are comparable across samples and studies. Comparability is currently hampered by the use of abundance statistics that do not estimate a meaningful parameter of the microbial community and biases introduced by experimental protocols and data-cleaning approaches. Addressing these challenges, along with improving study design, data access, metadata standardization, and analysis tools, will enable accurate comparative metagenomics. We envision a future in which microbiome studies are replicable and new metagenomes are easily and rapidly integrated with existing data. Only then can the potential of metagenomics for predictive ecological modeling, well-powered association studies, and effective microbiome medicine be fully realized. PMID:27565341
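One concrete example of an abundance statistic that does estimate a meaningful community parameter: normalizing read counts by sequence length before converting to proportions, since longer genomes recruit more reads. The taxa and numbers below are invented.

```python
def relative_abundance(read_counts, lengths):
    """Length-normalized relative abundances from raw read counts.
    Dividing by genome/gene length first (giving a coverage-like value)
    removes the bias that longer sequences recruit more reads; the raw
    proportion of reads is not comparable across samples or studies."""
    coverage = {g: read_counts[g] / lengths[g] for g in read_counts}
    total = sum(coverage.values())
    return {g: c / total for g, c in coverage.items()}

counts = {"taxonA": 900, "taxonB": 100}                # mapped reads
lengths = {"taxonA": 3_000_000, "taxonB": 1_000_000}   # hypothetical genome sizes
ab = relative_abundance(counts, lengths)
# Raw read proportions would be 0.9 / 0.1; length-normalized: 0.75 / 0.25.
```

Real pipelines additionally correct for copy-number markers and protocol biases, which is part of the standardization the abstract calls for.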

  9. Accurate Telescope Mount Positioning with MEMS Accelerometers

    Mészáros, László; Pál, András; Csépány, Gergely

    2014-01-01

    This paper describes the advantages and challenges of applying microelectromechanical accelerometer systems (MEMS accelerometers) in order to attain precise, accurate and stateless positioning of telescope mounts. This provides a completely independent method from other forms of electronic, optical, mechanical or magnetic feedback or real-time astrometry. Our goal is to reach the sub-arcminute range, which is well below the field-of-view of conventional imaging telescope systems. Here we present how this sub-arcminute accuracy can be achieved with very cheap MEMS sensors, and we also detail how our procedures can be extended to attain even finer measurements. In addition, our paper discusses how a complete system design can be implemented in order to be part of a telescope control system.
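The basic principle of stateless positioning from an accelerometer can be sketched as follows: at rest the sensor measures only gravity, so the direction of the measured vector encodes orientation. This uses one common pitch/roll convention and ignores the calibration, averaging, and noise handling that sub-arcminute accuracy actually requires.

```python
from math import atan2, sqrt, degrees

def tilt_from_gravity(ax, ay, az):
    """Pitch and roll in degrees from a static 3-axis accelerometer
    reading (in units of g). With the mount at rest, the measured
    acceleration is the gravity vector, whose direction gives the tilt."""
    pitch = degrees(atan2(-ax, sqrt(ay * ay + az * az)))
    roll = degrees(atan2(ay, az))
    return pitch, roll

level = tilt_from_gravity(0.0, 0.0, 1.0)                 # flat and level
tilted = tilt_from_gravity(-sqrt(0.5), 0.0, sqrt(0.5))   # pitched up 45 deg
```

Note that yaw (azimuth) is unobservable from gravity alone, which is why such systems combine tilt sensing with other constraints.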

  10. Accurate estimation of indoor travel times

    Prentow, Thor Siiger; Blunck, Henrik; Stisen, Allan;

    2014-01-01

    the InTraTime method for accurately estimating indoor travel times via mining of historical and real-time indoor position traces. The method learns during operation both travel routes, travel times and their respective likelihood, both for routes traveled as well as for sub-routes thereof. InTraTime allows specifying temporal and other query parameters, such as time-of-day, day-of-week or the identity of the traveling individual. As input the method is designed to take generic position traces and is thus interoperable with a variety of indoor positioning systems. The method's advantages include a minimal-effort setup and self-improving operations due to unsupervised learning, as it is able to adapt implicitly to factors influencing indoor travel times such as elevators, rotating doors or changes in building layout. We evaluate and compare the proposed InTraTime method to indoor adaptions...

  11. Accurate sky background modelling for ESO facilities

    Full text: Ground-based measurements, e.g. high-resolution spectroscopy, are heavily influenced by several physical processes. Amongst others, line absorption/emission, airglow from OH molecules, and scattering of photons within the Earth's atmosphere make observations, in particular from facilities like the future European Extremely Large Telescope, a challenge. Additionally, emission from unresolved extrasolar objects, the zodiacal light, the Moon and even thermal emission from the telescope and the instrument contribute significantly to the broadband background over a wide wavelength range. In our talk we review these influences and give an overview of how they can be accurately modeled to increase the overall precision of spectroscopic and imaging measurements. (author)

  12. Accurate Weather Forecasting for Radio Astronomy

    Maddalena, Ronald J.

    2010-01-01

    The NRAO Green Bank Telescope routinely observes at wavelengths from 3 mm to 1 m. As with all mm-wave telescopes, observing conditions depend upon the variable atmospheric water content. The site provides over 100 days/yr when opacities are low enough for good observing at 3 mm, but winds on the open-air structure reduce the time suitable for 3-mm observing where pointing is critical. Thus, to maximize productivity, the observing wavelength needs to match weather conditions. For 6 years the telescope has used a dynamic scheduling system (recently upgraded; www.gb.nrao.edu/DSS) that requires accurate multi-day forecasts for winds and opacities. Since opacity forecasts are not provided by the National Weather Service (NWS), I have developed an automated system that takes available forecasts, derives forecasted opacities, and deploys the results on the web in user-friendly graphical overviews (www.gb.nrao.edu/~rmaddale/Weather). The system relies on the "North American Mesoscale" models, which are updated by the NWS every 6 hrs, have a 12 km horizontal resolution, 1 hr temporal resolution, run to 84 hrs, and have 60 vertical layers that extend to 20 km. Each forecast consists of a time series of ground conditions, cloud coverage, etc., and, most importantly, temperature, pressure, and humidity as a function of height. I use Liebe's MPM model (Radio Science, 20, 1069, 1985) to determine the absorption in each layer for each hour for 30 observing wavelengths. Radiative transfer provides, for each hour and wavelength, the total opacity and the radio brightness of the atmosphere, which contributes substantially at some wavelengths to Tsys and the observational noise. Comparisons of measured and forecasted Tsys at 22.2 and 44 GHz imply that the forecasted opacities are good to about 0.01 nepers, which is sufficient for forecasting and accurate calibration. Reliability is high out to 2 days and degrades slowly for longer-range forecasts.
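The link from a forecasted opacity to the atmosphere's contribution to Tsys can be sketched with the standard single-slab radiative-transfer relation. The mean atmospheric temperature below is an assumed round number, not a value from the GBT system.

```python
from math import exp

def sky_brightness(tau, t_atm=260.0):
    """Brightness temperature (K) added by a single-slab atmosphere of
    mean physical temperature t_atm (K) and zenith opacity tau (nepers):
    T_sky = T_atm * (1 - exp(-tau))."""
    return t_atm * (1.0 - exp(-tau))

# Sensitivity to the forecast quality quoted above: a 0.01 Np opacity
# error changes the sky contribution to Tsys by only a few kelvin.
delta = sky_brightness(0.01)
```

For small tau this is approximately t_atm * tau, which is why a 0.01 Np forecast accuracy translates directly into a roughly 2-3 K calibration uncertainty.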

  13. Approaching system equilibrium with accurate or not accurate feedback information in a two-route system

    Zhao, Xiao-mei; Xie, Dong-fan; Li, Qi

    2015-02-01

    With the development of intelligent transport systems, advanced information feedback strategies have been developed to reduce traffic congestion and enhance capacity. However, previous strategies provide accurate information to travelers, and our simulation results show that accurate information can bring negative effects, especially when the feedback is delayed. With accurate information, travelers flock to the route reported to be in the best condition, while delayed information reflects past rather than current traffic conditions. Travelers therefore make wrong routing decisions, which decreases capacity, increases oscillations, and drives the system away from equilibrium. To avoid these negative effects, bounded rationality is taken into account by introducing a boundedly rational threshold BR: when the difference between the two routes is less than BR, the routes are chosen with equal probability. Bounded rationality is helpful for improving efficiency in terms of capacity, oscillation, and the gap from the system equilibrium.
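The bounded-rationality rule can be sketched in a toy day-to-day simulation. The congestion function, traveler counts, and threshold value below are all invented for illustration; only the choice rule follows the description above.

```python
import random

def choose_route(tt_a, tt_b, br, rng):
    """Boundedly rational choice: if the reported travel times differ by
    less than the threshold BR, both routes are equally likely; otherwise
    the traveler takes the route reported as faster."""
    if abs(tt_a - tt_b) < br:
        return rng.choice(["A", "B"])
    return "A" if tt_a < tt_b else "B"

# Toy day-to-day dynamics with *delayed* feedback: today's choices are
# based on yesterday's travel times (all parameters invented).
rng = random.Random(1)
tt = {"A": 30.0, "B": 30.0}
history = []
for day in range(200):
    flows = {"A": 0, "B": 0}
    for _ in range(100):                          # 100 travelers per day
        flows[choose_route(tt["A"], tt["B"], br=2.0, rng=rng)] += 1
    tt = {r: 20.0 + 0.2 * flows[r] for r in tt}   # linear congestion cost
    history.append(flows["A"])
split = sum(history[100:]) / 100.0                # long-run mean flow on A
```

With br=0 every traveler chases yesterday's faster route and the flows oscillate between the extremes; the threshold damps this herding, keeping the long-run split near the 50/50 equilibrium.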

  14. Many accurate small-discriminatory feature subsets exist in microarray transcript data: biomarker discovery

    Grate Leslie R

    2005-04-01

    Full Text Available Abstract Background Molecular profiling generates abundance measurements for thousands of gene transcripts in biological samples such as normal and tumor tissues (data points). Given such two-class high-dimensional data, many methods have been proposed for classifying data points into one of the two classes. However, finding very small sets of features able to correctly classify the data is problematic, as the underlying mathematical problem is hard. Existing methods can find "small" feature sets, but give no hint of how close this is to the true minimum size. Without fundamental mathematical advances, finding true minimum-size sets will remain elusive, and, more importantly for the microarray community, there will be no methods for finding them. Results We use the brute-force approach of exhaustive search through all genes, gene pairs (and, for some data sets, gene triples). Each unique gene combination is analyzed with a few-parameter linear-hyperplane classification method, looking for those combinations that form training-error-free classifiers. All 10 published data sets studied are found to contain predictive small feature sets. Four contain thousands of gene pairs and six have single genes that perfectly discriminate. Conclusion This technique discovered small sets of genes (three or fewer) in published data that form accurate classifiers, yet were not reported in the prior publications. This could be a common characteristic of microarray data, making the search for them worth the computational cost. Such small gene sets could indicate biomarkers and portend simple medical diagnostic tests. We recommend checking for small gene sets routinely. We find 4 gene pairs and many gene triples in the large hepatocellular carcinoma (HCC, liver cancer) data set of Chen et al. The key component of these is the "placental gene of unknown function", PLAC8. Our HMM modeling indicates PLAC8 might have a domain like part of lP59's crystal structure (a Non
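The single-gene case of this brute-force search is simple to sketch: a gene perfectly discriminates if some threshold on its expression separates the two classes with zero training error. The data and gene names below are invented ("PLAC8_like" is only a nod to the abstract, not real data).

```python
def perfectly_separates(values, labels):
    """True if some threshold on this single feature classifies every
    sample correctly, i.e. all of one class lies strictly below all of
    the other (labels are 0/1, both classes present)."""
    lo = [v for v, y in zip(values, labels) if y == 0]
    hi = [v for v, y in zip(values, labels) if y == 1]
    return max(lo) < min(hi) or max(hi) < min(lo)

def error_free_single_genes(expr, labels):
    """Exhaustive scan over all genes for training-error-free one-gene
    classifiers; the paper extends this to all pairs and triples."""
    return [g for g, vals in expr.items() if perfectly_separates(vals, labels)]

# Toy data: 3 normal (0) and 3 tumor (1) samples, hypothetical genes.
labels = [0, 0, 0, 1, 1, 1]
expr = {
    "PLAC8_like": [1.0, 1.2, 0.9, 5.1, 6.0, 5.5],   # separates perfectly
    "noisy_gene": [1.0, 4.0, 2.0, 1.5, 3.0, 2.5],   # does not
}
hits = error_free_single_genes(expr, labels)        # → ["PLAC8_like"]
```

The pair and triple searches replace the threshold test with a small linear-hyperplane fit per combination, which is what makes the exhaustive scan computationally expensive but still feasible.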

  15. Accurate fission data for nuclear safety

    Solders, A; Jokinen, A; Kolhinen, V S; Lantz, M; Mattera, A; Penttila, H; Pomp, S; Rakopoulos, V; Rinta-Antila, S

    2013-01-01

    The Accurate fission data for nuclear safety (AlFONS) project aims at high-precision measurements of fission yields, using the renewed IGISOL mass separator facility in combination with a new high-current light-ion cyclotron at the University of Jyvaskyla. The 30 MeV proton beam will be used to create fast and thermal neutron spectra for the study of neutron-induced fission yields. Thanks to a series of mass-separating elements, culminating with the JYFLTRAP Penning trap, it is possible to achieve a mass resolving power on the order of a few hundred thousand. In this paper we present the experimental setup and the design of a neutron converter target for IGISOL. The goal is to have a flexible design. For studies of exotic nuclei far from stability a high neutron flux (10^12 neutrons/s) at energies 1 - 30 MeV is desired, while for reactor applications neutron spectra that resemble those of thermal and fast nuclear reactors are preferred. It is also desirable to be able to produce (semi-)monoenergetic neutrons...

  16. Fast and Provably Accurate Bilateral Filtering.

    Chaudhury, Kunal N; Dabhade, Swapnil D

    2016-06-01

    The bilateral filter is a non-linear filter that uses a range filter along with a spatial filter to perform edge-preserving smoothing of images. A direct computation of the bilateral filter requires O(S) operations per pixel, where S is the size of the support of the spatial filter. In this paper, we present a fast and provably accurate algorithm for approximating the bilateral filter when the range kernel is Gaussian. In particular, for box and Gaussian spatial filters, the proposed algorithm can cut down the complexity to O(1) per pixel for any arbitrary S. The algorithm has a simple implementation involving N+1 spatial filterings, where N is the approximation order. We give a detailed analysis of the filtering accuracy that can be achieved by the proposed approximation in relation to the target bilateral filter. This allows us to estimate the order N required to obtain a given accuracy. We also present comprehensive numerical results to demonstrate that the proposed algorithm is competitive with the state-of-the-art methods in terms of speed and accuracy. PMID:27093722
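The direct O(S)-per-sample computation that the fast algorithm approximates can be sketched in one dimension. This is the textbook bilateral filter (box spatial kernel, Gaussian range kernel), not the paper's O(1) approximation; the signal values are invented.

```python
from math import exp

def bilateral_1d(signal, radius=2, sigma_r=10.0):
    """Direct bilateral filter on a 1-D signal: a box spatial kernel of
    the given radius combined with a Gaussian range kernel of width
    sigma_r. Cost is O(S) per sample with S = 2*radius + 1."""
    out = []
    n = len(signal)
    for i in range(n):
        num = den = 0.0
        for j in range(max(0, i - radius), min(n, i + radius + 1)):
            # Range weight: nearby *values* count, distant values do not,
            # which is what preserves edges while smoothing noise.
            w = exp(-((signal[j] - signal[i]) ** 2) / (2 * sigma_r ** 2))
            num += w * signal[j]
            den += w
        out.append(num / den)
    return out

# A step edge survives the filtering while small fluctuations are smoothed.
step = [0, 0, 1, 0, 100, 100, 101, 100]
sm = bilateral_1d(step, radius=2, sigma_r=10.0)
```

The fast algorithm in the abstract replaces the value-dependent range weight with a sum of N+1 spatial convolutions, removing the dependence on S.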

  17. Accurate adiabatic correction in the hydrogen molecule

    Pachucki, Krzysztof, E-mail: krp@fuw.edu.pl [Faculty of Physics, University of Warsaw, Pasteura 5, 02-093 Warsaw (Poland); Komasa, Jacek, E-mail: komasa@man.poznan.pl [Faculty of Chemistry, Adam Mickiewicz University, Umultowska 89b, 61-614 Poznań (Poland)

    2014-12-14

    A new formalism for the accurate treatment of adiabatic effects in the hydrogen molecule is presented, in which the electronic wave function is expanded in the James-Coolidge basis functions. Systematic increase in the size of the basis set permits estimation of the accuracy. Numerical results for the adiabatic correction to the Born-Oppenheimer interaction energy reveal a relative precision of 10{sup −12} at an arbitrary internuclear distance. Such calculations have been performed for 88 internuclear distances in the range of 0 < R ⩽ 12 bohrs to construct the adiabatic correction potential and to solve the nuclear Schrödinger equation. Finally, the adiabatic correction to the dissociation energies of all rovibrational levels in H{sub 2}, HD, HT, D{sub 2}, DT, and T{sub 2} has been determined. For the ground state of H{sub 2} the estimated precision is 3 × 10{sup −7} cm{sup −1}, which is almost three orders of magnitude higher than that of the best previous result. The achieved accuracy removes the adiabatic contribution from the overall error budget of the present day theoretical predictions for the rovibrational levels.

  18. Rapid and accurate pyrosequencing of angiosperm plastid genomes

    Farmerie William G

    2006-08-01

    Full Text Available Abstract Background Plastid genome sequence information is vital to several disciplines in plant biology, including phylogenetics and molecular biology. The past five years have witnessed a dramatic increase in the number of completely sequenced plastid genomes, fuelled largely by advances in conventional Sanger sequencing technology. Here we report a further significant reduction in time and cost for plastid genome sequencing through the successful use of a newly available pyrosequencing platform, the Genome Sequencer 20 (GS 20) System (454 Life Sciences Corporation), to rapidly and accurately sequence the whole plastid genomes of the basal eudicot angiosperms Nandina domestica (Berberidaceae) and Platanus occidentalis (Platanaceae). Results More than 99.75% of each plastid genome was simultaneously obtained during two GS 20 sequence runs, to an average depth of coverage of 24.6× in Nandina and 17.3× in Platanus. The Nandina and Platanus plastid genomes shared essentially identical gene complements and possessed the typical angiosperm plastid structure and gene arrangement. To assess the accuracy of the GS 20 sequence, over 45 kilobases of sequence were generated for each genome using conventional sequencing. Overall error rates of 0.043% and 0.031% were observed in GS 20 sequence for Nandina and Platanus, respectively. More than 97% of all observed errors were associated with homopolymer runs, with ~60% of all errors associated with homopolymer runs of 5 or more nucleotides and ~50% of all errors associated with regions of extensive homopolymer runs. No substitution errors were present in either genome. Error rates were generally higher in the single-copy and noncoding regions of both plastid genomes relative to the inverted repeat and coding regions. Conclusion Highly accurate and essentially complete sequence information was obtained for the Nandina and Platanus plastid genomes using the GS 20 System. More importantly, the high accuracy

  19. Accurate paleointensities - the multi-method approach

    de Groot, Lennart

    2016-04-01

    The accuracy of models describing rapid changes in the geomagnetic field over the past millennia critically depends on the availability of reliable paleointensity estimates. Over the past decade, methods to derive paleointensities from lavas (the only recorder of the geomagnetic field that is available all over the globe and through geologic time) have seen significant improvements, and various alternative techniques were proposed. The 'classical' Thellier-style approach was optimized and selection criteria were defined in the 'Standard Paleointensity Definitions' (Paterson et al, 2014). The Multispecimen approach was validated, and the importance of additional tests and criteria to assess Multispecimen results must be emphasized. Recently, a non-heating, relative paleointensity technique was proposed, the pseudo-Thellier protocol, which shows great potential in both accuracy and efficiency but currently lacks a solid theoretical underpinning. Here I present work using all three of the aforementioned paleointensity methods on suites of young lavas taken from the volcanic islands of Hawaii, La Palma, Gran Canaria, Tenerife, and Terceira. Many of the sampled cooling units are <100 years old, so the actual field strength at the time of cooling is reasonably well known. Rather intuitively, flows that produce coherent results from two or more different paleointensity methods yield the most accurate estimates of the paleofield. Furthermore, the results for some flows pass the selection criteria for one method but fail in other techniques. Scrutinizing and combining all acceptable results yielded reliable paleointensity estimates for 60-70% of all sampled cooling units, an exceptionally high success rate. This 'multi-method paleointensity approach' therefore has high potential to provide the much-needed paleointensities to improve geomagnetic field models for the Holocene.

  20. Towards Accurate Application Characterization for Exascale (APEX)

    Hammond, Simon David [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Sandia National Laboratories has been engaged in hardware and software codesign activities for a number of years; indeed, it might be argued that prototyping of clusters as far back as the CPLANT machines, and many large capability resources including ASCI Red and RedStorm, were examples of codesigned solutions. As the research supporting our codesign activities has moved closer to investigating on-node runtime behavior, a natural hunger has grown for detailed analysis of both hardware and algorithm performance from the perspective of low-level operations. The Application Characterization for Exascale (APEX) LDRD was a project conceived to address some of these concerns. Primarily, the research was intended to focus on generating accurate and reproducible low-level performance metrics using tools that could scale to production-class code bases. Alongside this research was an advocacy and analysis role associated with evaluating tools for production use, working with leading industry vendors to develop and refine solutions required by our code teams, and directly engaging with production code developers to form a context for the application analysis and a bridge to the research community within Sandia. On each of these accounts significant progress has been made, particularly, as this report will cover, in the low-level analysis of operations for important classes of algorithms. This report summarizes the development of a collection of tools under the APEX research program and leaves to other SAND and L2 milestone reports the description of codesign progress with Sandia's production users/developers.

  1. Accurate hydrocarbon estimates attained with radioactive isotope

    To make accurate economic evaluations of new discoveries, an oil company needs to know how much gas and oil a reservoir contains. The porous rocks of these reservoirs are not completely filled with gas or oil, but contain a mixture of gas, oil and water. It is extremely important to know what volume percentage of this water--called connate water--is contained in the reservoir rock. The percentage of connate water can be calculated from electrical resistivity measurements made downhole. The accuracy of this method can be improved if a pure sample of connate water can be analyzed or if the chemistry of the water can be determined by conventional logging methods. Because of the similarity of the mud filtrate--the water in a water-based drilling fluid--and the connate water, this is not always possible. If the oil company cannot distinguish between connate water and mud filtrate, its oil-in-place calculations could be incorrect by ten percent or more. It is clear that unless an oil company can be sure that a sample of connate water is pure, or at the very least knows exactly how much mud filtrate it contains, its assessment of the reservoir's water content--and consequently its oil or gas content--will be distorted. The oil companies have opted for the Repeat Formation Tester (RFT) method. Label the drilling fluid with small doses of tritium--a radioactive isotope of hydrogen--and it will be easy to detect and quantify in the sample
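The connate-water fraction discussed above is conventionally derived from downhole resistivity via Archie's equation. A minimal sketch follows; the coefficients `a`, `m`, `n` and the numeric inputs are illustrative textbook defaults, not values from this record:

```python
def water_saturation(phi, r_w, r_t, a=1.0, m=2.0, n=2.0):
    """Archie's equation: S_w = ((a * R_w) / (phi**m * R_t)) ** (1/n),
    with porosity phi, connate-water resistivity R_w and measured
    formation resistivity R_t (both in ohm-m)."""
    return ((a * r_w) / (phi ** m * r_t)) ** (1.0 / n)

# Illustrative case: 20% porosity, R_w = 0.05 ohm-m, measured R_t = 10 ohm-m
s_w = water_saturation(phi=0.20, r_w=0.05, r_t=10.0)
hydrocarbon_fraction = 1.0 - s_w  # pore volume not occupied by water
```

Because S_w scales as R_w^(1/n), a misestimated R_w (e.g., from mud-filtrate contamination of the water sample) propagates directly into the oil-in-place figure, which is the motivation for tracing the filtrate with tritium.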

  2. Towards a more accurate concept of fuels

    Full text: The introduction of LEU in Atucha and the approval of CARA show an advancement of the Argentine power station fuels, which stimulates and shows a direction to follow. In the first case, the use of enriched U fuel relaxes an important restriction related to neutronic economy; that means that it is possible to design less penalized fuels using more Zry. The second case allows a decrease in the linear power of the rods, enabling a better performance of the fuel in normal and also in accident conditions. In this work we wish to emphasize this last point, trying to find a design in which the surface power of the rod is diminished. Hence, in accident conditions owing to lack of coolant, the cladding tube will not reach temperatures that would produce oxidation, with the corresponding H2 formation, nor plasticity enough to form blisters, which would obstruct the reflooding, or hydration that would produce fragility and rupture of the cladding tube, with the corresponding radioactive material dispersion. This work is oriented to finding rod designs with quasi-rectangular geometry to lower the surface power of the rods, in order to obtain a lower central temperature of the rod. Thus, critical temperatures will not be reached in case of lack of coolant. This design is becoming a reality after PPFAE's efforts in search of cladding tube fabrication with different circumferential values, rectangular in particular. This geometry, with an appropriate pellet design, can minimize the pellet-cladding interaction and, through the accurate width selection, non-rectified pellets could be used. This means an important economy in pellet production, as well as an advance in the fabrication of fuels in glove boxes and hot cells in the future. The sequence to determine critical geometrical parameters is described and some rod dispositions are explored

  3. Accurate orbit propagation with planetary close encounters

    Baù, Giulio; Milani Comparetti, Andrea; Guerra, Francesca

    2015-08-01

    We tackle the problem of accurately propagating the motion of those small bodies that undergo close approaches with a planet. The literature is lacking on this topic and the reliability of the numerical results is not sufficiently discussed. The high-frequency components of the perturbation generated by a close encounter make the propagation particularly challenging from the point of view of both the dynamical stability of the formulation and the numerical stability of the integrator. In our approach a fixed step-size and order multistep integrator is combined with a regularized formulation of the perturbed two-body problem. When the propagated object enters the region of influence of a celestial body, the latter becomes the new primary body of attraction. Moreover, the formulation and the step-size will also be changed if necessary. We present: 1) the restarter procedure applied to the multistep integrator whenever the primary body is changed; 2) new analytical formulae for setting the step-size (given the order of the multistep, formulation and initial osculating orbit) in order to control the accumulation of the local truncation error and guarantee the numerical stability during the propagation; 3) a new definition of the region of influence in the phase space. We test the propagator with some real asteroids subject to the gravitational attraction of the planets, the Yarkovsky and relativistic perturbations. Our goal is to show that the proposed approach improves the performance of both the propagator implemented in the OrbFit software package (which is currently used by the NEODyS service) and of the propagator represented by a variable step-size and order multistep method combined with Cowell's formulation (i.e. direct integration of position and velocity in either the physical or a fictitious time).

  4. Fast, accurate standardless XRF analysis with IQ+

    Full text: Due to both chemical and physical effects, the most accurate XRF data are derived from calibrations set up using in-type standards, necessitating some prior knowledge of the samples being analysed. Whilst this is often the case for routine samples, particularly in production control, for completely unknown samples the identification and availability of in-type standards can be problematic. Under these circumstances standardless analysis can offer a viable solution. Successful analysis of completely unknown samples requires a complete chemical overview of the specimen together with the flexibility of a fundamental parameters (FP) algorithm to handle wide-ranging compositions. Although FP algorithms are improving all the time, most still require set-up samples to define the spectrometer response to a particular element. Whilst such materials may be referred to as standards, the emphasis in this kind of analysis is that only a single calibration point is required per element and that the standard chosen does not have to be in-type. The high sensitivities of modern XRF spectrometers, together with recent developments in detector counting electronics that possess a large dynamic range and high-speed data processing capacity bring significant advances to fast, standardless analysis. Illustrated with a tantalite-columbite heavy-mineral concentrate grading use-case, this paper will present the philosophy behind the semi-quantitative IQ+ software and the required hardware. This combination can give a rapid scan-based overview and quantification of the sample in less than two minutes, together with the ability to define channels for specific elements of interest where higher accuracy and lower levels of quantification are required. The accuracy, precision and limitations of standardless analysis will be assessed using certified reference materials of widely differing chemical and physical composition.
Copyright (2002) Australian X-ray Analytical Association Inc

  5. An accurate and efficient experimental approach for characterization of the complex oral microbiota

    Zheng, Wei; Tsompana, Maria; Ruscitto, Angela; Sharma, Ashu; Genco, Robert; Sun, Yijun; Buck, Michael J.

    2015-01-01

    Background Currently, taxonomic interrogation of microbiota is based on amplification of 16S rRNA gene sequences in clinical and scientific settings. Accurate evaluation of the microbiota depends heavily on the primers used, and genus/species resolution bias can arise with amplification of non-representative genomic regions. The latest Illumina MiSeq sequencing chemistry has extended the read length to 300 bp, enabling deep profiling of large number of samples in a single paired-end reaction ...

  6. Accurate Jones Matrix of the Practical Faraday Rotator

    王林斗; 祝昇翔; 李玉峰; 邢文烈; 魏景芝

    2003-01-01

    The Jones matrix of practical Faraday rotators is often used in the engineering calculation of non-reciprocal optical field. Nevertheless, only the approximate Jones matrix of practical Faraday rotators has been presented by now. Based on the theory of polarized light, this paper presents the accurate Jones matrix of practical Faraday rotators. In addition, an experiment has been carried out to verify the validity of the accurate Jones matrix. This matrix accurately describes the optical characteristics of practical Faraday rotators, including rotation, loss and depolarization of the polarized light. The accurate Jones matrix can be used to obtain the accurate results for the practical Faraday rotator to transform the polarized light, which paves the way for the accurate analysis and calculation of practical Faraday rotators in relevant engineering applications.
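The accurate matrix derived in the paper is not reproduced in this record; as a point of reference, the familiar approximate model it improves upon is a rotation combined with a scalar loss (depolarization omitted). A minimal sketch, with all parameter values illustrative:

```python
import math

def faraday_jones(theta, loss=0.0):
    """Approximate Faraday-rotator Jones matrix: rotation by theta combined
    with an amplitude transmission sqrt(1 - loss). The paper's accurate
    matrix additionally models depolarization, which this sketch omits."""
    t = math.sqrt(1.0 - loss)
    return [[t * math.cos(theta), -t * math.sin(theta)],
            [t * math.sin(theta),  t * math.cos(theta)]]

def apply_jones(m, e):
    """Apply a 2x2 Jones matrix to a Jones vector [Ex, Ey]."""
    return [m[0][0] * e[0] + m[0][1] * e[1],
            m[1][0] * e[0] + m[1][1] * e[1]]
```

For example, a lossless 90-degree rotator maps horizontally polarized light `[1, 0]` to vertically polarized light `[0, 1]`.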

  7. Selective pressures for accurate altruism targeting: evidence from digital evolution for difficult-to-test aspects of inclusive fitness theory

    Clune, Jeff; Goldsby, Heather J.; Ofria, Charles; Pennock, Robert T

    2010-01-01

    Inclusive fitness theory predicts that natural selection will favour altruist genes that are more accurate in targeting altruism only to copies of themselves. In this paper, we provide evidence from digital evolution in support of this prediction by competing multiple altruist-targeting mechanisms that vary in their accuracy in determining whether a potential target for altruism carries a copy of the altruist gene. We compete altruism-targeting mechanisms based on (i) kinship (kin targeting),...

  8. Biomimetic Approach for Accurate, Real-Time Aerodynamic Coefficients Project

    National Aeronautics and Space Administration — Aerodynamic and structural reliability and efficiency depends critically on the ability to accurately assess the aerodynamic loads and moments for each lifting...

  9. Accurate formulas for the penalty caused by interferometric crosstalk

    Rasmussen, Christian Jørgen; Liu, Fenghai; Jeppesen, Palle

    2000-01-01

    New simple formulas for the penalty caused by interferometric crosstalk in PIN receiver systems and optically preamplified receiver systems are presented. They are more accurate than existing formulas.

  10. A new, accurate predictive model for incident hypertension

    Völzke, Henry; Fung, Glenn; Ittermann, Till; Yu, Shipeng; Baumeister, Sebastian E; Dörr, Marcus; Lieb, Wolfgang; Völker, Uwe; Linneberg, Allan; Jørgensen, Torben; Felix, Stephan B; Rettig, Rainer; Rao, Bharat; Kroemer, Heyo K

    2013-01-01

    Data mining represents an alternative approach to identify new predictors of multifactorial diseases. This work aimed at building an accurate predictive model for incident hypertension using data mining procedures.

  11. 78 FR 34604 - Submitting Complete and Accurate Information

    2013-06-10

    ... COMMISSION 10 CFR Part 50 Submitting Complete and Accurate Information AGENCY: Nuclear Regulatory Commission... accurate information as would a licensee or an applicant for a license.'' DATES: Submit comments by August... may submit comments by any of the following methods (unless this document describes a different...

  12. A Novel Method for Accurate Operon Predictions in All SequencedProkaryotes

    Price, Morgan N.; Huang, Katherine H.; Alm, Eric J.; Arkin, Adam P.

    2004-12-01

    We combine comparative genomic measures and the distance separating adjacent genes to predict operons in 124 completely sequenced prokaryotic genomes. Our method automatically tailors itself to each genome using sequence information alone, and thus can be applied to any prokaryote. For Escherichia coli K12 and Bacillus subtilis, our method is 85 and 83% accurate, respectively, which is similar to the accuracy of methods that use the same features but are trained on experimentally characterized transcripts. In Halobacterium NRC-1 and in Helicobacter pylori, our method correctly infers that genes in operons are separated by shorter distances than they are in E. coli, and its predictions using distance alone are more accurate than distance-only predictions trained on a database of E. coli transcripts. We use microarray data from six phylogenetically diverse prokaryotes to show that combining intergenic distance with comparative genomic measures further improves accuracy and that our method is broadly effective. Finally, we survey operon structure across 124 genomes, and find several surprises: H. pylori has many operons, contrary to previous reports; Bacillus anthracis has an unusual number of pseudogenes within conserved operons; and Synechocystis PCC6803 has many operons even though it has unusually wide spacings between conserved adjacent genes.
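The intergenic-distance component of such a predictor can be sketched in a few lines: adjacent same-strand genes separated by less than some threshold are grouped into a candidate operon. The 50 bp cutoff and the gene-tuple layout below are illustrative assumptions, not the paper's trained, genome-tailored model:

```python
def predict_operons(genes, max_gap=50):
    """Naive distance-only operon prediction.

    genes: list of (name, strand, start, end) tuples sorted by start
    coordinate. Adjacent genes on the same strand whose intergenic gap
    is <= max_gap bp are grouped into one candidate operon."""
    operons, current = [], [genes[0]]
    for prev, g in zip(genes, genes[1:]):
        same_strand = g[1] == prev[1]
        gap = g[2] - prev[3]  # start of next gene minus end of previous
        if same_strand and gap <= max_gap:
            current.append(g)
        else:
            operons.append(current)
            current = [g]
    operons.append(current)
    return operons
```

The paper's contribution is precisely that this threshold should not be fixed: the distance model is re-estimated per genome and combined with comparative genomic signals.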

  13. Selective pressures for accurate altruism targeting: evidence from digital evolution for difficult-to-test aspects of inclusive fitness theory.

    Clune, Jeff; Goldsby, Heather J; Ofria, Charles; Pennock, Robert T

    2011-03-01

    Inclusive fitness theory predicts that natural selection will favour altruist genes that are more accurate in targeting altruism only to copies of themselves. In this paper, we provide evidence from digital evolution in support of this prediction by competing multiple altruist-targeting mechanisms that vary in their accuracy in determining whether a potential target for altruism carries a copy of the altruist gene. We compete altruism-targeting mechanisms based on (i) kinship (kin targeting), (ii) genetic similarity at a level greater than that expected of kin (similarity targeting), and (iii) perfect knowledge of the presence of an altruist gene (green beard targeting). Natural selection always favoured the most accurate targeting mechanism available. Our investigations also revealed that evolution did not increase the altruism level when all green beard altruists used the same phenotypic marker. The green beard altruism levels stably increased only when mutations that changed the altruism level also changed the marker (e.g. beard colour), such that beard colour reliably indicated the altruism level. For kin- and similarity-targeting mechanisms, we found that evolution was able to stably adjust altruism levels. Our results confirm that natural selection favours altruist genes that are increasingly accurate in targeting altruism to only their copies. Our work also emphasizes that the concept of targeting accuracy must include both the presence of an altruist gene and the level of altruism it produces. PMID:20843843

  14. A Novel PCR-Based Approach for Accurate Identification of Vibrio parahaemolyticus.

    Li, Ruichao; Chiou, Jiachi; Chan, Edward Wai-Chi; Chen, Sheng

    2016-01-01

    A PCR-based assay was developed for more accurate identification of Vibrio parahaemolyticus through targeting the blaCARB-17-like element, an intrinsic β-lactamase gene that may also be regarded as a novel species-specific genetic marker of this organism. Homologous analysis showed that blaCARB-17-like genes were more conservative than the tlh, toxR and atpA genes, the genetic markers commonly used as detection targets in identification of V. parahaemolyticus. Our data showed that this blaCARB-17-specific PCR-based detection approach consistently achieved 100% specificity, whereas PCR targeting the tlh and atpA genes occasionally produced false positive results. Furthermore, a positive result of this test is consistently associated with an intrinsic ampicillin resistance phenotype of the test organism, presumably conferred by the products of blaCARB-17-like genes. We envision that combined analysis of the unique genetic and phenotypic characteristics conferred by blaCARB-17 shall further enhance the detection specificity of this novel yet easy-to-use detection approach to a level superior to the conventional methods used in V. parahaemolyticus detection and identification. PMID:26858713

  15. Gene finding in novel genomes

    Korf Ian

    2004-05-01

    Full Text Available Abstract Background Computational gene prediction continues to be an important problem, especially for genomes with little experimental data. Results I introduce the SNAP gene finder which has been designed to be easily adaptable to a variety of genomes. In novel genomes without an appropriate gene finder, I demonstrate that employing a foreign gene finder can produce highly inaccurate results, and that the most compatible parameters may not come from the nearest phylogenetic neighbor. I find that foreign gene finders are more usefully employed to bootstrap parameter estimation and that the resulting parameters can be highly accurate. Conclusion Since gene prediction is sensitive to species-specific parameters, every genome needs a dedicated gene finder.

  16. Accurately bearing measurement in non-cooperative passive location system

    The system of non-cooperative passive location based on an array is proposed. In the system, the target is detected by beamforming and Doppler matched filtering, and the bearing is measured by a long-baseline interferometer composed of widely separated sub-arrays. For an interferometer with a long baseline, the bearing is measured accurately but ambiguously. To realize unambiguous, accurate bearing measurement, beam width and multiple-constraint adaptive beamforming techniques are used to resolve the azimuth ambiguity. Theory and simulation results show this method is effective for accurate bearing measurement in a non-cooperative passive location system. (authors)
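The ambiguity-resolution step can be illustrated as follows: the long baseline yields a precise but cycle-ambiguous phase, and a coarse unambiguous bearing (here assumed to come from the beamforming stage) selects the correct integer cycle. The geometry and all names below are illustrative assumptions, not the authors' algorithm:

```python
import math

def resolve_bearing(phase_long, d_long, theta_coarse, wavelength):
    """Resolve the integer cycle ambiguity of a long-baseline interferometer.

    phase_long: measured phase in [0, 2*pi) on a baseline of length d_long;
    theta_coarse: coarse but unambiguous bearing (radians) from beamforming.
    Each candidate integer k gives sin(theta) = lambda*(phase/2pi + k)/d;
    the candidate closest to the coarse bearing is returned."""
    best = None
    k_max = int(d_long / wavelength) + 1
    for k in range(-k_max, k_max + 1):
        s = wavelength * (phase_long / (2.0 * math.pi) + k) / d_long
        if -1.0 <= s <= 1.0:
            theta = math.asin(s)
            if best is None or abs(theta - theta_coarse) < abs(best - theta_coarse):
                best = theta
    return best
```

With a 10-wavelength baseline and a coarse bearing within a few degrees of the truth, the correct cycle is selected and the fine bearing is recovered to interferometric precision.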

  17. Accurate calculation of diffraction-limited encircled and ensquared energy.

    Andersen, Torben B

    2015-09-01

    Mathematical properties of the encircled and ensquared energy functions for the diffraction-limited point-spread function (PSF) are presented. These include power series and a set of linear differential equations that facilitate the accurate calculation of these functions. Asymptotic expressions are derived that provide very accurate estimates for the relative amount of energy in the diffraction PSF that falls outside a square or rectangular large detector. Tables with accurate values of the encircled and ensquared energy functions are also presented. PMID:26368873
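For the unobstructed circular aperture, the encircled energy has the classical closed form E(v) = 1 - J0²(v) - J1²(v) in the dimensionless radius v; the paper's series and tables are not reproduced here. A stdlib-only sketch, computing the Bessel functions from their integral representation:

```python
import math

def bessel_j(n, x, steps=2000):
    """J_n(x) via the integral representation (1/pi) * int_0^pi cos(n*t - x*sin t) dt,
    evaluated with the trapezoidal rule."""
    h = math.pi / steps
    s = 0.5 * (math.cos(0.0) + math.cos(n * math.pi - x * math.sin(math.pi)))
    for k in range(1, steps):
        t = k * h
        s += math.cos(n * t - x * math.sin(t))
    return s * h / math.pi

def encircled_energy(v):
    """Fraction of Airy-pattern energy inside dimensionless radius v
    for a diffraction-limited circular aperture: 1 - J0(v)^2 - J1(v)^2."""
    return 1.0 - bessel_j(0, v) ** 2 - bessel_j(1, v) ** 2
```

At the first Airy dark ring (v ≈ 3.832) this evaluates to roughly 84% of the total energy, the familiar figure for the Airy core.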

  18. RSEM: accurate transcript quantification from RNA-Seq data with or without a reference genome

    Dewey Colin N

    2011-08-01

    Full Text Available Abstract Background RNA-Seq is revolutionizing the way transcript abundances are measured. A key challenge in transcript quantification from RNA-Seq data is the handling of reads that map to multiple genes or isoforms. This issue is particularly important for quantification with de novo transcriptome assemblies in the absence of sequenced genomes, as it is difficult to determine which transcripts are isoforms of the same gene. A second significant issue is the design of RNA-Seq experiments, in terms of the number of reads, read length, and whether reads come from one or both ends of cDNA fragments. Results We present RSEM, a user-friendly software package for quantifying gene and isoform abundances from single-end or paired-end RNA-Seq data. RSEM outputs abundance estimates, 95% credibility intervals, and visualization files and can also simulate RNA-Seq data. In contrast to other existing tools, the software does not require a reference genome. Thus, in combination with a de novo transcriptome assembler, RSEM enables accurate transcript quantification for species without sequenced genomes. On simulated and real data sets, RSEM has superior or comparable performance to quantification methods that rely on a reference genome. Taking advantage of RSEM's ability to effectively use ambiguously-mapping reads, we show that accurate gene-level abundance estimates are best obtained with large numbers of short single-end reads. On the other hand, estimates of the relative frequencies of isoforms within single genes may be improved through the use of paired-end reads, depending on the number of possible splice forms for each gene. Conclusions RSEM is an accurate and user-friendly software tool for quantifying transcript abundances from RNA-Seq data. As it does not rely on the existence of a reference genome, it is particularly useful for quantification with de novo transcriptome assemblies. In addition, RSEM has enabled valuable guidance for cost
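RSEM's generative model is much richer than this, but the way expectation-maximization apportions ambiguously-mapping reads among isoforms can be sketched with a toy uniform-compatibility model (all names and the simplifications are illustrative, not RSEM's implementation):

```python
def em_abundance(alignments, n_iso, iters=200):
    """Toy EM for isoform abundance with multi-mapping reads.

    alignments: one list per read of the isoform indices it maps to.
    Each read is split among its compatible isoforms in proportion to the
    current abundance estimates (E-step); abundances are then re-estimated
    from the expected counts (M-step). Returns the isoform fractions."""
    theta = [1.0 / n_iso] * n_iso
    for _ in range(iters):
        counts = [0.0] * n_iso
        for hits in alignments:
            z = sum(theta[i] for i in hits)
            for i in hits:
                counts[i] += theta[i] / z
        total = sum(counts)
        theta = [c / total for c in counts]
    return theta
```

For example, with three reads unique to isoform 0, one unique to isoform 1, and two ambiguous reads, EM converges to allocating the ambiguous reads in a 3:1 ratio, giving fractions of 0.75 and 0.25.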

  19. Accurate wall thickness measurement using autointerference of circumferential Lamb wave

    In this paper, a method of accurately measuring the pipe wall thickness by using a noncontact air-coupled ultrasonic transducer (NAUT) is presented. In this method, accurate measurement of the angular wave number (AWN) is a key technique because the AWN changes minutely with the wall thickness. An autointerference of the circumferential (C-) Lamb wave was used for accurate measurement of the AWN. The principle of the method is first explained. A modified method for measuring the wall thickness near a butt weld line is also proposed, and its accuracy was evaluated to be within 6 μm error. It is also shown in the paper that wall thickness measurement was accurately carried out regardless of differences among the sensors by calibrating the frequency response of the sensors. (author)

  20. Accurate backgrounds to Higgs production at the LHC

    Kauer, N

    2007-01-01

    Corrections of 10-30% for backgrounds to the H --> WW --> l^+ l^- + missing-p_T search in vector boson and gluon fusion at the LHC are reviewed to make the case for precise and accurate theoretical background predictions.

  1. ACCURATE ESTIMATES OF CHARACTERISTIC EXPONENTS FOR SECOND ORDER DIFFERENTIAL EQUATION

    2009-01-01

    In this paper, a second order linear differential equation is considered, and an accurate estimate method of characteristic exponent for it is presented. Finally, we give some examples to verify the feasibility of our result.

  2. Highly Accurate Sensor for High-Purity Oxygen Determination Project

    National Aeronautics and Space Administration — In this STTR effort, Los Gatos Research (LGR) and the University of Wisconsin (UW) propose to develop a highly-accurate sensor for high-purity oxygen determination....

  3. Accurately measuring dynamic coefficient of friction in ultraform finishing

    Briggs, Dennis; Echaves, Samantha; Pidgeon, Brendan; Travis, Nathan; Ellis, Jonathan D.

    2013-09-01

    UltraForm Finishing (UFF) is a deterministic sub-aperture computer numerically controlled grinding and polishing platform designed by OptiPro Systems. UFF is used to grind and polish a variety of optics from simple spherical to fully freeform, and numerous materials from glasses to optical ceramics. The UFF system consists of an abrasive belt around a compliant wheel that rotates and contacts the part to remove material. This work aims to accurately measure the dynamic coefficient of friction (μ), how it changes as a function of belt wear, and how this ultimately affects material removal rates. The coefficient of friction has been examined in terms of contact mechanics and Preston's equation to determine accurate material removal rates. By accurately predicting changes in μ, polishing iterations can be more accurately predicted, reducing the total number of iterations required to meet specifications. We have established an experimental apparatus that can accurately measure μ by measuring triaxial forces during translating loading conditions or while manufacturing the removal spots used to calculate material removal rates. Using this system, we will demonstrate μ measurements for UFF belts during different states of their lifecycle and assess the material removal function from spot diagrams as a function of wear. Ultimately, we will use this system for qualifying belt-wheel-material combinations to develop a spot-morphing model to better predict instantaneous material removal functions.
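The two quantities linked in the abstract can be written down directly: the dynamic coefficient of friction from the measured triaxial forces, and Preston's equation relating removal rate to pressure and velocity. A minimal sketch; the Preston coefficient and numeric values are illustrative, not measured UFF data:

```python
def dynamic_mu(f_tangential, f_normal):
    """Dynamic coefficient of friction from triaxial force measurements:
    ratio of tangential (friction) force to normal (loading) force."""
    return f_tangential / f_normal

def preston_rate(k_p, pressure, velocity):
    """Preston's equation: material removal rate dz/dt = k_p * P * V,
    with Preston coefficient k_p, contact pressure P and relative velocity V."""
    return k_p * pressure * velocity
```

As the belt wears, the measured ratio of tangential to normal force changes; tracking it lets the Preston coefficient (and hence the predicted removal spot) be updated between polishing iterations.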

  4. Accurate level set method for simulations of liquid atomization☆

    Changxiao Shao; Kun Luo; Jianshan Yang; Song Chen; Jianren Fan

    2015-01-01

    Computational fluid dynamics is an efficient numerical approach for spray atomization study, but it is challenging to accurately capture the gas–liquid interface. In this work, an accurate conservative level set method is introduced to accurately track the gas–liquid interfaces in liquid atomization. To validate the capability of this method, binary drop collision and drop impact on a liquid film are investigated. The results are in good agreement with experimental observations. In addition, primary atomization (swirling sheet atomization) is studied using this method. For the swirling sheet atomization, it is found that Rayleigh–Taylor instability in the azimuthal direction causes the primary breakup of the liquid sheet, and complex vortex structures are clustered around the rim of the liquid sheet. The effects of central gas velocity and liquid–gas density ratio on atomization are also investigated. This work lays a solid foundation for further studying the mechanism of spray atomization.

  5. Simple and accurate analytical calculation of shortest path lengths

    Melnik, Sergey

    2016-01-01

    We present an analytical approach to calculating the distribution of shortest path lengths (also called intervertex distances, or geodesic paths) between nodes in unweighted undirected networks. We obtain very accurate results for synthetic random networks with specified degree distribution (the so-called configuration model networks). Our method allows us to accurately predict the distribution of shortest path lengths on real-world networks using their degree distribution, or joint degree-degree distribution. Compared to some other methods, our approach is simpler and yields more accurate results. In order to obtain the analytical results, we use the analogy between an infection reaching a node in $n$ discrete time steps (i.e., as in the susceptible-infected epidemic model) and that node being at a distance $n$ from the source of the infection.
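The analytical prediction can be checked against the exact empirical distribution, obtained by running breadth-first search from every node. A stdlib-only sketch (the adjacency-dict representation is an illustrative choice):

```python
from collections import deque

def path_length_distribution(adj):
    """Exact distribution of shortest path lengths over all ordered pairs
    in an unweighted undirected graph given as {node: set_of_neighbours},
    computed by BFS from every node."""
    counts = {}
    for src in adj:
        dist = {src: 0}
        q = deque([src])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        for node, d in dist.items():
            if node != src:
                counts[d] = counts.get(d, 0) + 1
    total = sum(counts.values())
    return {d: c / total for d, c in counts.items()}
```

On a three-node path a-b-c, for instance, two thirds of the node pairs are at distance 1 and one third at distance 2.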

  6. Accurate and Simple Calibration of DLP Projector Systems

    Wilm, Jakob; Olesen, Oline Vinter; Larsen, Rasmus

    2014-01-01

    Much work has been devoted to the calibration of optical cameras, and accurate and simple methods are now available which require only a small number of calibration targets. The problem of obtaining these parameters for light projectors has not been studied as extensively, and most current methods require a camera and involve feature extraction from a known projected pattern. In this work we present a novel calibration technique for DLP projector systems based on phase shifting profilometry projection onto a printed calibration target. In contrast to most current methods, the one presented here does not rely on an initial camera calibration, and so does not carry over the error into projector calibration. A radial interpolation scheme is used to convert feature coordinates into projector space, thereby allowing for a very accurate procedure. This allows for highly accurate determination of...
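The phase-shifting profilometry step underlying the method recovers, at each pixel, a wrapped phase from N evenly phase-stepped fringe images via the standard N-step least-squares formula. This sketch shows only that textbook formula, not the authors' calibration pipeline:

```python
import math

def wrapped_phase(intensities):
    """Wrapped phase from N evenly stepped samples I_k = A + B*cos(phi + 2*pi*k/N).

    Using sum(I_k sin d_k) = -(N*B/2) sin(phi) and
    sum(I_k cos d_k) = (N*B/2) cos(phi) with d_k = 2*pi*k/N,
    the phase is atan2(-sum_sin, sum_cos)."""
    n = len(intensities)
    s = sum(I * math.sin(2.0 * math.pi * k / n) for k, I in enumerate(intensities))
    c = sum(I * math.cos(2.0 * math.pi * k / n) for k, I in enumerate(intensities))
    return math.atan2(-s, c)
```

The background level A and fringe contrast B cancel out, which is what makes the per-pixel phase robust and the subsequent projector-coordinate lookup accurate.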

  7. Accurate nuclear radii and binding energies from a chiral interaction

    Ekstrom, A; Wendt, K A; Hagen, G; Papenbrock, T; Carlsson, B D; Forssen, C; Hjorth-Jensen, M; Navratil, P; Nazarewicz, W

    2015-01-01

    The accurate reproduction of nuclear radii and binding energies is a long-standing challenge in nuclear theory. To address this problem two-nucleon and three-nucleon forces from chiral effective field theory are optimized simultaneously to low-energy nucleon-nucleon scattering data, as well as binding energies and radii of few-nucleon systems and selected isotopes of carbon and oxygen. Coupled-cluster calculations based on this interaction, named NNLOsat, yield accurate binding energies and radii of nuclei up to 40Ca, and are consistent with the empirical saturation point of symmetric nuclear matter. In addition, the low-lying collective 3^- states in 16O and 40Ca are described accurately, while spectra for selected p- and sd-shell nuclei are in reasonable agreement with experiment.

  8. Accurate Measurement of the Relative Abundance of Different DNA Species in Complex DNA Mixtures

    Jeong, Sangkyun; Yu, Hyunjoo; Pfeifer, Karl

    2012-01-01

    A molecular tool that can compare the abundances of different DNA sequences is necessary for comparing intergenic or interspecific gene expression. We devised and verified such a tool using a quantitative competitive polymerase chain reaction approach. For this approach, we adapted a competitor array, an artificially made plasmid DNA in which all the competitor templates for the target DNAs are arranged with a defined ratio, and melting analysis for allele quantitation for accurate quantitation of the fractional ratios of competitively amplified DNAs. Assays on two sets of DNA mixtures with explicitly known compositional structures of the test sequences were performed. The resultant average relative errors of 0.059 and 0.021 emphasize the highly accurate nature of this method. Furthermore, the method's capability of obtaining biological data is demonstrated by the fact that it can illustrate the tissue-specific quantitative expression signatures of the three housekeeping genes G6pdx, Ubc, and Rps27 by using the forms of the relative abundances of their transcripts, and the differential preferences of Igf2 enhancers for each of the multiple Igf2 promoters for the transcription. PMID:22334570

  9. Accurate upwind-monotone (nonoscillatory) methods for conservation laws

    Huynh, Hung T.

    1992-01-01

    The well-known MUSCL scheme of Van Leer is constructed using a piecewise linear approximation. The MUSCL scheme is second-order accurate in the smooth part of the solution, except at extrema, where the accuracy degenerates to first order due to the monotonicity constraint. To construct accurate schemes that are free from oscillations, the author introduces the concept of upwind monotonicity. Several classes of schemes, which are upwind monotone and of uniform second- or third-order accuracy, are then presented. Results for advection with constant speed are shown. It is also shown that the new scheme compares favorably with state-of-the-art methods.

  10. Equivalent method for accurate solution to linear interval equations

    王冲; 邱志平

    2013-01-01

    Based on linear interval equations, an accurate interval finite element method for solving structural static problems with uncertain parameters in terms of optimization is discussed. On the premise of ensuring the consistency of solution sets, the original interval equations are equivalently transformed into deterministic inequations. On this basis, calculating the structural displacement response with interval parameters is reduced to a number of deterministic linear optimization problems. The results are proved to be accurate with respect to the interval governing equations. Finally, a numerical example is given to demonstrate the feasibility and efficiency of the proposed method.

  11. A COMPARISON STUDY OF DIFFERENT KERNEL FUNCTIONS FOR SVM-BASED CLASSIFICATION OF MULTI-TEMPORAL POLARIMETRY SAR DATA

    B. Yekkehkhany

    2014-10-01

    Full Text Available In this paper, a framework is developed based on Support Vector Machines (SVM) for crop classification using polarimetric features extracted from multi-temporal Synthetic Aperture Radar (SAR) imagery. The multi-temporal integration of data not only improves the overall retrieval accuracy but also provides more reliable estimates with respect to single-date data. Several kernel functions are employed and compared in this study for mapping the input space to a higher-dimensional Hilbert space. These kernel functions include the linear, polynomial, and Radial Basis Function (RBF) kernels. The method is applied to several UAVSAR L-band SAR images acquired over an agricultural area near Winnipeg, Manitoba, Canada. In this research, the temporal alpha features of the H/A/α decomposition method are used in classification. The experimental tests show that an SVM classifier with an RBF kernel applied to three dates of data increases the Overall Accuracy (OA) by up to 3% compared to a linear kernel function, and by up to 1% compared to a 3rd-degree polynomial kernel function.
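
    The three kernel families compared in this record can be sketched directly. The following minimal numpy illustration (synthetic data and parameter values are assumptions for illustration, not the UAVSAR features of the study) checks the defining property that makes each a valid SVM kernel: a symmetric positive semidefinite Gram matrix.

```python
import numpy as np

def linear_kernel(X, Y):
    return X @ Y.T

def poly_kernel(X, Y, degree=3, c=1.0):
    return (X @ Y.T + c) ** degree

def rbf_kernel(X, Y, gamma=0.5):
    # squared distances via ||x - y||^2 = ||x||^2 + ||y||^2 - 2 x.y
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2 * X @ Y.T
    return np.exp(-gamma * sq)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
for name, K in [("linear", linear_kernel(X, X)),
                ("poly-3", poly_kernel(X, X)),
                ("RBF", rbf_kernel(X, X))]:
    # a valid kernel's Gram matrix is symmetric positive semidefinite
    eigs = np.linalg.eigvalsh(K)
    print(name, "symmetric:", np.allclose(K, K.T),
          "min eigenvalue:", round(float(eigs.min()), 6))
```

A classifier trained with any of these kernels implicitly separates the data in the feature space the kernel induces; the RBF Gram matrix additionally has unit diagonal, since every point is at distance zero from itself.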

  12. Detection of subsurface metallic utilities by means of a SAP technique: Comparing MUSIC- and SVM-based approaches

    Meschino, Simone; Pajewski, Lara; Pastorino, Matteo; Randazzo, Andrea; Schettini, Giuseppe

    2013-10-01

    The identification of buried cables, pipes, conduits, and other cylindrical utilities is a very important task in civil engineering. In recent years, several methods have been proposed in the literature for tackling this problem. The most commonly employed approaches are based on the use of Ground Penetrating Radars, i.e., they extract the needed information about the unknown scenario from the electromagnetic field collected by a set of antennas. In the present paper, a statistical method based on smart antenna techniques is used for the localization of a single buried object. In particular, two efficient algorithms for estimating the directions of arrival of the electromagnetic waves scattered by the targets, namely MUltiple SIgnal Classification and Support Vector Regression, are considered and their performances are compared.

  13. Classification of Convective and Stratiform Cells in Meteorological Radar Images Using SVM Based on a Textural Analysis

    Abdenasser Djafri; Boualem Haddad

    2014-01-01

    This contribution deals with the discrimination between stratiform and convective cells in meteorological radar images. The study is based on a textural analysis of these images and their classification using a support vector machine (SVM). First, we compute different textural parameters such as energy, entropy, inertia, and local homogeneity. Through this analysis, we identify the distinct textural features of both the stratiform and convective cells. Then, we use an SVM to find the best discriminating parameter between the two types of clouds. The main goal of this work is to better apply the Marshall-Palmer Z-R relations specific to each type of precipitation.
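
    The textural parameters named in this record are standard gray-level co-occurrence matrix (GLCM) statistics. A self-contained sketch (the tiny 4-level image and single-pixel offset are illustrative assumptions, not the paper's radar data) computes them from first principles:

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=4):
    # normalized gray-level co-occurrence matrix for offset (dx, dy)
    P = np.zeros((levels, levels))
    h, w = img.shape
    for i in range(h - dy):
        for j in range(w - dx):
            P[img[i, j], img[i + dy, j + dx]] += 1
    return P / P.sum()

def texture_features(P):
    i, j = np.indices(P.shape)
    nz = P[P > 0]
    return {
        "energy": (P ** 2).sum(),
        "entropy": -(nz * np.log2(nz)).sum(),
        "inertia": ((i - j) ** 2 * P).sum(),           # a.k.a. contrast
        "homogeneity": (P / (1 + (i - j) ** 2)).sum(), # local homogeneity
    }

img = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [2, 2, 3, 3],
                [2, 2, 3, 3]])
print(texture_features(glcm(img)))
```

In an SVM-based setup like the one described, each image patch would be summarized by such a feature vector before classification.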

  14. SVM-Based Classification of Segmented Airborne LiDAR Point Clouds in Urban Areas

    Xiaogang Ning; Xiangguo Lin; Jixian Zhang

    2013-01-01

    Object-based point cloud analysis (OBPA) is useful for information extraction from airborne LiDAR point clouds. An object-based method for classifying airborne LiDAR point clouds in urban areas is proposed herein. In the classification process, the surface growing algorithm is employed to cluster the point clouds without outliers, thirteen features of the geometry, radiometry, topology and echo characteristics are calculated, a support vector machine (SVM) is ...

  15. SVM-based prediction of propeptide cleavage sites in spider toxins identifies toxin innovation in an Australian tarantula.

    Emily S W Wong

    Full Text Available Spider neurotoxins are commonly used as pharmacological tools and are a popular source of novel compounds with therapeutic and agrochemical potential. Since venom peptides are inherently toxic, the host spider must employ strategies to avoid adverse effects prior to venom use. It is partly for this reason that most spider toxins encode a protective proregion that upon enzymatic cleavage is excised from the mature peptide. In order to identify the mature toxin sequence directly from toxin transcripts, without resorting to protein sequencing, the propeptide cleavage site in the toxin precursor must be predicted bioinformatically. We evaluated different machine learning strategies (support vector machines, hidden Markov model and decision tree and developed an algorithm (SpiderP for prediction of propeptide cleavage sites in spider toxins. Our strategy uses a support vector machine (SVM framework that combines both local and global sequence information. Our method is superior or comparable to current tools for prediction of propeptide sequences in spider toxins. Evaluation of the SVM method on an independent test set of known toxin sequences yielded 96% sensitivity and 100% specificity. Furthermore, we sequenced five novel peptides (not used to train the final predictor from the venom of the Australian tarantula Selenotypus plumipes to test the accuracy of the predictor and found 80% sensitivity and 99.6% 8-mer specificity. Finally, we used the predictor together with homology information to predict and characterize seven groups of novel toxins from the deeply sequenced venom gland transcriptome of S. plumipes, which revealed structural complexity and innovations in the evolution of the toxins. The precursor prediction tool (SpiderP is freely available on ArachnoServer (http://www.arachnoserver.org/spiderP.html, a web portal to a comprehensive relational database of spider toxins. 
All training data, test data, and scripts used are available from the SpiderP website.

  16. On the Use of Time–Frequency Reassignment and SVM-Based Classifier for Audio Surveillance Applications

    Souli S. Sameh

    2014-11-01

    Full Text Available In this paper, we propose a robust approach to environmental sound spectrogram classification for surveillance and security applications, based on the reassignment method and log-Gabor filters. The reassignment method is applied to the spectrogram to improve the readability of the time-frequency representation and to ensure better localization of the signal components. Our approach includes three methods. In the first two, the reassigned spectrograms are passed through appropriate log-Gabor filter banks, and the outputs are averaged and then subjected to an optimal feature selection procedure based on a mutual information criterion. The third method uses the same steps but is applied only to three patches extracted from each reassigned spectrogram. The proposed approach is tested on a large database consisting of 1000 sounds belonging to ten classes. Recognition is based on Multiclass Support Vector Machines.
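
    A log-Gabor filter, the key ingredient in this record, has a frequency response that is Gaussian on a logarithmic frequency axis and vanishes at zero frequency. A minimal 1-D sketch (the center frequencies and bandwidth ratio are illustrative assumptions, not the paper's filter-bank parameters):

```python
import numpy as np

def log_gabor(f, f0, sigma_ratio=0.65):
    # Gaussian on a log-frequency axis; undefined (set to 0) at f = 0
    g = np.zeros_like(f)
    pos = f > 0
    g[pos] = np.exp(-np.log(f[pos] / f0) ** 2 /
                    (2.0 * np.log(sigma_ratio) ** 2))
    return g

f = np.linspace(0.0, 0.5, 513)  # normalized frequency grid
for f0 in (0.05, 0.1, 0.2):
    g = log_gabor(f, f0)
    print(f"f0 = {f0}: peak response {g.max():.3f} at f = {f[g.argmax()]:.3f}")
```

Each filter in such a bank peaks at its center frequency f0, so averaging the filtered reassigned spectrogram yields one feature per band.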

  17. DTC-SVM Based on PI Torque and PI Flux Controllers to Achieve High Performance of Induction Motor

    Hassan Farhan Rashag

    2014-01-01

    Full Text Available The fundamental idea of direct torque control of induction machines is investigated in order to emphasize the effect produced by a given voltage vector on stator flux and torque variations. The proposed control system is based on Space Vector Modulation (SVM) of electrical machines, an improved model reference adaptive system, real-time estimation of the stator resistance, and estimation of the stator flux. The purpose of this control is to minimize electromagnetic torque and flux ripple and to minimize distortion of the stator current. In the proposed method, PI torque and PI flux controllers are designed so that the estimated torque and flux track their references quickly, with no steady-state error. In addition, the PI torque and PI flux controllers optimize the voltages in the d-q reference frame that are applied to the SVM. Simulation results show that the proposed DTC-SVM achieves excellent performance in steady and transient states compared with classical DTC-SVM.

  18. Binary classification SVM-based algorithms with interval-valued training data using triangular and Epanechnikov kernels.

    Utkin, Lev V; Chekh, Anatoly I; Zhuk, Yulia A

    2016-08-01

    Classification algorithms based on different forms of support vector machines (SVMs) for dealing with interval-valued training data are proposed in the paper. L2-norm and L∞-norm SVMs are used for constructing the algorithms. The main idea allowing us to represent the complex optimization problems as a set of simple linear or quadratic programming problems is to approximate the Gaussian kernel by the well-known triangular and Epanechnikov kernels. The minimax strategy is used to choose an optimal probability distribution from the set and to construct optimal separating functions. Numerical experiments illustrate the algorithms. PMID:27179616
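
    The substitution at the heart of these algorithms, replacing the Gaussian kernel profile with compactly supported triangular and Epanechnikov profiles, can be checked with a short sketch (the support width h is an illustrative choice, not taken from the paper; the standard kernel-density forms of the two profiles are assumed):

```python
import numpy as np

def gaussian(u):
    return np.exp(-0.5 * u ** 2)

def triangular(u, h=2.5):
    # piecewise-linear profile, zero outside |u| > h
    return np.clip(1.0 - np.abs(u) / h, 0.0, None)

def epanechnikov(u, h=2.5):
    # piecewise-quadratic profile, zero outside |u| > h
    return np.clip(1.0 - (u / h) ** 2, 0.0, None)

u = np.linspace(-3.0, 3.0, 121)
for name, k in [("triangular", triangular), ("Epanechnikov", epanechnikov)]:
    err = np.max(np.abs(k(u) - gaussian(u)))
    print(f"{name}: max |K(u) - exp(-u^2/2)| = {err:.3f}")
```

Because the approximating profiles are piecewise linear or quadratic, substituting them into the SVM dual turns the optimization into the simple linear or quadratic programs the record mentions.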

  19. Full-polarization radar remote sensing and data mining for tropical crops mapping: a successful SVM-based classification model

    Denize, J.; Corgne, S.; Todoroff, P.; LE Mezo, L.

    2015-12-01

    In Reunion, a tropical island of 2,512 km² located 700 km east of Madagascar in the Indian Ocean and constrained by a rugged relief, agricultural sectors compete for highly fragmented agricultural land made up of heterogeneous farming systems, from corporate to small-scale farming. Policymakers, planners and institutions are in dire need of reliable and updated land use references. In practice, conventional land use mapping methods are inefficient in the tropics, with frequent cloud cover and loosely synchronized vegetative cycles of the crops due to a constant temperature. This study aims to provide an appropriate method for the identification and mapping of tropical crops by remote sensing. For this purpose, we assess the potential of polarimetric SAR imagery associated with machine learning algorithms. The method has been developed and tested on a 25 km × 25 km study area using six full-polarization RADARSAT-2 images acquired in 2014. A set of radar indicators (backscatter coefficients, band ratios, indices, polarimetric decompositions (Freeman-Durden, van Zyl, Yamaguchi, Cloude and Pottier, Krogager), texture, etc.) was calculated from the coherency matrix. A random forest procedure allowed the selection of the most important variables on each image to reduce the dimension of the dataset and the processing time. Support Vector Machines (SVM) allowed the classification of these indicators based on a learning database created from field observations in 2013. The method shows an overall accuracy of 88% with a Kappa index of 0.82 for the identification of four major crops.
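
    The two-stage scheme this record describes, random-forest importance ranking to prune the variables followed by an SVM on the survivors, can be sketched with scikit-learn on synthetic data (all sizes and parameters are illustrative assumptions; the actual study used radar indicators and field observations):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# synthetic stand-in for the radar-indicator table
X, y = make_classification(n_samples=400, n_features=30, n_informative=5,
                           random_state=1)
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=1)

# stage 1: random forest ranks the variables by importance
rf = RandomForestClassifier(n_estimators=200, random_state=1).fit(Xtr, ytr)
top = np.argsort(rf.feature_importances_)[-10:]  # keep the 10 strongest

# stage 2: SVM classifies using only the retained variables
svm = SVC(kernel="rbf", gamma="scale").fit(Xtr[:, top], ytr)
print("overall accuracy:", round(svm.score(Xte[:, top], yte), 3))
```

The pruning step mirrors the paper's motivation: fewer input variables per image means a smaller dataset and shorter SVM training time at little cost in accuracy.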

  20. SVM-based prediction of propeptide cleavage sites in spider toxins identifies toxin innovation in an Australian tarantula.

    Wong, Emily S W; Hardy, Margaret C; Wood, David; Bailey, Timothy; King, Glenn F

    2013-01-01

    Spider neurotoxins are commonly used as pharmacological tools and are a popular source of novel compounds with therapeutic and agrochemical potential. Since venom peptides are inherently toxic, the host spider must employ strategies to avoid adverse effects prior to venom use. It is partly for this reason that most spider toxins encode a protective proregion that upon enzymatic cleavage is excised from the mature peptide. In order to identify the mature toxin sequence directly from toxin transcripts, without resorting to protein sequencing, the propeptide cleavage site in the toxin precursor must be predicted bioinformatically. We evaluated different machine learning strategies (support vector machines, hidden Markov model and decision tree) and developed an algorithm (SpiderP) for prediction of propeptide cleavage sites in spider toxins. Our strategy uses a support vector machine (SVM) framework that combines both local and global sequence information. Our method is superior or comparable to current tools for prediction of propeptide sequences in spider toxins. Evaluation of the SVM method on an independent test set of known toxin sequences yielded 96% sensitivity and 100% specificity. Furthermore, we sequenced five novel peptides (not used to train the final predictor) from the venom of the Australian tarantula Selenotypus plumipes to test the accuracy of the predictor and found 80% sensitivity and 99.6% 8-mer specificity. Finally, we used the predictor together with homology information to predict and characterize seven groups of novel toxins from the deeply sequenced venom gland transcriptome of S. plumipes, which revealed structural complexity and innovations in the evolution of the toxins. The precursor prediction tool (SpiderP) is freely available on ArachnoServer (http://www.arachnoserver.org/spiderP.html), a web portal to a comprehensive relational database of spider toxins. 
All training data, test data, and scripts used are available from the SpiderP website. PMID:23894279
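
    As a concrete illustration of the local sequence information such an SVM consumes, each candidate cleavage position can be encoded as a one-hot window of residues. The window length, alphabet handling, and precursor fragment below are illustrative assumptions, not SpiderP's actual feature construction:

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY"          # 20 standard amino acids
IDX = {a: i for i, a in enumerate(AA)}

def encode_window(win):
    # one feature per (position, residue) pair in the window
    x = np.zeros((len(win), len(AA)))
    for p, a in enumerate(win):
        x[p, IDX[a]] = 1.0
    return x.ravel()

seq = "MKTLVLFAVLLALCSAEERG"          # hypothetical precursor fragment
w = 6                                 # hypothetical window of 6 residues
windows = [seq[i:i + w] for i in range(len(seq) - w + 1)]
X = np.array([encode_window(win) for win in windows])
print("feature matrix:", X.shape)     # (n_windows, w * 20)
```

Each row of X would then be labeled by whether the window straddles a known propeptide cleavage site, giving the training set for a binary SVM.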

  1. SVM-Based Prediction of Propeptide Cleavage Sites in Spider Toxins Identifies Toxin Innovation in an Australian Tarantula

    Wong, Emily S. W.; Hardy, Margaret C.; David Wood; Timothy Bailey; Glenn F. King

    2013-01-01

    Spider neurotoxins are commonly used as pharmacological tools and are a popular source of novel compounds with therapeutic and agrochemical potential. Since venom peptides are inherently toxic, the host spider must employ strategies to avoid adverse effects prior to venom use. It is partly for this reason that most spider toxins encode a protective proregion that upon enzymatic cleavage is excised from the mature peptide. In order to identify the mature toxin sequence directly from toxin tran...

  2. Accurate Period Approximation for Any Simple Pendulum Amplitude

    XUE De-Sheng; ZHOU Zhao; GAO Mei-Zhen

    2012-01-01

    Accurate approximate analytical formulae of the pendulum period, composed of a few elementary functions for any amplitude, are constructed. Based on an approximation of the elliptic integral, two new logarithmic formulae for large amplitudes close to 180° are obtained. Considering the trigonometric function modulation that results from the dependence of the relative error on the amplitude, we realize accurate approximate period expressions for any amplitude between 0 and 180°. A relative error of less than 0.02% is achieved for any amplitude. This kind of modulation is also effective for other large-amplitude logarithmic approximation expressions.
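
    For reference, the exact period that such formulae approximate follows from the complete elliptic integral of the first kind: T/T0 = (2/π) K(k²) with k = sin(θ₀/2) and T0 = 2π√(L/g). A short scipy sketch evaluates this ratio (note that scipy.special.ellipk takes the parameter m = k²):

```python
import numpy as np
from scipy.special import ellipk

def period_ratio(theta0_deg):
    # T / T0 = (2/pi) * K(m), m = sin^2(theta0 / 2)
    m = np.sin(np.radians(theta0_deg) / 2.0) ** 2
    return 2.0 / np.pi * ellipk(m)

for th in (10, 90, 170):
    print(f"theta0 = {th:3d} deg: T/T0 = {period_ratio(th):.4f}")
```

The ratio grows monotonically with amplitude and diverges logarithmically as θ₀ approaches 180°, which is why the paper's large-amplitude formulae are logarithmic.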

  3. Is a Writing Sample Necessary for "Accurate Placement"?

    Sullivan, Patrick; Nielsen, David

    2009-01-01

    The scholarship about assessment for placement is extensive and notoriously ambiguous. Foremost among the questions that continue to be unresolved in this scholarship is this one: Is a writing sample necessary for "accurate placement"? Using a robust data sample of student assessment essays and ACCUPLACER test scores, we put this question to the…

  4. Accurately Detecting Students' Lies regarding Relational Aggression by Correctional Instructions

    Dickhauser, Oliver; Reinhard, Marc-Andre; Marksteiner, Tamara

    2012-01-01

    This study investigates the effect of correctional instructions when detecting lies about relational aggression. Based on models from the field of social psychology, we predict that correctional instruction will lead to a less pronounced lie bias and to more accurate lie detection. Seventy-five teachers received videotapes of students' true denial…

  5. Fast and Accurate Residential Fire Detection Using Wireless Sensor Networks

    Bahrepour, Majid; Meratnia, Nirvana; Havinga, Paul J.M.

    2010-01-01

    Prompt and accurate residential fire detection is important for on-time fire extinguishing and consequently reducing damages and life losses. To detect fire sensors are needed to measure the environmental parameters and algorithms are required to decide about occurrence of fire. Recently, wireless s

  6. Second-order accurate nonoscillatory schemes for scalar conservation laws

    Huynh, Hung T.

    1989-01-01

    Explicit finite difference schemes for the computation of weak solutions of nonlinear scalar conservation laws are presented and analyzed. These schemes are uniformly second-order accurate and nonoscillatory in the sense that the number of extrema of the discrete solution does not increase in time.
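
    A generic example of such a scheme (a textbook MUSCL update with a minmod limiter for linear advection with periodic boundaries, not Huynh's specific construction) shows the nonoscillatory property: a square pulse stays within its initial bounds after a full period.

```python
import numpy as np

def minmod(a, b):
    # limited slope: 0 at sign changes, otherwise the smaller magnitude
    return np.where(a * b > 0, np.sign(a) * np.minimum(np.abs(a), np.abs(b)), 0.0)

def step(u, cfl):
    # limited slopes, upwind (speed > 0) face values, conservative update
    du = minmod(u - np.roll(u, 1), np.roll(u, -1) - u)
    face = u + 0.5 * (1.0 - cfl) * du        # value at the right cell face
    return u - cfl * (face - np.roll(face, 1))

N, cfl = 100, 0.5
x = np.linspace(0.0, 1.0, N, endpoint=False)
u = np.where(np.abs(x - 0.3) < 0.1, 1.0, 0.0)  # square pulse
for _ in range(int(N / cfl)):                   # advect one full period
    u = step(u, cfl)
print("min/max after one period:", round(float(u.min()), 6),
      round(float(u.max()), 6))
```

An unlimited second-order scheme would overshoot at the pulse edges; the minmod limiter suppresses those new extrema at the cost of first-order accuracy there, exactly the trade-off the record discusses.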

  7. Is Expressive Language Disorder an Accurate Diagnostic Category?

    Leonard, Laurence B.

    2009-01-01

    Purpose: To propose that the diagnostic category of "expressive language disorder" as distinct from a disorder of both expressive and receptive language might not be accurate. Method: Evidence that casts doubt on a pure form of this disorder is reviewed from several sources, including the literature on genetic findings, theories of language…

  8. Accurate momentum transfer cross section for the attractive Yukawa potential

    Khrapak, S. A., E-mail: Sergey.Khrapak@dlr.de [Forschungsgruppe Komplexe Plasmen, Deutsches Zentrum für Luft- und Raumfahrt, Oberpfaffenhofen (Germany)

    2014-04-15

    An accurate expression for the momentum transfer cross section for the attractive Yukawa potential is proposed. This simple analytic expression agrees with the numerical results to within ±2% in the regime relevant to ion-particle collisions in complex (dusty) plasmas.

  9. Accurate momentum transfer cross section for the attractive Yukawa potential

    Khrapak, Sergey

    2014-01-01

    An accurate expression for the momentum transfer cross section for the attractive Yukawa potential is proposed. This simple analytic expression agrees with the numerical results to within 2% in the regime relevant to ion-particle collisions in complex (dusty) plasmas.

  10. Accurate momentum transfer cross section for the attractive Yukawa potential

    Khrapak, S. A.

    2014-01-01

    An accurate expression for the momentum transfer cross section for the attractive Yukawa potential is proposed. This simple analytic expression agrees with the numerical results to within ±2% in the regime relevant to ion-particle collisions in complex (dusty) plasmas.

  11. Accurate segmentation of dense nanoparticles by partially discrete electron tomography

    Roelandts, T., E-mail: tom.roelandts@ua.ac.be [IBBT-Vision Lab University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Batenburg, K.J. [IBBT-Vision Lab University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Centrum Wiskunde and Informatica, Science Park 123, 1098 XG Amsterdam (Netherlands); Biermans, E. [EMAT, University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Kuebel, C. [Institute of Nanotechnology, Karlsruhe Institute of Technology, Hermann-von-Helmholtz-Platz 1, 76344 Eggenstein-Leopoldshafen (Germany); Bals, S. [EMAT, University of Antwerp, Groenenborgerlaan 171, 2020 Antwerp (Belgium); Sijbers, J. [IBBT-Vision Lab University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium)

    2012-03-15

    Accurate segmentation of nanoparticles within various matrix materials is a difficult problem in electron tomography. Due to artifacts related to image series acquisition and reconstruction, global thresholding of reconstructions computed by established algorithms, such as weighted backprojection or SIRT, may result in unreliable and subjective segmentations. In this paper, we introduce the Partially Discrete Algebraic Reconstruction Technique (PDART) for computing accurate segmentations of dense nanoparticles of constant composition. The particles are segmented directly by the reconstruction algorithm, while the surrounding regions are reconstructed using continuously varying gray levels. As no properties are assumed for the other compositions of the sample, the technique can be applied to any sample where dense nanoparticles must be segmented, regardless of the surrounding compositions. For both experimental and simulated data, it is shown that PDART yields significantly more accurate segmentations than those obtained by optimal global thresholding of the SIRT reconstruction. -- Highlights: • We present a novel reconstruction method for partially discrete electron tomography. • It accurately segments dense nanoparticles directly during reconstruction. • The gray level to use for the nanoparticles is determined objectively. • The method expands the set of samples for which discrete tomography can be applied.

  12. Efficient and accurate sound propagation using adaptive rectangular decomposition.

    Raghuvanshi, Nikunj; Narain, Rahul; Lin, Ming C

    2009-01-01

    Accurate sound rendering can add significant realism to complement visual display in interactive applications, as well as facilitate acoustic predictions for many engineering applications, like accurate acoustic analysis for architectural design. Numerical simulation can provide this realism most naturally by modeling the underlying physics of wave propagation. However, wave simulation has traditionally posed a tough computational challenge. In this paper, we present a technique which relies on an adaptive rectangular decomposition of 3D scenes to enable efficient and accurate simulation of sound propagation in complex virtual environments. It exploits the known analytical solution of the Wave Equation in rectangular domains, and utilizes an efficient implementation of the Discrete Cosine Transform on Graphics Processors (GPU) to achieve at least a 100-fold performance gain compared to a standard Finite-Difference Time-Domain (FDTD) implementation with comparable accuracy, while also being 10-fold more memory efficient. Consequently, we are able to perform accurate numerical acoustic simulation on large, complex scenes in the kilohertz range. To the best of our knowledge, it was not previously possible to perform such simulations on a desktop computer. Our work thus enables acoustic analysis on large scenes and auditory display for complex virtual environments on commodity hardware. PMID:19590105

  13. On the importance of having accurate data for astrophysical modelling

    Lique, Francois

    2016-06-01

    The Herschel telescope and the ALMA and NOEMA interferometers have opened new windows of observation for wavelengths ranging from the far infrared to the sub-millimeter, with spatial and spectral resolutions previously unmatched. To make the most of these observations, an accurate knowledge of the physical and chemical processes occurring in the interstellar and circumstellar media is essential. In this presentation, I will discuss the current needs of astrophysics in terms of molecular data, and I will show that accurate molecular data are crucial for the proper determination of the physical conditions in molecular clouds. First, I will focus on collisional excitation studies that are needed for modelling molecular lines beyond the Local Thermodynamic Equilibrium (LTE) approach. In particular, I will show how new collisional data for the HCN and HNC isomers, two tracers of star-forming conditions, have allowed solving the problem of their respective abundances in cold molecular clouds. I will also present the latest collisional data that have been computed in order to analyse new highly resolved observations provided by the ALMA interferometer. Then, I will present the calculation of accurate rate constants for the F+H2 → HF+H and Cl+H2 ↔ HCl+H reactions, which have allowed a more accurate determination of the physical conditions in diffuse molecular clouds. I will also present recent work on ortho-para-H2 conversion due to hydrogen exchange, which allows a more accurate determination of the ortho-to-para H2 ratio in the universe and implies a significant revision of the cooling mechanism in astrophysical media.

  14. Occupancy by key transcription factors is a more accurate predictor of enhancer activity than histone modifications or chromatin accessibility

    Dogan, Nergiz; Wu, Weisheng; Morrissey, Christapher S.; Chen, Kuan-Bei; Stonestrom, Aaron; Long, Maria; Keller, Cheryl A.; Cheng, Yong; Jain, Deepti; Visel, Axel; Pennacchio, Len A.; Weiss, Mitchell J.; Blobel, Gerd A.; Hardison, Ross C.

    2015-01-01

    Background Regulated gene expression controls organismal development, and variation in regulatory patterns has been implicated in complex traits. Thus accurate prediction of enhancers is important for further understanding of these processes. Genome-wide measurement of epigenetic features, such as histone modifications and occupancy by transcription factors, is improving enhancer predictions, but the contribution of these features to prediction accuracy is not known. Given the importance of t...

  15. On the relation between gene flow theory and genetic gain

    Bijma, P; Woolliams, John

    2000-01-01

    In conventional gene flow theory the rate of genetic gain is calculated as the summed products of genetic selection differential and asymptotic proportion of genes deriving from sex-age groups. Recent studies have shown that asymptotic proportions of genes predicted from conventional gene flow theory may deviate considerably from true proportions. However, the rate of genetic gain predicted from conventional gene flow theory was accurate. The current note shows that the connection between asy...

  16. A simplified and accurate detection of the genetically modified wheat MON71800 with one calibrator plasmid.

    Kim, Jae-Hwan; Park, Saet-Byul; Roh, Hyo-Jeong; Park, Sunghoon; Shin, Min-Ki; Moon, Gui Im; Hong, Jin-Hwan; Kim, Hae-Yeong

    2015-06-01

    With the increasing number of genetically modified (GM) events, unauthorized GMO releases into the food market have increased dramatically, and many countries have developed detection tools for them. This study describes qualitative and quantitative detection methods for the unauthorized GM wheat MON71800 with a reference plasmid (pGEM-M71800). The wheat acetyl-CoA carboxylase (acc) gene was used as the endogenous gene. The plasmid pGEM-M71800, which contains both the acc gene and the event-specific target MON71800, was constructed as a positive control for the qualitative and quantitative analyses. The limit of detection in the qualitative PCR assay was approximately 10 copies. In the quantitative PCR assay, the standard deviation and relative standard deviation repeatability values ranged from 0.06 to 0.25 and from 0.23% to 1.12%, respectively. This study thus supplies a powerful, simple, and accurate detection strategy for the unauthorized GM wheat MON71800 that utilizes a single calibrator plasmid. PMID:25624198

  17. Accurate Load Modeling Based on Analytic Hierarchy Process

    Zhenshu Wang

    2016-01-01

    Full Text Available Establishing an accurate load model is a critical problem in power system modeling, with significant implications for power system digital simulation and dynamic security analysis. The synthesis load model (SLM) considers the impact of the power distribution network and compensation capacitors, while the randomness of the power load is more precisely described by the traction power system load model (TPSLM). On the basis of these two load models, a load modeling method that combines the synthesis load with the traction power load is proposed in this paper. The method uses the analytic hierarchy process (AHP) to combine the two load models: weight coefficients for the two models are calculated after formulating criteria and judgment matrices, and a synthesis model is then established from these weight coefficients. The effectiveness of the proposed method was examined through simulation. The results show that accurate load modeling based on AHP can effectively improve the accuracy of the load model and prove the validity of the method.
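
    The AHP weighting step described in this record can be sketched as follows: the weights are the normalized principal eigenvector of a pairwise judgment matrix, checked against Saaty's consistency ratio. The 3×3 judgments below are invented for illustration (the paper combines just two load models):

```python
import numpy as np

def ahp_weights(A):
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)                     # principal eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                                 # normalized weights
    n = A.shape[0]
    ci = (vals[k].real - n) / (n - 1)            # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90}[n]   # Saaty's random index
    cr = ci / ri if ri else 0.0                  # consistency ratio
    return w, cr

# judgments: criterion 1 is 3x as important as 2, 5x as important as 3
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])
w, cr = ahp_weights(A)
print("weights:", w.round(3), "consistency ratio:", round(float(cr), 3))
```

A consistency ratio below 0.1 is conventionally taken to mean the judgments are coherent enough; the resulting weights then blend the component models into the synthesis model.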

  18. Accurate adjoint design sensitivities for nano metal optics.

    Hansen, Paul; Hesselink, Lambertus

    2015-09-01

    We present a method for obtaining accurate numerical design sensitivities for metal-optical nanostructures. Adjoint design sensitivity analysis, long used in fluid mechanics and mechanical engineering for both optimization and structural analysis, is beginning to be used for nano-optics design, but it fails for sharp-cornered metal structures because the numerical error in electromagnetic simulations of metal structures is highest at sharp corners. These locations feature strong field enhancement and contribute strongly to design sensitivities. By using high-accuracy FEM calculations and rounding sharp features to a finite radius of curvature we obtain highly-accurate design sensitivities for 3D metal devices. To provide a bridge to the existing literature on adjoint methods in other fields, we derive the sensitivity equations for Maxwell's equations in the PDE framework widely used in fluid mechanics. PMID:26368483

  19. Method for Accurately Calibrating a Spectrometer Using Broadband Light

    Simmons, Stephen; Youngquist, Robert

    2011-01-01

    A novel method has been developed for performing very fine calibration of a spectrometer. This process is particularly useful for modern miniature charge-coupled device (CCD) spectrometers, where a typical factory wavelength calibration has been performed and a finer, more accurate calibration is desired. Typically, the factory calibration is done with a spectral line source that generates light at known wavelengths, allowing specific pixels in the CCD array to be assigned wavelength values. This method is good to about 1 nm across the spectrometer's wavelength range. The new method appears to be accurate to about 0.1 nm, a factor of ten improvement. White light is passed through an unbalanced Michelson interferometer, producing an optical signal with significant spectral variation. A simple theory can be developed to describe this spectral pattern, so by comparing the actual spectrometer output against this predicted pattern, errors in the wavelength assignment made by the spectrometer can be determined.
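
    A sketch of the "simple theory" mentioned above, under the assumption that the interferometer output is the standard two-beam fringe pattern I(λ) ∝ 1 + cos(2πd/λ) for path difference d: fringe maxima occur where d/λ is an integer, so each detected peak carries an absolute wavelength reference. The path difference and wavelength range below are illustrative, not the article's values.

```python
import numpy as np

d = 20_000.0                             # path difference, nm (assumed)
lam = np.linspace(500.0, 600.0, 5001)    # wavelength grid, nm
I = 1.0 + np.cos(2.0 * np.pi * d / lam)  # two-beam fringe pattern

# locate fringe maxima and the wavelengths their integer orders imply
peaks = (I[1:-1] > I[:-2]) & (I[1:-1] > I[2:])
lam_peaks = lam[1:-1][peaks]
orders = np.rint(d / lam_peaks)          # integer fringe orders
lam_pred = d / orders                    # wavelengths implied by the orders
print("max |error| (nm):", round(float(np.abs(lam_peaks - lam_pred).max()), 4))
```

Comparing where a spectrometer reports these peaks against the wavelengths the fringe orders predict reveals its wavelength-assignment error, pixel by pixel.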

  20. Multimodal Spatial Calibration for Accurately Registering EEG Sensor Positions

    Jianhua Zhang

    2014-01-01

    Full Text Available This paper proposes a fast and accurate calibration method for multiple multimodal sensors, using a novel photogrammetry system for fast localization of EEG sensors. The EEG sensors are placed on the human head, and multimodal sensors are installed around the head to simultaneously obtain all EEG sensor positions. A multiple-view calibration process is implemented to obtain the transformations between views. We first develop an efficient local repair algorithm to improve the depth map, and then design a special calibration body. Based on these, accurate and robust calibration results can be achieved. We evaluate the proposed method using the corners of a chessboard calibration plate. Experimental results demonstrate that the proposed method achieves good performance and can be further applied to EEG source localization applications on the human brain.

  1. Accurate multireference study of Si3 electronic manifold

    Goncalves, Cayo Emilio Monteiro; Braga, Joao Pedro

    2016-01-01

    Since it has been shown that the silicon trimer has a highly multi-reference character, accurate multi-reference configuration interaction calculations are performed to elucidate its electronic manifold. Emphasis is given to the long-range part of the potential, aiming to understand the dynamical aspects of atom-diatom collisions and to describe conical intersections and important saddle points along the reactive path. Analysis of the main features of the potential energy surface is performed for benchmarking, and highly accurate values for structures, vibrational constants and energy gaps are reported, as well as the previously unpublished spin-orbit coupling magnitude. The results predict that inter-system crossings will play an important role in dynamical simulations, especially in triplet-state quenching, making the problem of constructing a precise potential energy surface more complicated and multi-layer dependent. The ground state is predicted to be the singlet one, but since the singlet-triplet gap is rather small (2.448 kJ/mol) bo...

  2. Simple and High-Accurate Schemes for Hyperbolic Conservation Laws

    Renzhong Feng

    2014-01-01

    Full Text Available The paper constructs a class of simple high-accurate schemes (SHA schemes) with third-order approximation accuracy in both space and time to solve linear hyperbolic equations, using linear data reconstruction and the Lax-Wendroff scheme. The schemes can be made even fourth-order accurate with a special choice of parameter. In order to avoid spurious oscillations in the vicinity of strong gradients, we make the SHA schemes total variation diminishing (TVD schemes for short) by setting flux limiters in their numerical fluxes, and then extend these schemes to solve the nonlinear Burgers’ equation and Euler equations. The numerical examples show that these schemes give high order of accuracy and high-resolution results. The advantages of these schemes are their simplicity and high order of accuracy.
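    The flux-limiter mechanism can be illustrated on the simplest case, linear advection with the minmod limiter. This is a generic sketch of the TVD idea, not the authors' SHA construction; the 1-D periodic setup and all names are assumptions:

```python
import numpy as np

def minmod(r):
    """Minmod limiter: phi(r) = max(0, min(1, r))."""
    return np.maximum(0.0, np.minimum(1.0, r))

def limited_lw_step(u, c):
    """One step of flux-limited Lax-Wendroff for u_t + a u_x = 0 (a > 0),
    with Courant number c = a*dt/dx in (0, 1] and periodic boundaries.
    The limiter blends first-order upwind (phi = 0) with second-order
    Lax-Wendroff (phi = 1), keeping the scheme total variation diminishing."""
    du = np.roll(u, -1) - u                      # u_{i+1} - u_i
    du_m = u - np.roll(u, 1)                     # u_i - u_{i-1}
    eps = 1e-30                                  # sign-safe guard against 0/0
    r = du_m / (du + np.where(du >= 0, eps, -eps))
    phi = minmod(r)
    # Limited interface value at i+1/2 (advection speed folded into c)
    flux = u + 0.5 * (1.0 - c) * phi * du
    return u - c * (flux - np.roll(flux, 1))

def total_variation(u):
    """Discrete total variation on a periodic grid."""
    return np.abs(np.diff(u)).sum() + abs(u[0] - u[-1])
```

    Advecting a square wave with this step keeps the total variation from growing, which is exactly the non-oscillatory behavior the SHA schemes enforce with their limiters.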

  3. Fixed-Wing Micro Aerial Vehicle for Accurate Corridor Mapping

    Rehak, M.; Skaloud, J.

    2015-08-01

    In this study we present a Micro Aerial Vehicle (MAV) equipped with precise position and attitude sensors that together with a pre-calibrated camera enables accurate corridor mapping. The design of the platform is based on widely available model components to which we integrate an open-source autopilot, customized mass-market camera and navigation sensors. We adapt the concepts of system calibration from larger mapping platforms to MAV and evaluate them practically for their achievable accuracy. We present case studies for accurate mapping without ground control points: first for a block configuration, later for a narrow corridor. We evaluate the mapping accuracy with respect to checkpoints and digital terrain model. We show that while it is possible to achieve pixel-level (3-5 cm) mapping accuracy in both cases, precise aerial position control is sufficient for the block configuration, whereas precise position and attitude control is required for corridor mapping.

  4. Accurate Development of Thermal Neutron Scattering Cross Section Libraries

    Hawari, Ayman; Dunn, Michael

    2014-06-10

    The objective of this project is to develop a holistic (fundamental and accurate) approach for generating thermal neutron scattering cross section libraries for a collection of important neutron moderators and reflectors. The primary components of this approach are the physical accuracy and completeness of the generated data libraries. Consequently, for the first time, thermal neutron scattering cross section data libraries will be generated that are based on accurate theoretical models, that are carefully benchmarked against experimental and computational data, and that contain complete covariance information that can be used in propagating the data uncertainties through the various components of the nuclear design and execution process. To achieve this objective, computational and experimental investigations will be performed on a carefully selected subset of materials that play a key role in all stages of the nuclear fuel cycle.

  5. The FLUKA code: An accurate simulation tool for particle therapy

    Battistoni, Giuseppe; Böhlen, Till T; Cerutti, Francesco; Chin, Mary Pik Wai; Dos Santos Augusto, Ricardo M; Ferrari, Alfredo; Garcia Ortega, Pablo; Kozlowska, Wioletta S; Magro, Giuseppe; Mairani, Andrea; Parodi, Katia; Sala, Paola R; Schoofs, Philippe; Tessonnier, Thomas; Vlachoudis, Vasilis

    2016-01-01

    Monte Carlo (MC) codes are increasingly spreading in the hadrontherapy community due to their detailed description of radiation transport and interaction with matter. The suitability of a MC code for application to hadrontherapy demands accurate and reliable physical models capable of handling all components of the expected radiation field. This becomes extremely important for correctly performing not only physical but also biologically-based dose calculations, especially in cases where ions heavier than protons are involved. In addition, accurate prediction of emerging secondary radiation is of utmost importance in innovative areas of research aiming at in-vivo treatment verification. This contribution will address the recent developments of the FLUKA MC code and its practical applications in this field. Refinements of the FLUKA nuclear models in the therapeutic energy interval lead to an improved description of the mixed radiation field as shown in the presented benchmarks against experimental data with bot...

  6. Efficient and Accurate Robustness Estimation for Large Complex Networks

    Wandelt, Sebastian

    2016-01-01

    Robustness estimation is critical for the design and maintenance of resilient networks, one of the global challenges of the 21st century. Existing studies exploit network metrics to generate attack strategies, which simulate intentional attacks in a network, and compute a metric-induced robustness estimation. While some metrics are easy to compute, e.g. degree centrality, other, more accurate, metrics require considerable computation efforts, e.g. betweenness centrality. We propose a new algorithm for estimating the robustness of a network in sub-quadratic time, i.e., significantly faster than betweenness centrality. Experiments on real-world networks and random networks show that our algorithm estimates the robustness of networks close to or even better than betweenness centrality, while being orders of magnitude faster. Our work contributes towards scalable, yet accurate methods for robustness estimation of large complex networks.
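    For context, a metric-induced robustness estimate of the kind this work accelerates can be sketched as follows: simulate a descending-degree attack and average the surviving giant-component fraction (a Schneider-style R value). This is the baseline approach, not the proposed sub-quadratic algorithm; all names are illustrative:

```python
from collections import defaultdict, deque

def giant_component_size(adj, removed):
    """Size of the largest connected component, ignoring removed nodes (BFS)."""
    seen, best = set(removed), 0
    for start in adj:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nbr in adj[node]:
                if nbr not in seen:
                    seen.add(nbr)
                    queue.append(nbr)
        best = max(best, size)
    return best

def degree_attack_robustness(edges):
    """Robustness R = (1/N) * sum over attack steps of the giant-component
    fraction, removing nodes in static descending-degree order."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    order = sorted(adj, key=lambda n: len(adj[n]), reverse=True)
    n = len(adj)
    removed, total = set(), 0.0
    for node in order:
        removed.add(node)
        total += giant_component_size(adj, removed) / n
    return total / n
```

    Each attack step requires a full connected-components pass, which is why metric-induced estimation becomes expensive on large networks and motivates faster estimators.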

  7. Fast and Accurate Bilateral Filtering using Gauss-Polynomial Decomposition

    Chaudhury, Kunal N.

    2015-01-01

    The bilateral filter is a versatile non-linear filter that has found diverse applications in image processing, computer vision, computer graphics, and computational photography. A widely-used form of the filter is the Gaussian bilateral filter in which both the spatial and range kernels are Gaussian. A direct implementation of this filter requires $O(\sigma^2)$ operations per pixel, where $\sigma$ is the standard deviation of the spatial Gaussian. In this paper, we propose an accurate approxi...
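    A brute-force sketch of the Gaussian bilateral filter (shown in 1-D for brevity; the paper's fast Gauss-polynomial approximation is not reproduced here, and all names are illustrative) makes the per-sample cost explicit:

```python
import numpy as np

def bilateral_filter_1d(signal, sigma_s, sigma_r):
    """Direct (brute-force) Gaussian bilateral filter on a 1-D signal.
    Each output sample averages a window of ~6*sigma_s neighbours, weighted by
    both spatial distance and intensity difference; the 2-D analogue of this
    direct form costs O(sigma_s^2) operations per pixel, which is the cost
    that fast approximations avoid."""
    w = int(3 * sigma_s)
    offsets = np.arange(-w, w + 1)
    spatial = np.exp(-0.5 * (offsets / sigma_s) ** 2)
    out = np.empty(len(signal), dtype=float)
    n = len(signal)
    for i in range(n):
        idx = np.clip(i + offsets, 0, n - 1)        # clamp at the borders
        rng = np.exp(-0.5 * ((signal[idx] - signal[i]) / sigma_r) ** 2)
        wts = spatial * rng
        out[i] = np.dot(wts, signal[idx]) / wts.sum()
    return out
```

    The range kernel is what makes the filter edge-preserving: with a small `sigma_r`, samples across a step edge receive near-zero weight and the edge survives smoothing.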

  8. Accurate Insertion Loss Measurements of the Juno Patch Array Antennas

    Chamberlain, Neil; Chen, Jacqueline; Hodges, Richard; Demas, John

    2010-01-01

    This paper describes two independent methods for estimating the insertion loss of patch array antennas that were developed for the Juno Microwave Radiometer instrument. One method is based principally on pattern measurements while the other is based solely on network analyzer measurements. The methods are accurate to within 0.1 dB for the measured antennas and show good agreement (to within 0.1 dB) with separate radiometric measurements.

  9. Dejavu: An Accurate Energy-Efficient Outdoor Localization System

    Aly, Heba; Youssef, Moustafa

    2013-01-01

    We present Dejavu, a system that uses standard cell-phone sensors to provide accurate and energy-efficient outdoor localization suitable for car navigation. Our analysis shows that different road landmarks have a unique signature on cell-phone sensors; For example, going inside tunnels, moving over bumps, going up a bridge, and even potholes all affect the inertial sensors on the phone in a unique pattern. Dejavu employs a dead-reckoning localization approach and leverages these road landmark...

  10. Accurate Parameter Estimation for Unbalanced Three-Phase System

    Yuan Chen; Hing Cheung So

    2014-01-01

    Smart grid is an intelligent power generation and control console in modern electricity networks, where the unbalanced three-phase power system is the commonly used model. Here, parameter estimation for this system is addressed. After converting the three-phase waveforms into a pair of orthogonal signals via the αβ-transformation, the nonlinear least squares (NLS) estimator is developed for accurately finding the frequency, phase, and voltage parameters. The estimator is realized by the Newt...
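    The αβ-transformation referred to here is the standard amplitude-invariant Clarke transformation. A minimal sketch (the NLS estimation step is omitted, and all names are illustrative):

```python
import numpy as np

def alpha_beta_transform(va, vb, vc):
    """Amplitude-invariant Clarke (alpha-beta) transformation: maps three-phase
    samples onto a pair of orthogonal components, the preprocessing step used
    before nonlinear least-squares parameter estimation."""
    v_alpha = (2.0 / 3.0) * (va - 0.5 * vb - 0.5 * vc)
    v_beta = (2.0 / 3.0) * (np.sqrt(3.0) / 2.0) * (vb - vc)
    return v_alpha, v_beta
```

    For a balanced set A·cos θ, A·cos(θ − 2π/3), A·cos(θ + 2π/3), this returns A·cos θ and A·sin θ, the orthogonal pair on which frequency, phase, and amplitude can then be estimated.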

  11. Accurate, inexpensive testing of laser pointer power for safe operation

    An accurate, inexpensive test-bed for the measurement of optical power emitted from handheld lasers is described. The setup consists of a power meter, optical bandpass filters, an adjustable iris and self-centering lens mounts. We demonstrate this test-bed by evaluating the output power of 23 laser pointers with respect to the limits imposed by the US Code of Federal Regulations. We find a compliance rate of only 26%. A discussion of potential laser pointer hazards is included. (paper)

  12. DOMAC: an accurate, hybrid protein domain prediction server

    Cheng, Jianlin

    2007-01-01

    Protein domain prediction is important for protein structure prediction, structure determination, function annotation, mutagenesis analysis and protein engineering. Here we describe an accurate protein domain prediction server (DOMAC) combining both template-based and ab initio methods. The preliminary version of the server was ranked among the top domain prediction servers in the seventh edition of Critical Assessment of Techniques for Protein Structure Prediction (CASP7), 2006. DOMAC server...

  13. A multiple more accurate Hardy-Littlewood-Polya inequality

    Qiliang Huang

    2012-11-01

    Full Text Available By introducing multi-parameters and conjugate exponents and using Euler-Maclaurin’s summation formula, we estimate the weight coefficient and prove a multiple more accurate Hardy-Littlewood-Polya (H-L-P inequality, which is an extension of some earlier published results. We also prove that the constant factor in the new inequality is the best possible, and obtain its equivalent forms.
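    For reference, one classical form of the Hardy-Littlewood-Polya inequality being extended here (the p = q = 2 case, stated from the classical literature rather than from this record) reads:

```latex
\sum_{m=1}^{\infty}\sum_{n=1}^{\infty}\frac{a_m b_n}{\max(m,n)}
  < 4 \left(\sum_{m=1}^{\infty} a_m^{2}\right)^{1/2}
      \left(\sum_{n=1}^{\infty} b_n^{2}\right)^{1/2},
```

    with 4 the best possible constant. The record's result introduces multi-parameters and conjugate exponents into inequalities of this type and again establishes the best constant factor.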

  14. Shock Emergence in Supernovae: Limiting Cases and Accurate Approximations

    Ro, Stephen

    2013-01-01

    We examine the dynamics of accelerating normal shocks in stratified planar atmospheres, providing accurate fitting formulae for the scaling index relating shock velocity to the initial density and for the post-shock acceleration factor as functions of the polytropic and adiabatic indices which parameterize the problem. In the limit of a uniform initial atmosphere there are analytical formulae for these quantities. In the opposite limit of a very steep density gradient the solutions match the outcome of shock acceleration in exponential atmospheres.

  17. An accurate and robust gyroscope-based pedometer.

    Lim, Yoong P; Brown, Ian T; Khoo, Joshua C T

    2008-01-01

    Pedometers are known to have step-estimation issues, mainly attributable to their acceleration-based sensing. A pedometer based on a micro-machined gyroscope (which has better immunity to acceleration) is proposed. Through syntactic recognition of a priori knowledge of the human shank's dynamics, and temporally precise detection of heel strikes permitted by wavelet decomposition, an accurate and robust pedometer is obtained. PMID:19163737

  18. Accurate calculation of thermal noise in multilayer coating

    Gurkovsky, Alexey; Vyatchanin, Sergey

    2010-01-01

    We derive accurate formulas for thermal fluctuations in multilayer interferometric coating taking into account light propagation inside the coating. In particular, we calculate the reflected wave phase as a function of small displacements of the boundaries between the layers using transmission line model for interferometric coating and derive formula for spectral density of reflected phase in accordance with Fluctuation-Dissipation Theorem. We apply the developed approach for calculation of t...

  19. Novel multi-beam radiometers for accurate ocean surveillance

    Cappellin, C.; Pontoppidan, K.; Nielsen, P. H.;

    2014-01-01

    Novel antenna architectures for real aperture multi-beam radiometers providing high resolution and high sensitivity for accurate sea surface temperature (SST) and ocean vector wind (OVW) measurements are investigated. On the basis of the radiometer requirements set for future SST/OVW missions, conical scanners and push-broom antennas are compared. The comparison covers reflector optics and focal plane array configuration.

  20. A novel automated image analysis method for accurate adipocyte quantification

    Osman, Osman S.; Selway, Joanne L; Kępczyńska, Małgorzata A; Stocker, Claire J.; O’Dowd, Jacqueline F; Cawthorne, Michael A.; Arch, Jonathan RS; Jassim, Sabah; Langlands, Kenneth

    2013-01-01

    Increased adipocyte size and number are associated with many of the adverse effects observed in metabolic disease states. While methods to quantify such changes in the adipocyte are of scientific and clinical interest, manual methods to determine adipocyte size are both laborious and intractable to large scale investigations. Moreover, existing computational methods are not fully automated. We, therefore, developed a novel automatic method to provide accurate measurements of the cross-section...

  1. Strategy Guideline. Accurate Heating and Cooling Load Calculations

    Burdick, Arlan [IBACOS, Inc., Pittsburgh, PA (United States)

    2011-06-01

    This guide presents the key criteria required to create accurate heating and cooling load calculations and offers examples of the implications when inaccurate adjustments are applied to the HVAC design process. The guide shows, through realistic examples, how various defaults and arbitrary safety factors can lead to significant increases in the load estimate. Emphasis is placed on the risks incurred from inaccurate adjustments or ignoring critical inputs of the load calculation.

  3. Evaluation of accurate eye corner detection methods for gaze estimation

    Bengoechea, Jose Javier; Cerrolaza, Juan J.; Villanueva, Arantxa; Cabeza, Rafael

    2014-01-01

    Accurate detection of iris center and eye corners appears to be a promising approach for low cost gaze estimation. In this paper we propose novel eye inner corner detection methods. Appearance and feature based segmentation approaches are suggested. All these methods are exhaustively tested on a realistic dataset containing images of subjects gazing at different points on a screen. We have demonstrated that a method based on a neural network presents the best performance even in light changin...

  4. Building with Drones: Accurate 3D Facade Reconstruction using MAVs

    Daftry, Shreyansh; Hoppe, Christof; Bischof, Horst

    2015-01-01

    Automatic reconstruction of 3D models from images using multi-view Structure-from-Motion methods has been one of the most fruitful outcomes of computer vision. These advances, combined with the growing popularity of Micro Aerial Vehicles as an autonomous imaging platform, have made 3D vision tools ubiquitous for a large number of Architecture, Engineering and Construction applications among audiences mostly unskilled in computer vision. However, to obtain high-resolution and accurate reconstruc...

  5. Mouse models of human AML accurately predict chemotherapy response

    Zuber, Johannes; Radtke, Ina; Pardee, Timothy S.; Zhao, Zhen; Rappaport, Amy R.; Luo, Weijun; McCurrach, Mila E.; Yang, Miao-Miao; Dolan, M. Eileen; Kogan, Scott C.; Downing, James R.; Lowe, Scott W.

    2009-01-01

    The genetic heterogeneity of cancer influences the trajectory of tumor progression and may underlie clinical variation in therapy response. To model such heterogeneity, we produced genetically and pathologically accurate mouse models of common forms of human acute myeloid leukemia (AML) and developed methods to mimic standard induction chemotherapy and efficiently monitor therapy response. We see that murine AMLs harboring two common human AML genotypes show remarkably diverse responses to co...

  6. Accurate Multisteps Traffic Flow Prediction Based on SVM

    Zhang Mingheng; Zhen Yaobao; Hui Ganglong; Chen Gang

    2013-01-01

    Accurate traffic flow prediction is a prerequisite for realizing intelligent traffic control and guidance, and it is also an objective requirement for intelligent traffic management. Due to the strongly nonlinear, stochastic, time-varying characteristics of urban transport systems, artificial intelligence methods such as support vector machine (SVM) are now receiving more and more attention in this research field. Compared with the traditional single-step prediction method, the mul...

  7. Accurate calibration of stereo cameras for machine vision

    Li, Liangfu; Feng, Zuren; Feng, Yuanjing

    2004-01-01

    Camera calibration is an important task for machine vision, whose goal is to obtain the internal and external parameters of each camera. With these parameters, the 3D positions of a scene point, which is identified and matched in two stereo images, can be determined by the triangulation theory. This paper presents a new accurate estimation of CCD camera parameters for machine vision. We present a fast technique to estimate the camera center with special arrangement of calibration target and t...
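    The triangulation step mentioned above is commonly implemented via the linear DLT method once the two camera matrices are known. A minimal sketch of that generic approach (not this paper's specific estimation technique; all names are illustrative):

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover the 3-D point whose projections
    through the 3x4 camera matrices P1, P2 are the matched image points
    x1, x2. Each match contributes two rows of the homogeneous system
    A X = 0, which is solved via SVD (null vector = last row of Vt)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # de-homogenize
```

    With calibrated cameras, the same routine converts every matched stereo pair into a 3D position, which is why the quality of the estimated camera parameters directly bounds the reconstruction accuracy.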

  8. Calibration Techniques for Accurate Measurements by Underwater Camera Systems

    Mark Shortis

    2015-01-01

    Calibration of a camera system is essential to ensure that image measurements result in accurate estimates of locations and dimensions within the object space. In the underwater environment, the calibration must implicitly or explicitly model and compensate for the refractive effects of waterproof housings and the water medium. This paper reviews the different approaches to the calibration of underwater camera systems in theoretical and practical terms. The accuracy, reliability, validation a...

  9. Accurate Method for Determining Adhesion of Cantilever Beams

    Michalske, T.A.; de Boer, M.P.

    1999-01-08

    Using surface micromachined samples, we demonstrate the accurate measurement of cantilever beam adhesion by using test structures which are adhered over long attachment lengths. We show that this configuration has a deep energy well, such that a fracture equilibrium is easily reached. When compared to the commonly used method of determining the shortest attached beam, the present method is much less sensitive to variations in surface topography or to details of capillary drying.

  10. A robust and accurate formulation of molecular and colloidal electrostatics

    Sun, Qiang; Klaseboer, Evert; Chan, Derek Y. C.

    2016-08-01

    This paper presents a re-formulation of the boundary integral method for the Debye-Hückel model of molecular and colloidal electrostatics that removes the mathematical singularities that have to date been accepted as an intrinsic part of the conventional boundary integral equation method. The essence of the present boundary regularized integral equation formulation consists of subtracting a known solution from the conventional boundary integral method in such a way as to cancel out the singularities associated with the Green's function. This approach better reflects the non-singular physical behavior of the systems on boundaries with the benefits of the following: (i) the surface integrals can be evaluated accurately using quadrature without any need to devise special numerical integration procedures, (ii) being able to use quadratic or spline function surface elements to represent the surface more accurately and the variation of the functions within each element is represented to a consistent level of precision by appropriate interpolation functions, (iii) being able to calculate electric fields, even at boundaries, accurately and directly from the potential without having to solve hypersingular integral equations and this imparts high precision in calculating the Maxwell stress tensor and consequently, intermolecular or colloidal forces, (iv) a reliable way to handle geometric configurations in which different parts of the boundary can be very close together without being affected by numerical instabilities, therefore potentials, fields, and forces between surfaces can be found accurately at surface separations down to near contact, and (v) having the simplicity of a formulation that does not require complex algorithms to handle singularities will result in significant savings in coding effort and in the reduction of opportunities for coding errors. These advantages are illustrated using examples drawn from molecular and colloidal electrostatics.