WorldWideScience

Sample records for preprocessing step saving

  1. Evaluating the reliability of different preprocessing steps to estimate graph theoretical measures in resting state fMRI data.

    Science.gov (United States)

    Aurich, Nathassia K; Alves Filho, José O; Marques da Silva, Ana M; Franco, Alexandre R

    2015-01-01

    With resting-state functional MRI (rs-fMRI) there are a variety of post-processing methods that can be used to quantify the human brain connectome. However, there is also a choice of which preprocessing steps will be used prior to calculating the functional connectivity of the brain. In this manuscript, we have tested seven different preprocessing schemes and assessed the reliability between and reproducibility within the various strategies by means of graph theoretical measures. Different preprocessing schemes were tested on a publicly available dataset, which includes rs-fMRI data of healthy controls. The brain was parcellated into 190 nodes and four graph theoretical (GT) measures were calculated: global efficiency (GEFF), characteristic path length (CPL), average clustering coefficient (ACC), and average local efficiency (ALE). Our findings indicate that results can differ significantly based on which preprocessing steps are selected. We also found a dependence between motion and GT measurements in most preprocessing strategies. We conclude that using censoring based on outliers within the functional time series as a preprocessing step increases the reliability of GT measurements and reduces their dependency on head motion.
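
    As an illustration of the graph theoretical (GT) measures named above, a minimal sketch using NetworkX is shown below; the connectivity matrix, threshold and node count are placeholder assumptions rather than the paper's actual pipeline.

```python
# Sketch: graph-theoretical (GT) measures from a functional connectivity matrix.
# `fc` is a placeholder 190-node correlation matrix; the 0.25 threshold is arbitrary.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
fc = np.corrcoef(rng.standard_normal((190, 50)))   # stand-in for rs-fMRI connectivity

adj = (np.abs(fc) > 0.25).astype(int)              # binarize edges by thresholding |r|
np.fill_diagonal(adj, 0)
G = nx.from_numpy_array(adj)

geff = nx.global_efficiency(G)                     # GEFF
acc = nx.average_clustering(G)                     # ACC
ale = nx.local_efficiency(G)                       # ALE
# CPL requires a connected graph; restrict to the largest connected component.
giant = G.subgraph(max(nx.connected_components(G), key=len))
cpl = nx.average_shortest_path_length(giant)       # CPL
print(geff, cpl, acc, ale)
```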

  2. A Conversation on Data Mining Strategies in LC-MS Untargeted Metabolomics: Pre-Processing and Pre-Treatment Steps.

    Science.gov (United States)

    Tugizimana, Fidele; Steenkamp, Paul A; Piater, Lizelle A; Dubery, Ian A

    2016-11-03

    Untargeted metabolomic studies generate information-rich, high-dimensional, and complex datasets that remain challenging to handle and fully exploit. Despite the remarkable progress in the development of tools and algorithms, the "exhaustive" extraction of information from these metabolomic datasets is still a non-trivial undertaking. A conversation on data mining strategies for a maximal information extraction from metabolomic data is needed. Using a liquid chromatography-mass spectrometry (LC-MS)-based untargeted metabolomic dataset, this study explored the influence of collection parameters in the data pre-processing step, of scaling and data transformation on the statistical models generated, and of the subsequent feature selection. Data obtained in positive mode generated from an LC-MS-based untargeted metabolomic study (sorghum plants responding dynamically to infection by a fungal pathogen) were used. Raw data were pre-processed with MarkerLynx(TM) software (Waters Corporation, Manchester, UK). Here, two parameters were varied: the intensity threshold (50-100 counts) and the mass tolerance (0.005-0.01 Da). After the pre-processing, the datasets were imported into SIMCA (Umetrics, Umea, Sweden) for further data cleaning and statistical modeling. In addition, different scaling (unit variance, Pareto, etc.) and data transformation (log and power) methods were explored. The results showed that the pre-processing parameters (or algorithms) influence the output dataset with regard to the number of defined features. Furthermore, the study demonstrates that the pre-treatment of data prior to statistical modeling affects the subspace approximation outcome: e.g., the amount of variation in X-data that the model can explain and predict. The pre-processing and pre-treatment steps subsequently influence the number of statistically significant extracted/selected features (variables). Thus, as informed by the results, to maximize the value of untargeted metabolomic data, understanding
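
    To make the pre-treatment options above concrete, here is a minimal sketch of unit-variance scaling, Pareto scaling and a log transform applied to a synthetic LC-MS feature matrix; the data and parameter choices are assumptions for illustration, not the MarkerLynx/SIMCA workflow itself.

```python
# Sketch of common pre-treatment options for an LC-MS feature matrix X (samples x features).
# The intensities are synthetic placeholders; operations are applied column-wise, per feature.
import numpy as np

X = np.random.default_rng(1).lognormal(size=(20, 500))   # hypothetical feature intensities

def unit_variance(X):
    return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

def pareto(X):
    # Pareto scaling divides by the square root of the standard deviation,
    # shrinking large-intensity features less aggressively than unit variance.
    return (X - X.mean(axis=0)) / np.sqrt(X.std(axis=0, ddof=1))

def log_transform(X, offset=1.0):
    return np.log10(X + offset)   # offset guards against log(0)

X_uv, X_pareto, X_log = unit_variance(X), pareto(X), log_transform(X)
```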

  3. THREE PRE-PROCESSING STEPS TO INCREASE THE QUALITY OF KINECT RANGE DATA

    Directory of Open Access Journals (Sweden)

    M. Davoodianidaliki

    2013-09-01

    Full Text Available With technology developing at its current rate and the increasing use of active sensors in close-range photogrammetry and computer vision, range images have become the main new type of data added to the existing collection. Although the main product derived from these data is the point cloud, range images themselves can be considered important pieces of information. Being a bridge between 2D and 3D data allows them to hold unique and important attributes. Three such properties are taken advantage of in this study. The first attribute considered is the "neighborhood of null pixels", which adds a new field describing the accuracy of parameters to the point cloud. This new field can be used later for data registration and integration: when there is a conflict between points of different stations, those with the lower accuracy value can be discarded. Next, polynomial fitting to known plane regions is applied. This step can help to smooth the final point cloud and applies only to some applications; classification and region tracking in a series of images is needed for this process to be applicable. Finally, there are break-lines created by errors in the data transfer software. A break-line is caused by the loss of some pixels during data transfer and storage, and the image shifts along the break-line. This error usually occurs when the camera moves fast and the processor cannot handle the transfer process entirely. The proposed method is based on edge detection, where horizontal lines are used to recognize the break-line and near-vertical lines are used to determine the shift value.
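
    A minimal sketch of the first property, deriving a per-pixel reliability field from the distance to the nearest null pixel of a range image, might look like the following; the depth array and normalization are assumptions for illustration.

```python
# Sketch: per-pixel "accuracy" field from the neighborhood of null pixels in a range image.
# `depth` is a synthetic 2-D array where 0 marks a null (no-return) pixel.
import numpy as np
from scipy import ndimage

depth = np.random.default_rng(2).uniform(0.5, 4.0, size=(480, 640))
depth[200:220, 300:340] = 0.0                            # simulate a patch of null returns

valid = depth > 0
dist_to_null = ndimage.distance_transform_edt(valid)     # distance of each valid pixel to the nearest null
accuracy = dist_to_null / dist_to_null.max()             # normalized field, attachable to each point
```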

  4. China Vows Tangible Steps to Save Energy and Cut Consumption

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    China will take tangible measures to save energy and cut pollution in 2008. The country will continue eliminating outdated production facilities, including small thermal power generating units with a combined capacity of 13 million kilowatts, and facilities with 50 million tons of cement, 6 million tons of steel and 14 million tons of iron.

  5. Spectral Difference in the Image Domain for Large Neighborhoods, a GEOBIA Pre-Processing Step for High Resolution Imagery

    Directory of Open Access Journals (Sweden)

    Roeland de Kok

    2012-08-01

    Full Text Available Contrast plays an important role in the visual interpretation of imagery. To mimic visual interpretation and using contrast in a Geographic Object Based Image Analysis (GEOBIA environment, it is useful to consider an analysis for single pixel objects. This should be done before applying homogeneity criteria in the aggregation of pixels for the construction of meaningful image objects. The habit or “best practice” to start GEOBIA with pixel aggregation into homogeneous objects should come with the awareness that feature attributes for single pixels are at risk of becoming less accessible for further analysis. Single pixel contrast with image convolution on close neighborhoods is a standard technique, also applied in edge detection. This study elaborates on the analysis of close as well as much larger neighborhoods inside the GEOBIA domain. The applied calculations are limited to the first segmentation step for single pixel objects in order to produce additional feature attributes for objects of interest to be generated in further aggregation processes. The equation presented functions at a level that is considered an intermediary product in the sequential processing of imagery. The procedure requires intensive processor and memory capacity. The resulting feature attributes highlight not only contrasting pixels (edges but also contrasting areas of local pixel groups. The suggested approach can be extended and becomes useful in classifying artificial areas at national scales using high resolution satellite mosaics.
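
    A rough sketch of the idea of comparing a single pixel against the mean of a large neighborhood is given below; the band, window size and filtering choice are illustrative assumptions, not the exact equation presented in the paper.

```python
# Sketch: single-pixel spectral difference against the mean of a large neighborhood,
# approximating a contrast layer for later object aggregation. Band and window are assumptions.
import numpy as np
from scipy.ndimage import uniform_filter

band = np.random.default_rng(3).random((1000, 1000)).astype(np.float32)  # placeholder image band

window = 51                                        # large neighborhood (51 x 51 pixels)
local_mean = uniform_filter(band, size=window, mode="reflect")
spectral_diff = band - local_mean                  # >0: brighter than surroundings; <0: darker
```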

  6. Preprocessing of NMR metabolomics data.

    Science.gov (United States)

    Euceda, Leslie R; Giskeødegård, Guro F; Bathen, Tone F

    2015-05-01

    Metabolomics involves the large scale analysis of metabolites and thus, provides information regarding cellular processes in a biological sample. Independently of the analytical technique used, a vast amount of data is always acquired when carrying out metabolomics studies; this results in complex datasets with large amounts of variables. This type of data requires multivariate statistical analysis for its proper biological interpretation. Prior to multivariate analysis, preprocessing of the data must be carried out to remove unwanted variation such as instrumental or experimental artifacts. This review aims to outline the steps in the preprocessing of NMR metabolomics data and describe some of the methods to perform these. Since using different preprocessing methods may produce different results, it is important that an appropriate pipeline exists for the selection of the optimal combination of methods in the preprocessing workflow.

  7. LANDSAT data preprocessing

    Science.gov (United States)

    Austin, W. W.

    1983-01-01

    The effect on LANDSAT data of a Sun angle correction, an intersatellite LANDSAT-2 and LANDSAT-3 data range adjustment, and the atmospheric correction algorithm was evaluated. Fourteen 1978 crop year LACIE sites were used as the site data set. The preprocessing techniques were applied to multispectral scanner channel data and transformed data were plotted and used to analyze the effectiveness of the preprocessing techniques. Ratio transformations effectively reduce the need for preprocessing techniques to be applied directly to the data. Subtractive transformations are more sensitive to Sun angle and atmospheric corrections than ratios. Preprocessing techniques, other than those applied at the Goddard Space Flight Center, should only be applied as an option of the user. While performed on LANDSAT data, the study results are also applicable to meteorological satellite data.
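
    The observation that ratio transformations are less sensitive to Sun angle effects can be illustrated with a small sketch, under the simplifying assumption that illumination acts as a multiplicative factor on band radiances; the numbers are synthetic.

```python
# Sketch: a multiplicative Sun-angle factor cancels in a band ratio but not in a band difference.
# Radiance values and the solar zenith angle are illustrative, not LACIE site data.
import numpy as np

rng = np.random.default_rng(4)
band_a, band_b = rng.uniform(10, 200, 1000), rng.uniform(10, 200, 1000)
cos_zenith = np.cos(np.deg2rad(40.0))                     # assumed solar zenith angle

raw_a, raw_b = band_a * cos_zenith, band_b * cos_zenith   # uncorrected signals
ratio = raw_a / raw_b                                     # equals band_a / band_b: illumination cancels
difference = raw_a - raw_b                                # still scaled by cos_zenith: correction needed
corrected_difference = raw_a / cos_zenith - raw_b / cos_zenith
```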

  8. Development and integration of block operations for data invariant automation of digital preprocessing and analysis of biological and biomedical Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Turner, Robin F B

    2015-06-01

    High-throughput information extraction from large numbers of Raman spectra is becoming an increasingly taxing problem due to the proliferation of new applications enabled using advances in instrumentation. Fortunately, in many of these applications, the entire process can be automated, yielding reproducibly good results with significant time and cost savings. Information extraction consists of two stages, preprocessing and analysis. We focus here on the preprocessing stage, which typically involves several steps, such as calibration, background subtraction, baseline flattening, artifact removal, smoothing, and so on, before the resulting spectra can be further analyzed. Because the results of some of these steps can affect the performance of subsequent ones, attention must be given to the sequencing of steps, the compatibility of these sequences, and the propensity of each step to generate spectral distortions. We outline here important considerations to effect full automation of Raman spectral preprocessing: what is considered full automation; putative general principles to effect full automation; the proper sequencing of processing and analysis steps; conflicts and circularities arising from sequencing; and the need for, and approaches to, preprocessing quality control. These considerations are discussed and illustrated with biological and biomedical examples reflecting both successful and faulty preprocessing.

  9. Irrigation with desalinated water: A step toward increasing water saving and crop yields

    Science.gov (United States)

    Silber, Avner; Israeli, Yair; Elingold, Idan; Levi, Menashe; Levkovitch, Irit; Russo, David; Assouline, Shmuel

    2015-01-01

    We examined the impact of two different approaches to managing irrigation water salinity: salt leaching from the field ("conventional" management) and water desalination before field application ("alternative" management). Freshwater commonly used for irrigation (FW) and desalinated water (DS) were applied to the high-water-demanding crop banana at four different rates. Both irrigation rate and water salinity significantly affected yield. DS application consistently produced higher yields than FW, independently of irrigation rate. The highest yield for FW-irrigation was achieved with the highest irrigation rate, whereas the same yield was obtained in the case of DS-irrigation with practically half the amount of water. Yield decreased with FW-irrigation, even when the water salinity, ECi, was lower than the limit considered safe for soil and crops. Irrigating with FW supplied a massive amount of salt, which accumulated in the rhizosphere, increasing the osmotic potential of the soil solution and impairing plant water uptake. Furthermore, under the "conventional" management, a significant amount of salt is leached from the rhizosphere, accumulates in deeper soil layers, and eventually reaches groundwater reservoirs, thus contributing to the deterioration of both soil and water quality. Removal of excess salt from the water before it reaches the field by means of DS-irrigation may save significant amounts of irrigation water by reducing the salt leaching requirements, while increasing yield, improving fruit quality, and decreasing the salt load in the groundwater.

  10. The Registration of Knee Joint Images with Preprocessing

    Directory of Open Access Journals (Sweden)

    Zhenyan Ji

    2011-06-01

    Full Text Available The registration of CT and MR images is important for analyzing the effect of PCL and ACL deficiency on the knee joint. Because CT and MR images have different limitations, we need to register CT and MR images of the knee joint and then build a model to analyze the stress distribution on the knee joint. In our project, we adopt image registration based on mutual information. In the knee joint images, the information about adipose, muscle and other soft tissue affects the registration accuracy. To eliminate this interference, we propose a combined preprocessing solution, BEBDO, which consists of five steps: image blurring, image enhancement, image blurring, image edge detection and image outline preprocessing. We also designed the algorithm for image outline preprocessing. At the end of the paper, an experiment is done to compare the image registration results without and with the preprocessing. The results prove that the preprocessing can improve the image registration accuracy.
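
    A minimal sketch of the mutual information similarity measure underlying this registration approach is given below, using a simple joint-histogram estimate on synthetic images; it is an illustration, not the authors' implementation.

```python
# Sketch: histogram-based mutual information between two images, the similarity measure
# that drives mutual-information registration. The images here are synthetic stand-ins.
import numpy as np

def mutual_information(img_a, img_b, bins=32):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0                                   # avoid log(0) terms
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

ct = np.random.default_rng(5).random((128, 128))
mr = 0.6 * ct + 0.4 * np.random.default_rng(6).random((128, 128))  # loosely related "MR" image
print(mutual_information(ct, mr))
```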

  11. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a pre-processing stage for any type of problem statement. It plays an especially important role in fields such as soft computing and cloud computing, where data are manipulated, for example by scaling the range of the data down or up, before they are used in a further stage. There are many normalization techniques, namely Min-Max normalization, Z-score normalization and Decimal scaling normalization. So by referring to these normalization techniques we are ...
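
    A short sketch of the three normalization techniques named above, applied to an arbitrary example vector, could look as follows.

```python
# Sketch of the three normalization techniques named above, on an arbitrary example vector.
import numpy as np

x = np.array([120.0, 35.0, 250.0, 99.0, 5.0])

min_max = (x - x.min()) / (x.max() - x.min())      # Min-Max: rescales to [0, 1]
z_score = (x - x.mean()) / x.std(ddof=1)           # Z-score: zero mean, unit variance
j = np.ceil(np.log10(np.abs(x).max()))             # smallest power of 10 covering max |x|
decimal_scaled = x / (10 ** j)                     # Decimal scaling: values fall within (-1, 1)
```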

  12. High speed preprocessing system

    Indian Academy of Sciences (India)

    M Sankar Kishore

    2000-10-01

    In systems employing tracking, the area of interest is recognized using a high resolution camera and is handed over to the low resolution receiver. The images seen by the low resolution receiver and by the operator through the high resolution camera are different in spatial resolution. In order to establish the correlation between these two images, the high-resolution camera image needs to be preprocessed and made similar to the low-resolution receiver image. This paper discusses the implementation of a suitable preprocessing technique, with emphasis on developing a system in both hardware and software to reduce processing time. By applying different software/hardware techniques, the execution time has been brought down from a few seconds to a few milliseconds for a typical set of conditions. The hardware is designed around i486 processors and software is developed in PL/M. The system is tested to match the images obtained by two different sensors of the same scene. The hardware and software have been evaluated with different sets of images.

  13. Forensic considerations for preprocessing effects on clinical MDCT scans.

    Science.gov (United States)

    Wade, Andrew D; Conlogue, Gerald J

    2013-05-01

    Manipulation of digital photographs destined for medico-legal inquiry must be thoroughly documented and presented with explanation of any manipulations. Unlike digital photography, computed tomography (CT) data must pass through an additional step before viewing. Reconstruction of raw data involves reconstruction algorithms to preprocess the raw information into display data. Preprocessing of raw data, although it occurs at the source, alters the images and must be accounted for in the same way as postprocessing. Repeated CT scans of a gunshot wound phantom were made using the Toshiba Aquilion 64-slice multidetector CT scanner. The appearance of fragments, high-density inclusion artifacts, and soft tissue were assessed. Preprocessing with different algorithms results in substantial differences in image output. It is important to appreciate that preprocessing affects the image, that it does so differently in the presence of high-density inclusions, and that preprocessing algorithms and scanning parameters may be used to overcome the resulting artifacts.

  14. A review of statistical methods for preprocessing oligonucleotide microarrays.

    Science.gov (United States)

    Wu, Zhijin

    2009-12-01

    Microarrays have become an indispensable tool in biomedical research. This powerful technology not only makes it possible to quantify a large number of nucleic acid molecules simultaneously, but also produces data with many sources of noise. A number of preprocessing steps are therefore necessary to convert the raw data, usually in the form of hybridisation images, to measures of biological meaning that can be used in further statistical analysis. Preprocessing of oligonucleotide arrays includes image processing, background adjustment, data normalisation/transformation and sometimes summarisation when multiple probes are used to target one genomic unit. In this article, we review the issues encountered in each preprocessing step and introduce the statistical models and methods in preprocessing.
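
    As one concrete example of the normalisation step mentioned above, here is a minimal sketch of quantile normalization on a synthetic probe-intensity matrix; it is an illustration of the general technique rather than any specific package's implementation.

```python
# Sketch: quantile normalization, a common choice in oligonucleotide-array preprocessing.
# The intensity matrix (probes x arrays) is synthetic; ties are ignored for simplicity.
import numpy as np

X = np.random.default_rng(7).lognormal(size=(1000, 6))    # placeholder probe intensities

ranks = X.argsort(axis=0).argsort(axis=0)                  # rank of each probe within its array
mean_quantiles = np.sort(X, axis=0).mean(axis=1)           # reference distribution: mean of sorted columns
X_qnorm = mean_quantiles[ranks]                            # every array now shares the same distribution
```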

  15. Preprocessing of raw metabonomic data.

    Science.gov (United States)

    Vettukattil, Riyas

    2015-01-01

    Recent advances in metabolic profiling techniques allow global profiling of metabolites in cells, tissues, or organisms, using a wide range of analytical techniques such as nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS). The raw data acquired from these instruments are abundant with technical and structural complexity, which makes it statistically difficult to extract meaningful information. Preprocessing involves various computational procedures where data from the instruments (gas chromatography (GC)/liquid chromatography (LC)-MS, NMR spectra) are converted into a usable form for further analysis and biological interpretation. This chapter covers the common data preprocessing techniques used in metabonomics and is primarily focused on baseline correction, normalization, scaling, peak alignment, detection, and quantification. Recent years have witnessed development of several software tools for data preprocessing, and an overview of the frequently used tools in data preprocessing pipeline is covered.

  16. Solid Earth ARISTOTELES mission data preprocessing simulation of gravity gradiometer

    Science.gov (United States)

    Avanzi, G.; Stolfa, R.; Versini, B.

    Data preprocessing of the ARISTOTELES mission, which measures the Earth gravity gradient in a near polar orbit, was studied. The mission measures the gravity field at sea level through indirect measurements performed on the orbit, so that the evaluation steps consist in processing data from GRADIO accelerometer measurements. Due to the physical phenomena involved in the data collection experiment, it is possible to isolate at an initial stage a preprocessing of the gradiometer data based only on GRADIO measurements and not needing a detailed knowledge of the attitude and attitude rate sensors output. This preprocessing produces intermediate quantities used in future stages of the reduction. Software was designed and run to evaluate for this level of data reduction the achievable accuracy as a function of knowledge on instrument and satellite status parameters. The architecture of this element of preprocessing is described.

  17. MODIStsp: An R package for automatic preprocessing of MODIS Land Products time series

    Science.gov (United States)

    Busetto, L.; Ranghetti, L.

    2016-12-01

    MODIStsp is a new R package that automates the creation of raster time series derived from MODIS Land Products. It performs several preprocessing steps (e.g. download, mosaicking, reprojection and resizing) on MODIS products over a selected time period and area. All processing parameters can be set with a user-friendly GUI, allowing users to select which specific layers of the original MODIS HDF files have to be processed and which Quality Indicators have to be extracted from the aggregated MODIS Quality Assurance layers. Moreover, the tool allows on-the-fly computation of time series of Spectral Indexes (either standard or custom-specified by the user through the GUI) from surface reflectance bands. Outputs are saved as single-band rasters corresponding to each available acquisition date and output layer. Virtual files allowing easy access to the entire time series as a single file using common image processing/GIS software or R scripts can also be created. Non-interactive execution within an R script and stand-alone execution outside an R environment exploiting a previously created Options File are also possible, the latter allowing MODIStsp runs to be scheduled so that a time series is automatically updated when a new image is available. The proposed software constitutes a very useful tool for the Remote Sensing community, since it allows all the main preprocessing steps required for the creation of time series of MODIS data to be performed within a common framework, without requiring any particular programming skills of its users.

  18. Data preprocessing in data mining

    CERN Document Server

    García, Salvador; Herrera, Francisco

    2015-01-01

    Data Preprocessing for Data Mining addresses one of the most important issues within the well-known Knowledge Discovery from Data process. Data taken directly from the source will likely have inconsistencies or errors and, most importantly, will not be ready for a data mining process. Furthermore, the increasing amount of data in recent science, industry and business applications calls for more complex tools to analyze it. Thanks to data preprocessing, it is possible to convert the impossible into possible, adapting the data to fulfill the input demands of each data mining algorithm. Data preprocessing includes the data reduction techniques, which aim at reducing the complexity of the data, detecting or removing irrelevant and noisy elements from the data. This book is intended to review the tasks that fill the gap between the data acquisition from the source and the data mining process. A comprehensive look from a practical point of view, including basic concepts and surveying t...

  19. Exploration, visualization, and preprocessing of high-dimensional data.

    Science.gov (United States)

    Wu, Zhijin; Wu, Zhiqiang

    2010-01-01

    The rapid advances in biotechnology have given rise to a variety of high-dimensional data. Many of these data, including DNA microarray data, mass spectrometry protein data, and high-throughput screening (HTS) assay data, are generated by complex experimental procedures that involve multiple steps such as sample extraction, purification and/or amplification, labeling, fragmentation, and detection. Therefore, the quantity of interest is not directly obtained and a number of preprocessing procedures are necessary to convert the raw data into the format with biological relevance. This also makes exploratory data analysis and visualization essential steps to detect possible defects, anomalies or distortion of the data, to test underlying assumptions and thus ensure data quality. The characteristics of the data structure revealed in exploratory analysis often motivate decisions in preprocessing procedures to produce data suitable for downstream analysis. In this chapter we review the common techniques in exploring and visualizing high-dimensional data and introduce the basic preprocessing procedures.

  20. Optimal Preprocessing Of GPS Data

    Science.gov (United States)

    Wu, Sien-Chong; Melbourne, William G.

    1994-01-01

    Improved technique for preprocessing data from Global Positioning System receiver reduces processing time and number of data to be stored. Optimal in sense that it maintains strength of data. Also increases ability to resolve ambiguities in numbers of cycles of received GPS carrier signals.

  1. A New Indicator for Optimal Preprocessing and Wavelengths Selection of Near-Infrared Spectra

    NARCIS (Netherlands)

    Skibsted, E.; Boelens, H.F.M.; Westerhuis, J.A.; Witte, D.T.; Smilde, A.K.

    2004-01-01

    Preprocessing of near-infrared spectra to remove unwanted, i.e., non-related spectral variation and selection of informative wavelengths is considered to be a crucial step prior to the construction of a quantitative calibration model. The standard methodology when comparing various preprocessing

  2. A New Indicator for Optimal Preprocessing and Wavelengths Selection of Near-Infrared Spectra

    NARCIS (Netherlands)

    Skibsted, E.; Boelens, H.F.M.; Westerhuis, J.A.; Witte, D.T.; Smilde, A.K.

    2004-01-01

    Preprocessing of near-infrared spectra to remove unwanted, i.e., non-related spectral variation and selection of informative wavelengths is considered to be a crucial step prior to the construction of a quantitative calibration model. The standard methodology when comparing various preprocessing tec

  3. A New Indicator for Optimal Preprocessing and Wavelengths Selection of Near-Infrared Spectra

    NARCIS (Netherlands)

    E. Skibsted; H.F.M. Boelens; J.A. Westerhuis; D.T. Witte; A.K. Smilde

    2004-01-01

    Preprocessing of near-infrared spectra to remove unwanted, i.e., non-related spectral variation and selection of informative wavelengths is considered to be a crucial step prior to the construction of a quantitative calibration model. The standard methodology when comparing various preprocessing tec

  4. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...... series forecasting models....

  5. Preprocessing of compressed digital video

    Science.gov (United States)

    Segall, C. Andrew; Karunaratne, Passant V.; Katsaggelos, Aggelos K.

    2000-12-01

    Pre-processing algorithms improve on the performance of a video compression system by removing spurious noise and insignificant features from the original images. This increases compression efficiency and attenuates coding artifacts. Unfortunately, determining the appropriate amount of pre-filtering is a difficult problem, as it depends on both the content of an image and the target bit-rate of the compression algorithm. In this paper, we explore a pre-processing technique that is loosely coupled to the quantization decisions of a rate control mechanism. This technique results in a pre-processing system that operates directly on the Displaced Frame Difference (DFD) and is applicable to any standard-compatible compression system. Results explore the effect of several standard filters on the DFD. An adaptive technique is then considered.

  6. Evaluating the impact of image preprocessing on iris segmentation

    Directory of Open Access Journals (Sweden)

    José F. Valencia-Murillo

    2014-08-01

    Full Text Available Segmentation is one of the most important stages in iris recognition systems. In this paper, image preprocessing algorithms are applied in order to evaluate their impact on successful iris segmentation. The preprocessing algorithms are based on histogram adjustment, Gaussian filters and suppression of specular reflections in human eye images. The segmentation method introduced by Masek is applied on 199 images acquired under unconstrained conditions, belonging to the CASIA-irisV3 database, before and after applying the preprocessing algorithms. Then, the impact of the image preprocessing algorithms on the percentage of successful iris segmentation is evaluated by means of a visual inspection of images, to determine whether the circumferences of the iris and pupil were detected correctly. An increase from 59% to 73% in the percentage of successful iris segmentation is obtained with an algorithm that combines elimination of specular reflections followed by a Gaussian filter with a 5x5 kernel. The results highlight the importance of a preprocessing stage as a prior step to improve performance during the edge detection and iris segmentation processes.
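
    A rough sketch of the best-performing combination reported above, specular-reflection suppression followed by a 5x5 Gaussian filter, is shown below using OpenCV; the synthetic image, the threshold-plus-inpainting approach to reflection removal, and the parameter values are assumptions for illustration.

```python
# Sketch: specular-reflection suppression (threshold + inpainting, an assumed method)
# followed by a 5x5 Gaussian filter, as in the best-performing pipeline reported above.
# The "eye" image is synthetic so that the snippet is self-contained.
import cv2
import numpy as np

eye = np.clip(np.random.default_rng(8).normal(120, 30, (280, 320)), 0, 255).astype(np.uint8)
eye[100:104, 150:156] = 255                          # simulate a small specular highlight

_, specular_mask = cv2.threshold(eye, 230, 255, cv2.THRESH_BINARY)   # bright, saturated pixels
specular_mask = cv2.dilate(specular_mask, np.ones((3, 3), np.uint8))
no_specular = cv2.inpaint(eye, specular_mask, 3, cv2.INPAINT_TELEA)

smoothed = cv2.GaussianBlur(no_specular, (5, 5), 0)  # 5x5 Gaussian kernel
# `smoothed` would then be passed to the segmentation method (e.g., Masek's).
```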

  7. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting......, there is so far no systematic research to study and compare their performance. How to select effective techniques of feature preprocessing in a forecasting model remains a problem. In this paper, the authors conduct a comprehensive study of existing feature preprocessing techniques to evaluate their empirical...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...

  8. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate), regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  9. Preprocessing and Analysis of LC-MS-Based Proteomic Data.

    Science.gov (United States)

    Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W

    2016-01-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS based proteomics. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from LC-MS based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed.

  10. Measuring Savings

    OpenAIRE

    Mark Schreiner

    2001-01-01

    Development depends on saving. But what exactly is saving, and how is it measured? This paper defines saving and describes several measures of financial savings. The measures account for the passage of time and for the three stages of saving: putting in (depositing), keeping in (maintaining a balance), and taking out (withdrawing). Together, the different measures capture how people move financial resources through time.

  11. Preprocessing and Morphological Analysis in Text Mining

    Directory of Open Access Journals (Sweden)

    Krishna Kumar Mohbey; Sachin Tiwari

    2011-12-01

    Full Text Available This paper is based on the preprocessing activities performed by software or language translators before applying mining algorithms to large data. Text mining is an important area of data mining, and it plays a vital role in extracting useful information from a huge database or data warehouse. But before applying text mining or an information extraction process, preprocessing is a must, because the given data or dataset may contain noisy, incomplete, inconsistent, dirty and unformatted data. In this paper we try to collect the necessary requirements for preprocessing. Once the preprocessing task is complete, we can easily extract meaningful information using a mining strategy. This paper also provides information about the analysis of data, such as tokenization and stemming, and about semantic analysis, such as phrase recognition and parsing. It also collects the procedures for preprocessing data, i.e., it describes how stemming, tokenization or parsing are applied.
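
    A minimal sketch of the tokenization and stemming steps mentioned above, using a simple regular-expression tokenizer and NLTK's Porter stemmer, is given below; the sample sentence is made up.

```python
# Sketch of tokenization and stemming for text-mining preprocessing.
# Uses a simple regex tokenizer and NLTK's Porter stemmer (no corpus download required).
import re
from nltk.stem import PorterStemmer

text = "Preprocessing of noisy, unformatted documents improves later mining results."

tokens = re.findall(r"[a-z]+", text.lower())      # tokenization: keep alphabetic word forms
stemmer = PorterStemmer()
stems = [stemmer.stem(t) for t in tokens]         # stemming collapses inflected forms, e.g. 'documents' -> 'document'
print(stems)
```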

  12. Facilitating Watermark Insertion by Preprocessing Media

    Directory of Open Access Journals (Sweden)

    Matt L. Miller

    2004-10-01

    Full Text Available There are several watermarking applications that require the deployment of a very large number of watermark embedders. These applications often have severe budgetary constraints that limit the computation resources that are available. Under these circumstances, only simple embedding algorithms can be deployed, which have limited performance. In order to improve performance, we propose preprocessing the original media. It is envisaged that this preprocessing occurs during content creation and has no budgetary or computational constraints. Preprocessing combined with simple embedding creates a watermarked Work, the performance of which exceeds that of simple embedding alone. However, this performance improvement is obtained without any increase in the computational complexity of the embedder. Rather, the additional computational burden is shifted to the preprocessing stage. A simple example of this procedure is described and experimental results confirm our assertions.

  13. Preprocessing: A Step in Automating Early Detection of Cervical Cancer

    CERN Document Server

    Das, Abhishek; Bhattacharyya, Debasis

    2011-01-01

    Uterine Cervical Cancer is one of the most common forms of cancer in women worldwide. Most cases of cervical cancer can be prevented through screening programs aimed at detecting precancerous lesions. During Digital Colposcopy, colposcopic images or cervigrams are acquired in raw form. They contain specular reflections (SR), which appear as bright spots heavily saturated with white light and occur due to the presence of moisture on the uneven cervix surface. The cervix region occupies about half of the raw cervigram image. Other parts of the image contain irrelevant information, such as equipment, frames, text and non-cervix tissues. This irrelevant information can confuse automatic identification of the tissues within the cervix. Therefore we focus on the cervical borders, so that we have a geometric boundary on the relevant image area. Our novel technique eliminates the SR, identifies the region of interest and makes the cervigram ready for segmentation algorithms.

  14. Optimization of miRNA-seq data preprocessing.

    Science.gov (United States)

    Tam, Shirley; Tsao, Ming-Sound; McPherson, John D

    2015-11-01

    The past two decades of microRNA (miRNA) research has solidified the role of these small non-coding RNAs as key regulators of many biological processes and promising biomarkers for disease. The concurrent development in high-throughput profiling technology has further advanced our understanding of the impact of their dysregulation on a global scale. Currently, next-generation sequencing is the platform of choice for the discovery and quantification of miRNAs. Despite this, there is no clear consensus on how the data should be preprocessed before conducting downstream analyses. Often overlooked, data preprocessing is an essential step in data analysis: the presence of unreliable features and noise can affect the conclusions drawn from downstream analyses. Using a spike-in dilution study, we evaluated the effects of several general-purpose aligners (BWA, Bowtie, Bowtie 2 and Novoalign), and normalization methods (counts-per-million, total count scaling, upper quartile scaling, Trimmed Mean of M, DESeq, linear regression, cyclic loess and quantile) with respect to the final miRNA count data distribution, variance, bias and accuracy of differential expression analysis. We make practical recommendations on the optimal preprocessing methods for the extraction and interpretation of miRNA count data from small RNA-sequencing experiments.
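
    Two of the normalization methods compared above, counts-per-million and upper-quartile scaling, can be sketched as follows on a synthetic count matrix; this is an illustration of the general formulas, not the study's exact implementation.

```python
# Sketch: counts-per-million and upper-quartile normalization of a synthetic
# miRNA count matrix (miRNAs x samples).
import numpy as np

counts = np.random.default_rng(9).poisson(50, size=(300, 4)).astype(float)

cpm = counts / counts.sum(axis=0) * 1e6                   # counts-per-million per sample

expressed = counts[counts.sum(axis=1) > 0]                # drop features with zero counts everywhere
uq = np.percentile(expressed, 75, axis=0)                 # per-sample upper quartile
uq_scaled = counts / uq * uq.mean()                       # rescale samples to a common quartile
```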

  15. Household Savings

    DEFF Research Database (Denmark)

    Browning, Martin; Lusardi, Annamaria

    In this survey, we review the recent theoretical and empirical literature on household saving and consumption. The discussion is structured around a list of motives for saving and how well the standard theory captures these motives. We show that almost all of the motives for saving that have been...

  16. Preprocessing of ionospheric echo Doppler spectra

    Institute of Scientific and Technical Information of China (English)

    FANG Liang; ZHAO Zhengyu; WANG Feng; SU Fanfan

    2007-01-01

    Real-time information on the distant ionosphere can be acquired using the Wuhan ionospheric oblique backscattering sounding system (WIOBSS), which adopts a discontinuous wave mechanism. After the characteristics of the ionospheric echo Doppler spectra were analyzed, the signal preprocessing developed in this paper aimed at improving the Doppler spectra. The results indicate that the preprocessing not only gives the system a higher target-detection ability but also suppresses radio frequency interference by 6-7 dB.

  17. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Directory of Open Access Journals (Sweden)

    Nathan W Churchill

    Full Text Available BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest, and between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.

  18. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Science.gov (United States)

    Churchill, Nathan W; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest, and between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.

  19. Preprocessing Moist Lignocellulosic Biomass for Biorefinery Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Neal Yancey; Christopher T. Wright; Craig Conner; J. Richard Hess

    2009-06-01

    Biomass preprocessing is one of the primary operations in the feedstock assembly system of a lignocellulosic biorefinery. Preprocessing is generally accomplished using industrial grinders to format biomass materials into a suitable biorefinery feedstock for conversion to ethanol and other bioproducts. Many factors affect machine efficiency and the physical characteristics of preprocessed biomass. For example, moisture content of the biomass as received from the point of production has a significant impact on overall system efficiency and can significantly affect the characteristics (particle size distribution, flowability, storability, etc.) of the size-reduced biomass. Many different grinder configurations are available on the market, each with advantages under specific conditions. Ultimately, the capacity and/or efficiency of the grinding process can be enhanced by selecting the grinder configuration that optimizes grinder performance based on moisture content and screen size. This paper discusses the relationships of biomass moisture with respect to preprocessing system performance and product physical characteristics and compares data obtained on corn stover, switchgrass, and wheat straw as model feedstocks during Vermeer HG 200 grinder testing. During the tests, grinder screen configuration and biomass moisture content were varied and tested to provide a better understanding of their relative impact on machine performance and the resulting feedstock physical characteristics and uniformity relative to each crop tested.

  1. Efficient Preprocessing technique using Web log mining

    Science.gov (United States)

    Raiyani, Sheetal A.; jain, Shailendra

    2012-11-01

    Web usage mining can be described as the discovery and analysis of user access patterns through mining of log files and associated data from a particular website. Large numbers of visitors interact daily with web sites around the world; enormous amounts of data are being generated, and this information can be very valuable to a company for understanding customer behavior. In this paper a complete preprocessing approach comprising data cleaning and user and session identification activities is presented to improve the quality of the data. An efficient preprocessing technique for user identification, a key issue in the preprocessing phase, is to identify the unique web users. Traditional user identification is based on the site structure, supported by some heuristic rules, which reduces the efficiency of user identification. To solve this difficulty we introduce the proposed technique DUI (Distinct User Identification), based on IP address, agent, session time and pages referred within the desired session time. It can be used in counter-terrorism, fraud detection and detection of unusual access to secure data, and, through detection of users' regular access behavior, to improve the overall design and performance of subsequent preprocessing.
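
    The user and session identification idea can be sketched as follows: requests are grouped by (IP address, user agent) and a new session starts after an inactivity timeout. The log records, field layout and 30-minute threshold are assumptions for illustration rather than the proposed DUI technique itself.

```python
# Sketch: grouping web-log requests into users (IP + agent) and time-bounded sessions.
from datetime import datetime, timedelta
from collections import defaultdict

raw_log = [
    ("10.0.0.1", "Mozilla/5.0", "2012-11-05 10:00:12", "/index.html"),
    ("10.0.0.1", "Mozilla/5.0", "2012-11-05 10:05:40", "/products.html"),
    ("10.0.0.1", "Mozilla/5.0", "2012-11-05 11:20:03", "/index.html"),   # gap > 30 min: new session
    ("10.0.0.2", "curl/7.68",   "2012-11-05 10:01:00", "/robots.txt"),
]

SESSION_TIMEOUT = timedelta(minutes=30)
sessions = defaultdict(list)                  # (ip, agent, session_no) -> list of pages
last_seen, session_no = {}, defaultdict(int)

for ip, agent, ts, page in raw_log:
    t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
    user = (ip, agent)
    if user in last_seen and t - last_seen[user] > SESSION_TIMEOUT:
        session_no[user] += 1                 # timeout exceeded: start a new session for this user
    last_seen[user] = t
    sessions[(ip, agent, session_no[user])].append(page)

print(dict(sessions))
```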

  2. A PREPROCESSING LS-CMA IN HIGHLY CORRUPTIVE ENVIRONMENT

    Institute of Scientific and Technical Information of China (English)

    Guo Yan; Fang Dagang; Thomas N.C.Wang; Liang Changhong

    2002-01-01

    A fast preprocessing Least Square-Constant Modulus Algorithm (LS-CMA) is proposed for blind adaptive beamforming. This new preprocessing method precludes noise capture caused by the original LS-CMA with the preprocessing procedure controlled by the static Constant Modulus Algorithm (CMA). The simulation results have shown that the proposed fast preprocessing LS-CMA can effectively reject the co-channel interference, and quickly lock onto the constant modulus desired signal with only one snapshot in a highly corruptive environment.

  3. The preprocessing of multispectral data. II. [of Landsat satellite

    Science.gov (United States)

    Quiel, F.

    1976-01-01

    It is pointed out that a correction of atmospheric effects is an important requirement for a full utilization of the possibilities provided by preprocessing techniques. The most significant characteristics of original and preprocessed data are considered, taking into account the solution of classification problems by means of the preprocessing procedure. Improvements obtainable with different preprocessing techniques are illustrated with the aid of examples involving Landsat data regarding an area in Colorado.

  4. Preprocessing of GPR data for syntactic landmine detection and classification

    Science.gov (United States)

    Nasif, Ahmed O.; Hintz, Kenneth J.; Peixoto, Nathalia

    2010-04-01

    Syntactic pattern recognition is being used to detect and classify non-metallic landmines in terms of their range impedance discontinuity profile. This profile, extracted from the ground penetrating radar's return signal, constitutes a high-range-resolution and unique description of the inner structure of a landmine. In this paper, we discuss two preprocessing steps necessary to extract such a profile, namely, inverse filtering (deconvolving) and binarization. We validate the use of an inverse filter to effectively decompose the observed composite signal resulting from the different layers of dielectric materials of a landmine. It is demonstrated that the transmitted radar waveform undergoing multiple reflections with different materials does not change appreciably, and mainly depends on the transmit and receive processing chains of the particular radar being used. Then, a new inversion approach for the inverse filter is presented based on the cumulative contribution of the different frequency components to the original Fourier spectrum. We discuss the tradeoffs and challenges involved in such a filter design. The purpose of the binarization scheme is to localize the impedance discontinuities in range, by assigning a '1' to the peaks of the inverse filtered output, and '0' to all other values. The paper is concluded with simulation results showing the effectiveness of the proposed preprocessing technique.
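
    A rough sketch of the two steps described above, regularized inverse filtering (deconvolution) of a trace by the transmitted waveform followed by binarization of the resulting peaks, is given below; the waveform, trace and thresholds are synthetic and the regularization choice is an assumption.

```python
# Sketch: regularized inverse filtering of a GPR trace by the transmitted waveform,
# followed by binarization of the peaks. All signals and thresholds are synthetic.
import numpy as np

rng = np.random.default_rng(10)
n = 512
wavelet = np.exp(-0.5 * ((np.arange(64) - 32) / 6.0) ** 2) * np.cos(0.8 * np.arange(64))
reflectivity = np.zeros(n)
reflectivity[[120, 135, 300]] = [1.0, -0.6, 0.8]           # impedance discontinuities
trace = np.convolve(reflectivity, wavelet)[:n] + 0.01 * rng.standard_normal(n)

W = np.fft.rfft(wavelet, n)                                # waveform spectrum, zero-padded
T = np.fft.rfft(trace)
eps = 0.05 * np.max(np.abs(W)) ** 2                        # regularization against near-zero division
estimate = np.fft.irfft(T * np.conj(W) / (np.abs(W) ** 2 + eps), n)

binary = (np.abs(estimate) > 0.5 * np.abs(estimate).max()).astype(int)   # '1' at impedance peaks
```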

  5. Optimization of Preprocessing and Densification of Sorghum Stover at Full-scale Operation

    Energy Technology Data Exchange (ETDEWEB)

    Neal A. Yancey; Jaya Shankar Tumuluru; Craig C. Conner; Christopher T. Wright

    2011-08-01

    Transportation costs can be a prohibitive step in bringing biomass to a preprocessing location or biofuel refinery. One alternative to transporting biomass in baled or loose format to a preprocessing location is to utilize a mobile preprocessing system that can be relocated to the various locations where biomass is stored, preprocess and densify the biomass, then ship it to the refinery as needed. The Idaho National Laboratory has a full-scale 'Process Demonstration Unit' (PDU), which includes a stage 1 grinder, hammer mill, drier, pellet mill, and cooler with the associated conveyance system components. Testing at bench and pilot scale has been conducted to determine the effects of moisture on preprocessing and of crop variety on preprocessing efficiency and product quality. The INL's PDU provides an opportunity to test the conclusions made at bench and pilot scale on full industrial-scale systems. Each component of the PDU is operated from a central operating station where data are collected to determine power consumption rates for each step in the process. The power for each electrical motor in the system is monitored from the control station to watch for problems and determine optimal conditions for system performance. The data can then be viewed to observe how changes in biomass input parameters (moisture and crop type, for example), mechanical changes (screen size, biomass drying, pellet size, grinding speed, etc.), or other variations affect the power consumption of the system. Sorghum in four-foot round bales was tested in the system using a series of 6 different screen sizes: 3/16 in., 1 in., 2 in., 3 in., 4 in., and 6 in. The effect on power consumption, product quality, and production rate was measured to determine optimal conditions.

  6. Saving Energy

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    More and more Chinese consumers are beginning to think about their energy-saving needs when buying apartments. The energy-saving estate market is huge, which provides great opportunities for companies in the business, especially foreign investors. But just how big is this market and how is it developing? Which fields in this industry are suitable for foreign businesses? With these questions in mind, Beijing Review spoke with Li Shuren, Vice Secretary General of the China Real Estate and Housing Research ...

  7. Pre-processing in AI based Prediction of QSARs

    CERN Document Server

    Patri, Om Prasad

    2009-01-01

    Machine learning, data mining and artificial intelligence (AI) based methods have been used to determine the relations between chemical structure and biological activity, called quantitative structure activity relationships (QSARs) for the compounds. Pre-processing of the dataset, which includes the mapping from a large number of molecular descriptors in the original high dimensional space to a small number of components in the lower dimensional space while retaining the features of the original data, is the first step in this process. A common practice is to use a mapping method for a dataset without prior analysis. This pre-analysis has been stressed in our work by applying it to two important classes of QSAR prediction problems: drug design (predicting anti-HIV-1 activity) and predictive toxicology (estimating hepatocarcinogenicity of chemicals). We apply one linear and two nonlinear mapping methods on each of the datasets. Based on this analysis, we conclude the nature of the inherent relationships betwee...

  8. Multiple Criteria Decision-Making Preprocessing Using Data Mining Tools

    CERN Document Server

    Mosavi, A

    2010-01-01

    Real-life engineering optimization problems need Multiobjective Optimization (MOO) tools. These problems are highly nonlinear. As the process of Multiple Criteria Decision-Making (MCDM) has expanded greatly, most MOO problems in different disciplines can be classified on the basis of it. Thus MCDM methods have gained wide popularity in different sciences and applications. Meanwhile, the increasing number of components, variables, parameters, constraints and objectives involved in the process has made the process very complicated. Although the new generation of MOO tools has made the optimization process more automated, initializing the process, setting the initial values of the simulation tools, and identifying the effective input variables and objectives in order to reach a smaller design space are still complicated. In this situation, adding a preprocessing step into the MCDM procedure could make a huge difference in terms of organizing the input variables according to their effects on the optimizati...

  9. Pre-processing Tasks in Indonesian Twitter Messages

    Science.gov (United States)

    Hidayatullah, A. F.; Ma’arif, M. R.

    2017-01-01

    Twitter text messages are very noisy. Moreover, tweet data are unstructured and quite complicated. The focus of this work is to investigate pre-processing techniques for Twitter messages in Bahasa Indonesia. The main goal of this experiment is to clean the tweet data for further analysis. Thus, the objective of this pre-processing task is simply to remove all meaningless characters and keep the valuable words. In this research, we divide our proposed pre-processing experiments into two parts. The first part is a common pre-processing task. The second part is a specific pre-processing task for tweet data. From the experimental results we can conclude that by employing a specific pre-processing task related to the characteristics of tweet data we obtain a more valuable result: far fewer meaningless words remain, compared to the result obtained by running only the common pre-processing tasks.
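
    A minimal sketch of the distinction between a common cleaning pass and a tweet-specific cleaning pass (not the authors' exact rules; stopword removal and Indonesian-specific normalization are omitted) might look as follows.

```python
import re

def common_preprocess(text: str) -> str:
    """Generic cleanup: lowercase, drop punctuation/digits, squeeze spaces."""
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def tweet_preprocess(text: str) -> str:
    """Tweet-specific cleanup applied before the common step."""
    text = re.sub(r"\bRT\b", " ", text)          # retweet marker
    text = re.sub(r"https?://\S+", " ", text)    # URLs
    text = re.sub(r"[@#]\w+", " ", text)         # mentions and hashtags
    return common_preprocess(text)

print(tweet_preprocess("RT @user: Macet parah di jalan Kaliurang!! http://t.co/xyz #infojogja"))
```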

  10. Pre-Processing Effect on the Accuracy of Event-Based Activity Segmentation and Classification through Inertial Sensors

    Directory of Open Access Journals (Sweden)

    Benish Fida

    2015-09-01

    Full Text Available Inertial sensors are increasingly being used to recognize and classify physical activities in a variety of applications. For monitoring and fitness applications, it is crucial to develop methods able to segment each activity cycle, e.g., a gait cycle, so that the successive classification step may be more accurate. To increase detection accuracy, pre-processing is often used, with a concurrent increase in computational cost. In this paper, the effect of pre-processing operations on the detection and classification of locomotion activities was investigated, to check whether the presence of pre-processing significantly contributes to an increase in accuracy. The pre-processing stages evaluated in this study were inclination correction and de-noising. Level walking, step ascending, descending and running were monitored by using a shank-mounted inertial sensor. Raw and filtered segments, obtained from a modified version of a rule-based gait detection algorithm optimized for sequential processing, were processed to extract time and frequency-based features for physical activity classification through a support vector machine classifier. The proposed method accurately detected >99% gait cycles from raw data and produced >98% accuracy on these segmented gait cycles. Pre-processing did not substantially increase classification accuracy, thus highlighting the possibility of reducing the amount of pre-processing for real-time applications.
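
    The two pre-processing stages evaluated, de-noising and inclination correction, can be roughly sketched as below for a single-axis shank acceleration signal; the Butterworth low-pass filter, the 10 Hz cutoff and the synthetic signal are illustrative assumptions rather than the paper's actual settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 100.0  # assumed sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
# Synthetic tilted, noisy acceleration: gravity component seen through a 15
# degree sensor tilt plus a 1.5 Hz movement component and measurement noise.
acc = (9.81 * np.cos(np.deg2rad(15))
       + 2.0 * np.sin(2 * np.pi * 1.5 * t)
       + 0.3 * np.random.randn(t.size))

# De-noising: zero-phase low-pass Butterworth filter (illustrative 10 Hz cutoff).
b, a = butter(4, 10.0 / (fs / 2), btype="low")
acc_filt = filtfilt(b, a, acc)

# Inclination correction: estimate the static tilt from the mean gravity
# component and remove it, so segments reflect motion rather than sensor tilt.
tilt = np.arccos(np.clip(acc_filt.mean() / 9.81, -1.0, 1.0))
acc_corrected = acc_filt - 9.81 * np.cos(tilt)

print(f"estimated tilt: {np.degrees(tilt):.1f} deg")
```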

  11. Acquisition and preprocessing of LANDSAT data

    Science.gov (United States)

    Horn, T. N.; Brown, L. E.; Anonsen, W. H. (Principal Investigator)

    1979-01-01

    The original configuration of the GSFC data acquisition, preprocessing, and transmission subsystem, designed to provide LANDSAT data inputs to the LACIE system at JSC, is described. Enhancements made to support LANDSAT -2, and modifications for LANDSAT -3 are discussed. Registration performance throughout the 3 year period of LACIE operations satisfied the 1 pixel root-mean-square requirements established in 1974, with more than two of every three attempts at data registration proving successful, notwithstanding cosmetic faults or content inadequacies to which the process is inherently susceptible. The cloud/snow rejection rate experienced throughout the last 3 years has approached 50%, as expected in most LANDSAT data use situations.

  12. Approximate Distance Oracles with Improved Preprocessing Time

    CERN Document Server

    Wulff-Nilsen, Christian

    2011-01-01

    Given an undirected graph $G$ with $m$ edges, $n$ vertices, and non-negative edge weights, and given an integer $k\geq 1$, we show that for some universal constant $c$, a $(2k-1)$-approximate distance oracle for $G$ of size $O(kn^{1 + 1/k})$ can be constructed in $O(\sqrt{k}m + kn^{1 + c/\sqrt{k}})$ time and can answer queries in $O(k)$ time. We also give an oracle which is faster for smaller $k$. Our results break the quadratic preprocessing time bound of Baswana and Kavitha for all $k\geq 6$ and improve the $O(kmn^{1/k})$ time bound of Thorup and Zwick except for very sparse graphs and small $k$. When $m = \Omega(n^{1 + c/\sqrt{k}})$ and $k = O(1)$, our oracle is optimal w.r.t. stretch, size, preprocessing time, and query time, assuming a widely believed girth conjecture by Erd\H{o}s.

  13. An effective preprocessing method for finger vein recognition

    Science.gov (United States)

    Peng, JiaLiang; Li, Qiong; Wang, Ning; Abd El-Latif, Ahmed A.; Niu, Xiamu

    2013-07-01

    Image preprocessing plays an important role in a finger vein recognition system. However, previous preprocessing schemes still have weaknesses that must be resolved to achieve high finger vein recognition performance. In this paper, we propose a new finger vein preprocessing scheme that includes finger region localization, alignment, finger vein ROI segmentation, and enhancement. The experimental results show that the proposed scheme is capable of enhancing the quality of finger vein images effectively and reliably.
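
    A generic stand-in for the ROI segmentation and enhancement stages (not the scheme proposed in the paper) could use Otsu thresholding for finger-region localization and CLAHE for contrast enhancement, as in the OpenCV sketch below; the file name and parameter values are placeholders.

```python
import cv2
import numpy as np

# Load a grayscale finger vein image (path is a placeholder).
img = cv2.imread("finger_vein.png", cv2.IMREAD_GRAYSCALE)

# Finger region localization: Otsu threshold separates finger from background.
_, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# ROI segmentation: bounding box of the foreground pixels.
ys, xs = np.nonzero(mask)
roi = img[ys.min():ys.max() + 1, xs.min():xs.max() + 1]

# Enhancement: CLAHE boosts the local contrast of the vein pattern.
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
roi_enhanced = clahe.apply(roi)

cv2.imwrite("finger_vein_roi_enhanced.png", roi_enhanced)
```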

  14. User microprogrammable processors for high data rate telemetry preprocessing

    Science.gov (United States)

    Pugsley, J. H.; Ogrady, E. P.

    1973-01-01

    The use of microprogrammable processors for the preprocessing of high data rate satellite telemetry is investigated. The following topics are discussed along with supporting studies: (1) evaluation of commercial microprogrammable minicomputers for telemetry preprocessing tasks; (2) microinstruction sets for telemetry preprocessing; and (3) the use of multiple minicomputers to achieve high data processing. The simulation of small microprogrammed processors is discussed along with examples of microprogrammed processors.

  15. Preprocessing and Analysis of Digitized ECGs

    Science.gov (United States)

    Villalpando, L. E. Piña; Kurmyshev, E.; Ramírez, S. Luna; Leal, L. Delgado

    2008-08-01

    In this work we propose a methodology and programs in Matlab™ that perform the preprocessing and analysis of the D1 derivation (lead) of ECGs. The program corrects the isoelectric line for each beat, calculates the average cardiac frequency and its standard deviation, and generates a file with the amplitudes of the P, Q and T waves, as well as the important segments and intervals of each beat. The software normalizes the beats to a standard rate of 80 beats per minute; the superposition of beats is done by centering the R waves, before and after normalizing the amplitude of each beat. The data and graphics provide relevant information to the doctor for diagnosis. In addition, some results are displayed similar to those presented by a Holter recording.

  16. Flexibility and utility of pre-processing methods in converting STXM setups for ptychography - Final Paper

    Energy Technology Data Exchange (ETDEWEB)

    Fromm, Catherine [SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2015-08-20

    Ptychography is an advanced diffraction-based imaging technique that can achieve resolution of 5 nm and below. It is done by scanning a sample through a beam of focused x-rays using discrete yet overlapping scan steps. Scattering data are collected on a CCD camera, and the phase of the scattered light is reconstructed with sophisticated iterative algorithms. Because the experimental setup is similar, ptychography setups can be created by retrofitting existing STXM beam lines with new hardware. The other challenge comes in the reconstruction of the collected scattering images. Scattering data must be adjusted and packaged with experimental parameters to calibrate the reconstruction software. The necessary pre-processing of data prior to reconstruction is unique to each beamline setup, and even to the optical alignments used on that particular day. Pre-processing software must be developed to be flexible and efficient in order to allow experimenters appropriate control and freedom in the analysis of their hard-won data. This paper will describe the implementation of pre-processing software which successfully connects the data collection steps to the reconstruction steps, letting the user accomplish accurate and reliable ptychography.

  17. The Effect of Preprocessing on Arabic Document Categorization

    Directory of Open Access Journals (Sweden)

    Abdullah Ayedh

    2016-04-01

    Full Text Available Preprocessing is one of the main components in a conventional document categorization (DC framework. This paper aims to highlight the effect of preprocessing tasks on the efficiency of the Arabic DC system. In this study, three classification techniques are used, namely, naive Bayes (NB, k-nearest neighbor (KNN, and support vector machine (SVM. Experimental analysis on Arabic datasets reveals that preprocessing techniques have a significant impact on the classification accuracy, especially with complicated morphological structure of the Arabic language. Choosing appropriate combinations of preprocessing tasks provides significant improvement on the accuracy of document categorization depending on the feature size and classification techniques. Findings of this study show that the SVM technique has outperformed the KNN and NB techniques. The SVM technique achieved 96.74% micro-F1 value by using the combination of normalization and stemming as preprocessing tasks.
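
    A generic sketch of the kind of pipeline evaluated, light normalization followed by an SVM classifier, is shown below; the toy corpus, the minimal normalization function, and the TF-IDF/LinearSVC choices are illustrative assumptions, not the study's actual implementation (which also compared stemming, NB and KNN).

```python
import re
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

def normalize_arabic(text: str) -> str:
    """Very light normalization: drop diacritics, unify alef and teh-marbuta forms."""
    text = re.sub("[\u064B-\u065F]", "", text)              # diacritics
    text = re.sub("[\u0622\u0623\u0625]", "\u0627", text)    # alef variants -> alef
    return text.replace("\u0629", "\u0647")                  # teh marbuta -> heh

# Tiny placeholder corpus with two categories.
docs = ["مباراة كرة القدم اليوم", "أسعار النفط ترتفع",
        "الفريق فاز بالبطولة", "البورصة تسجل ارتفاعا"]
labels = ["sport", "economy", "sport", "economy"]

model = make_pipeline(TfidfVectorizer(preprocessor=normalize_arabic), LinearSVC())
model.fit(docs, labels)
print(model.predict(["هبوط أسعار النفط"]))  # should predict 'economy'
```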

  18. Spatial-spectral preprocessing for endmember extraction on GPU's

    Science.gov (United States)

    Jimenez, Luis I.; Plaza, Javier; Plaza, Antonio; Li, Jun

    2016-10-01

    Spectral unmixing is focused on the identification of spectrally pure signatures, called endmembers, and their corresponding abundances in each pixel of a hyperspectral image. Mainly focused on the spectral information contained in the hyperspectral images, endmember extraction techniques have recently included spatial information to achieve more accurate results. Several algorithms have been developed for automatic or semi-automatic identification of endmembers using spatial and spectral information, including the spectral-spatial endmember extraction (SSEE) where, within a preprocessing step in the technique, both sources of information are extracted from the hyperspectral image and equally used for this purpose. Previous works have implemented the SSEE technique in four main steps: 1) local eigenvectors calculation in each sub-region in which the original hyperspectral image is divided; 2) computation of the maxima and minima projection of all eigenvectors over the entire hyperspectral image in order to obtain a candidate pixel set; 3) expansion and averaging of the signatures of the candidate set; 4) ranking based on the spectral angle distance (SAD). The result of this method is a list of candidate signatures from which the endmembers can be extracted using various spectral-based techniques, such as orthogonal subspace projection (OSP), vertex component analysis (VCA) or N-FINDR. Considering the large volume of data and the complexity of the calculations, there is a need for efficient implementations. Latest-generation hardware accelerators such as commodity graphics processing units (GPUs) offer a good chance for improving the computational performance in this context. In this paper, we develop two different implementations of the SSEE algorithm using GPUs. Both are based on the eigenvectors computation within each sub-region of the first step, one using the singular value decomposition (SVD) and another one using principal component analysis (PCA). Based

  19. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available ... choose "Save target as…", save file in desired location. Firefox/Chrome: Right-click on the link, choose "Save link as…", save file in desired location. Safari: Right-click or command-click on the ...

  20. Feature detection techniques for preprocessing proteomic data.

    Science.gov (United States)

    Sellers, Kimberly F; Miecznikowski, Jeffrey C

    2010-01-01

    Numerous gel-based and nongel-based technologies are used to detect protein changes potentially associated with disease. The raw data, however, are abundant with technical and structural complexities, making statistical analysis a difficult task. Low-level analysis issues (including normalization, background correction, gel and/or spectral alignment, feature detection, and image registration) are substantial problems that need to be addressed, because any large-level data analyses are contingent on appropriate and statistically sound low-level procedures. Feature detection approaches are particularly interesting due to the increased computational speed associated with subsequent calculations. Such summary data corresponding to image features provide a significant reduction in overall data size and structure while retaining key information. In this paper, we focus on recent advances in feature detection as a tool for preprocessing proteomic data. This work highlights existing and newly developed feature detection algorithms for proteomic datasets, particularly relating to time-of-flight mass spectrometry, and two-dimensional gel electrophoresis. Note, however, that the associated data structures (i.e., spectral data, and images containing spots) used as input for these methods are obtained via all gel-based and nongel-based methods discussed in this manuscript, and thus the discussed methods are likewise applicable.

  1. Credit and savings: stepping stones from poverty.

    Science.gov (United States)

    2000-07-01

    This paper presents an interview with Mariame Dem, Zonal Program Manager for West Africa, concerning Oxfam's contribution to financial assets of women in helping them to achieve sustainable livelihoods. It was known that credit could improve the status of women and bargaining position within their household by giving them their own money. Dem said that the program has given grants for revolving credit funds, built women's literacy skills, and has kept them in touch with credit management organizations. Moreover, she said that Oxfam has invested in many micro-credit schemes for the empowerment of women. Their partners have set up a learning group and plan to train and network in the coming years.

  2. Multimodal image fusion with SIMS: Preprocessing with image registration.

    Science.gov (United States)

    Tarolli, Jay Gage; Bloom, Anna; Winograd, Nicholas

    2016-06-14

    In order to utilize complementary imaging techniques to supply higher resolution data for fusion with secondary ion mass spectrometry (SIMS) chemical images, there are a number of aspects that, if not given proper consideration, could produce results which are easy to misinterpret. One of the most critical aspects is that the two input images must be of the exact same analysis area. With the desire to explore new higher resolution data sources that exist outside of the mass spectrometer, this requirement becomes even more important. To ensure that two input images are of the same region, an implementation of the Insight Segmentation and Registration Toolkit (ITK) was developed to act as a preprocessing step before performing image fusion. This implementation of ITK allows for several degrees of movement between two input images to be accounted for, including translation, rotation, and scale transforms. First, the implementation was confirmed to accurately register two multimodal images by supplying a known transform. Once validated, two model systems, a copper mesh grid and a group of RAW 264.7 cells, were used to demonstrate the use of the ITK implementation to register a SIMS image with a microscopy image for the purpose of performing image fusion.
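
    For the registration step described, accounting for translation, rotation and scale between a SIMS image and a microscopy image, a minimal sketch using SimpleITK (a Python wrapper around ITK) is given below; the file names, metric and optimizer settings are illustrative assumptions, and the original work used its own ITK implementation rather than this exact code.

```python
import SimpleITK as sitk

# Fixed image: SIMS chemical image; moving image: higher-resolution microscopy
# image of (nominally) the same field of view.  Paths are placeholders.
fixed = sitk.ReadImage("sims_image.tif", sitk.sitkFloat32)
moving = sitk.ReadImage("microscopy_image.tif", sitk.sitkFloat32)

# A similarity transform covers translation, rotation and isotropic scale.
initial = sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Similarity2DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)  # multimodal metric
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetInitialTransform(initial, inPlace=False)
reg.SetInterpolator(sitk.sitkLinear)

transform = reg.Execute(fixed, moving)
registered = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
sitk.WriteImage(registered, "microscopy_registered.tif")
```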

  3. Nonlinear preprocessing method for detecting peaks from gas chromatograms

    Directory of Open Access Journals (Sweden)

    Min Hyeyoung

    2009-11-01

    Full Text Available Abstract Background The problem of locating valid peaks from data corrupted by noise frequently arises while analyzing experimental data. In various biological and chemical data analysis tasks, peak detection thus constitutes a critical preprocessing step that greatly affects downstream analysis and the eventual quality of experiments. Many existing techniques require the users to adjust parameters by trial and error, which is error-prone, time-consuming and often leads to incorrect analysis results. Worse, conventional approaches tend to report an excessive number of false alarms by finding fictitious peaks generated by mere noise. Results We have designed a novel peak detection method that can significantly reduce parameter sensitivity, yet provide excellent peak detection performance and negligible false alarm rates for gas chromatographic data. The key feature of our new algorithm is the successive use of peak enhancement algorithms that are deliberately designed for a gradual improvement of peak detection quality. We tested our approach with real gas chromatograms as well as intentionally contaminated spectra that contain Gaussian or speckle-type noise. Conclusion Our results demonstrate that the proposed method can achieve near perfect peak detection performance while maintaining very small false alarm probabilities in the case of gas chromatograms. Given the fact that biological signals appear in the form of peaks in various experimental data and that the proposed method can easily be extended to such data, our approach will be a useful and robust tool that can help researchers highlight valid signals in their noisy measurements.
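
    The general idea of enhancing a chromatogram before picking peaks can be sketched as smoothing followed by prominence-based peak detection, as below; this is a generic stand-in using SciPy, not the authors' successive peak-enhancement algorithms, and the synthetic chromatogram and thresholds are placeholders.

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

# Synthetic chromatogram: three Gaussian peaks plus noise.
t = np.linspace(0, 30, 3000)
signal = sum(a * np.exp(-((t - c) ** 2) / (2 * w ** 2))
             for a, c, w in [(1.0, 5, 0.15), (0.4, 12, 0.2), (0.7, 21, 0.1)])
noisy = signal + 0.05 * np.random.randn(t.size)

# Peak enhancement step: Savitzky-Golay smoothing suppresses noise
# while largely preserving peak shape.
smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

# Peak detection with a prominence threshold to limit false alarms.
peaks, props = find_peaks(smoothed, prominence=0.2)
print("detected retention times:", np.round(t[peaks], 2))
```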

  4. Enhanced bone structural analysis through pQCT image preprocessing.

    Science.gov (United States)

    Cervinka, T; Hyttinen, J; Sievanen, H

    2010-05-01

    Several factors, including preprocessing of the image, can affect the reliability of pQCT-measured bone traits, such as cortical area and trabecular density. Using repeated scans of four different liquid phantoms and repeated in vivo scans of distal tibiae from 25 subjects, the performance of two novel preprocessing methods, based on the down-sampling of grayscale intensity histogram and the statistical approximation of image data, was compared to 3 x 3 and 5 x 5 median filtering. According to phantom measurements, the signal to noise ratio in the raw pQCT images (XCT 3000) was low (approximately 20 dB), which posed a challenge for preprocessing. Concerning the cortical analysis, the reliability coefficient (R) was 67% for the raw image and increased to 94-97% after preprocessing without apparent preference for any method. Concerning the trabecular density, the R-values were already high (approximately 99%) in the raw images, leaving virtually no room for improvement. However, some coarse structural patterns could be seen in the preprocessed images in contrast to a disperse distribution of density levels in the raw image. In conclusion, preprocessing cannot suppress the high noise level to the extent that the analysis of mean trabecular density is essentially improved, whereas preprocessing can enhance cortical bone analysis and also facilitate coarse structural analyses of the trabecular region.

  5. A Study on Pre-processing Algorithms for Metal Parts Inspection

    Directory of Open Access Journals (Sweden)

    Haider Sh. Hashim

    2011-06-01

    Full Text Available Pre-processing is very useful in a variety of situations since it helps to suppress information that is not related to the exact image processing or analysis task. Mathematical morphology is used for analysis, understanding and image processing. It is an influential method for geometric morphological analysis and image understanding, and it has become an established theory in the digital image processing domain. Edge detection and noise reduction are crucial and very important pre-processing steps. The classical edge detection and filtering methods are less accurate in detecting complex edges and filtering various types of noise. This paper proposes some useful mathematical morphology techniques to detect edges and to filter noise in metal part images. The experimental results showed that the proposed algorithm helps to increase the accuracy of the metal parts inspection system.
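
    One standard example of morphology-based noise filtering and edge detection (a generic sketch, not the specific algorithm proposed in the paper) is an opening/closing pass for noise suppression followed by a morphological gradient for edges, as below with OpenCV; the synthetic test image and structuring-element size are placeholders.

```python
import cv2
import numpy as np

# Synthetic stand-in for a metal part image: bright rectangular part on a dark
# background, corrupted with salt-and-pepper noise.
img = np.zeros((200, 300), dtype=np.uint8)
cv2.rectangle(img, (80, 60), (220, 150), 180, thickness=-1)
noise = np.random.default_rng(0).random(img.shape)
img[noise < 0.02] = 255
img[noise > 0.98] = 0

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))

# Noise filtering: opening removes bright speckles, closing fills dark pits.
denoised = cv2.morphologyEx(img, cv2.MORPH_OPEN, kernel)
denoised = cv2.morphologyEx(denoised, cv2.MORPH_CLOSE, kernel)

# Edge detection: morphological gradient = dilation minus erosion.
edges = cv2.morphologyEx(denoised, cv2.MORPH_GRADIENT, kernel)
cv2.imwrite("metal_part_edges.png", edges)
```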

  6. Effective automated prediction of vertebral column pathologies based on logistic model tree with SMOTE preprocessing.

    Science.gov (United States)

    Karabulut, Esra Mahsereci; Ibrikci, Turgay

    2014-05-01

    This study develops a logistic model tree based automated system for accurate recognition of types of vertebral column pathologies. Six biomechanical measures are used for this purpose: pelvic incidence, pelvic tilt, lumbar lordosis angle, sacral slope, pelvic radius and grade of spondylolisthesis. A two-phase classification model is employed in which the first step is preprocessing the data by use of the Synthetic Minority Over-sampling Technique (SMOTE), and the second one is feeding the classifier Logistic Model Tree (LMT) with the preprocessed data. We have achieved an accuracy of 89.73% and an Area Under Curve (AUC) of 0.964 in computer-based automatic detection of the pathology. This was validated via a 10-fold cross-validation experiment conducted on clinical records of 310 patients. The study also presents a comparative analysis of the vertebral column data with the use of several machine learning algorithms.
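
    The two-phase scheme, SMOTE oversampling followed by a classifier, can be sketched with the imbalanced-learn and scikit-learn packages as below; since scikit-learn does not ship a Logistic Model Tree, a plain decision tree is used purely as a stand-in, and the synthetic data replaces the clinical biomechanical measures.

```python
from imblearn.over_sampling import SMOTE
from imblearn.pipeline import Pipeline
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic imbalanced data standing in for 310 patient records
# with six biomechanical features.
X, y = make_classification(n_samples=310, n_features=6, weights=[0.85, 0.15],
                           random_state=0)

# Phase 1: SMOTE balances the minority class (applied only on training folds
# thanks to the imblearn pipeline); phase 2: a tree classifier as a stand-in.
pipe = Pipeline([("smote", SMOTE(random_state=0)),
                 ("clf", DecisionTreeClassifier(max_depth=4, random_state=0))])

# 10-fold cross-validation, mirroring the validation scheme in the study.
print(cross_val_score(pipe, X, y, cv=10, scoring="roc_auc").mean())
```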

  7. The Combined Effect of Filters in ECG Signals for Pre-Processing

    Directory of Open Access Journals (Sweden)

    Isha V. Upganlawar

    2014-05-01

    Full Text Available The ECG signal is continuous but abruptly changing in nature. Heart diseases such as paroxysmal arrhythmia, whose diagnosis feeds intelligent health care decisions, require the ECG signal to be pre-processed accurately before further steps such as feature extraction, wavelet decomposition, detection of the QRS complexes in ECG recordings and of related information such as heart rate and RR interval, and classification of the signal with various classifiers. Filters play a very important role in analyzing the low-frequency components of the ECG signal. Because biomedical signals are of low frequency, the removal of power line interference and baseline wander is a very important step at the pre-processing stage of ECG. In this paper we study median filtering and FIR (Finite Impulse Response) filtering of ECG signals under noisy conditions.
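
    A rough sketch of the two filters studied, applied to a synthetic noisy ECG-like signal, is given below; the sampling rate, filter lengths and cutoff frequency are illustrative assumptions rather than the paper's settings.

```python
import numpy as np
from scipy.signal import medfilt, firwin, filtfilt

fs = 360.0  # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63          # crude periodic "QRS-like" spikes
noisy = ecg + 0.2 * np.sin(2 * np.pi * 50 * t)   # power line interference
noisy += 0.3 * np.sin(2 * np.pi * 0.3 * t)       # baseline wander

# Median filtering: a wide median estimates the baseline wander, which is subtracted.
baseline = medfilt(noisy, kernel_size=251)
detrended = noisy - baseline

# FIR low-pass filter (40 Hz cutoff) attenuates the 50 Hz power line component.
taps = firwin(numtaps=101, cutoff=40.0, fs=fs)
clean = filtfilt(taps, [1.0], detrended)
print(clean.shape)
```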

  8. Prognosis classification in glioblastoma multiforme using multimodal MRI derived heterogeneity textural features: impact of pre-processing choices

    Science.gov (United States)

    Upadhaya, Taman; Morvan, Yannick; Stindel, Eric; Le Reste, Pierre-Jean; Hatt, Mathieu

    2016-03-01

    Heterogeneity image-derived features of Glioblastoma multiforme (GBM) tumors from multimodal MRI sequences may provide higher prognostic value than standard parameters used in routine clinical practice. We previously developed a framework for automatic extraction and combination of image-derived features (also called "Radiomics") through support vector machines (SVM) for predictive model building. The results we obtained in a cohort of 40 GBM suggested these features could be used to identify patients with poorer outcome. However, extraction of these features is a delicate multi-step process and their values may therefore depend on the pre-processing of images. The original developed workflow included skull removal, bias homogeneity correction, and multimodal tumor segmentation, followed by textural features computation, and lastly ranking, selection and combination through a SVM-based classifier. The goal of the present work was to specifically investigate the potential benefit and respective impact of the addition of several MRI pre-processing steps (spatial resampling for isotropic voxels, intensities quantization and normalization) before textural features computation, on the resulting accuracy of the classifier. Eighteen patients datasets were also added for the present work (58 patients in total). A classification accuracy of 83% (sensitivity 79%, specificity 85%) was obtained using the original framework. The addition of the new pre-processing steps increased it to 93% (sensitivity 93%, specificity 93%) in identifying patients with poorer survival (below the median of 12 months). Among the three considered pre-processing steps, spatial resampling was found to have the most important impact. This shows the crucial importance of investigating appropriate image pre-processing steps to be used for methodologies based on textural features extraction in medical imaging.

  9. Performance evaluation of preprocessing techniques utilizing expert information in multivariate calibration.

    Science.gov (United States)

    Sharma, Sandeep; Goodarzi, Mohammad; Ramon, Herman; Saeys, Wouter

    2014-04-01

    Partial Least Squares (PLS) regression is one of the most used methods for extracting chemical information from Near Infrared (NIR) spectroscopic measurements. The success of a PLS calibration relies largely on the representativeness of the calibration data set. This is not trivial, because not only the expected variation in the analyte of interest, but also the variation of other contributing factors (interferents) should be included in the calibration data. This also implies that changes in interferent concentrations not covered in the calibration step can deteriorate the prediction ability of the calibration model. Several researchers have suggested that PLS models can be robustified against changes in the interferent structure by incorporating expert knowledge in the preprocessing step with the aim to efficiently filter out the spectral influence of the spectral interferents. However, these methods have not yet been compared against each other. Therefore, in the present study, various preprocessing techniques exploiting expert knowledge were compared on two experimental data sets. In both data sets, the calibration and test set were designed to have a different interferent concentration range. The performance of these techniques was compared to that of preprocessing techniques which do not use any expert knowledge. Using expert knowledge was found to improve the prediction performance for both data sets. For data set-1, the prediction error improved nearly 32% when pure component spectra of the analyte and the interferents were used in the Extended Multiplicative Signal Correction framework. Similarly, for data set-2, nearly 63% improvement in the prediction error was observed when the interferent information was utilized in Spectral Interferent Subtraction preprocessing.
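
    The core idea of Extended Multiplicative Signal Correction with known pure-component spectra can be sketched as a least-squares fit of the measured spectrum against the analyte reference, the interferent spectra and baseline terms; the snippet below is only a minimal illustration with synthetic spectra, not the preprocessing code used in the study.

```python
import numpy as np

wavelengths = np.linspace(1000, 2500, 300)

# Synthetic "pure component" spectra: analyte reference and one interferent.
analyte = np.exp(-((wavelengths - 1450) ** 2) / 5e3)
interferent = np.exp(-((wavelengths - 1900) ** 2) / 8e3)

# A measured spectrum: scaled analyte + interferent + sloped baseline.
measured = 0.8 * analyte + 0.5 * interferent + 0.002 * (wavelengths - 1000) + 0.05

# EMSC design matrix: reference spectrum, interferent spectrum,
# constant and linear baseline terms.
D = np.column_stack([analyte, interferent,
                     np.ones_like(wavelengths), wavelengths - wavelengths.mean()])
coef, *_ = np.linalg.lstsq(D, measured, rcond=None)
b, c_int, c0, c1 = coef

# Corrected spectrum: remove the interferent and baseline, rescale to the reference.
corrected = (measured - c_int * interferent
             - c0 - c1 * (wavelengths - wavelengths.mean())) / b
print(float(np.max(np.abs(corrected - analyte))))  # essentially zero here
```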

  10. On image pre-processing for PIV of single- and two-phase flows over reflecting objects

    Energy Technology Data Exchange (ETDEWEB)

    Deen, Niels G.; Willems, Paul; Sint Annaland, Martin van; Kuipers, J.A.M.; Lammertink, Rob G.H.; Kemperman, Antoine J.B.; Wessling, Matthias; Meer, Walter G.J. van der [University of Twente, Faculty of Science and Technology, Institute of Mechanics, Processes and Control Twente (IMPACT), Enschede (Netherlands)

    2010-08-15

    A novel image pre-processing scheme for PIV of single- and two-phase flows over reflecting objects which does not require the use of additional hardware is discussed. The approach for single-phase flow consists of image normalization and intensity stretching followed by background subtraction. For two-phase flow, an additional masking step is added after the background subtraction. The effectiveness of the pre-processing scheme is shown for two examples: PIV of single-phase flow in spacer-filled channels and two-phase flow in these channels. The pre-processing scheme increased the displacement peak detectability significantly and produced high quality vector fields, without the use of additional hardware. (orig.)

  11. An adaptive preprocessing algorithm for low bitrate video coding

    Institute of Scientific and Technical Information of China (English)

    LI Mao-quan; XU Zheng-quan

    2006-01-01

    At low bitrate, all block discrete cosine transform (BDCT) based video coding algorithms suffer from visible blocking and ringing artifacts in the reconstructed images because the quantization is too coarse and high frequency DCT coefficients are inclined to be quantized to zeros. Preprocessing algorithms can enhance coding efficiency and thus reduce the likelihood of blocking and ringing artifacts generated in the video coding process by applying a low-pass filter before video encoding to remove some relatively insignificant high-frequency components. In this paper, we introduce a new adaptive preprocessing algorithm, which employs an improved bilateral filter to provide adaptive edge-preserving low-pass filtering that is adjusted according to the quantization parameters. Whether at low or high bit rate, the preprocessing can provide proper filtering to make the video encoder more efficient and produce better reconstructed image quality. Experimental results demonstrate that our proposed preprocessing algorithm can significantly improve both subjective and objective quality.

  12. Preprocessing Algorithm for Deciphering Historical Inscriptions Using String Metric

    Directory of Open Access Journals (Sweden)

    Lorand Lehel Toth

    2016-07-01

    Full Text Available The article presents the improvements in the preprocessing part of the deciphering method (shortly, the preprocessing algorithm) for historical inscriptions of unknown origin. Glyphs used in historical inscriptions changed through time; therefore, various versions of the same script may contain different glyphs for each grapheme. The purpose of the preprocessing algorithm is to reduce the running time of the deciphering process by filtering out the less probable interpretations of the examined inscription. However, the first version of the preprocessing algorithm leads to an incorrect outcome or no result at all in certain cases. Therefore, an improved version was developed to find the most similar words in the dictionary by formulating the search conditions more accurately, while still being computationally efficient. Moreover, a sophisticated similarity metric used to determine the possible meaning of the unknown inscription is introduced. The results of the evaluations are also detailed.
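
    The filtering idea, keeping only the dictionary words that are sufficiently similar to a candidate reading under a string metric, can be sketched with a normalized Levenshtein distance as below; the distance function and the threshold are illustrative stand-ins for the article's more sophisticated similarity metric.

```python
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                # deletion
                           cur[j - 1] + 1,             # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

def filter_candidates(reading: str, dictionary: list[str], max_dist: float = 0.34):
    """Keep dictionary words whose normalized edit distance is small enough."""
    keep = []
    for word in dictionary:
        d = levenshtein(reading, word) / max(len(reading), len(word))
        if d <= max_dist:
            keep.append((word, d))
    return sorted(keep, key=lambda x: x[1])

# Placeholder reading and dictionary, purely for illustration.
print(filter_candidates("tengri", ["tengri", "tenger", "kagan", "bitig"]))
```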

  13. REMINDER: Saved Leave Scheme (SLS)

    CERN Multimedia

    2003-01-01

    Transfer of leave to saved leave accounts Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days'* annual and compensatory leave (excluding saved leave accumulated in accordance with the provisions of Administrative Circular No 22B) can be transferred to the saved leave account at the end of the leave year (30 September). We remind you that unused leave of all those taking part in the saved leave scheme at the closure of the leave year accounts is transferred automatically to the saved leave account on that date. Therefore, staff members have no administrative steps to take. In addition, the transfer, which eliminates the risk of omitting to request leave transfers and rules out calculation errors in transfer requests, will be clearly shown in the list of leave transactions that can be consulted in EDH from October 2003 onwards. Furthermore, this automatic leave transfer optimizes staff members' chances of benefiting from a saved leave bonus provided that they ar...

  14. Preprocessing for classification of thermograms in breast cancer detection

    Science.gov (United States)

    Neumann, Łukasz; Nowak, Robert M.; Okuniewski, Rafał; Oleszkiewicz, Witold; Cichosz, Paweł; Jagodziński, Dariusz; Matysiewicz, Mateusz

    2016-09-01

    Performance of binary classification of breast cancer suffers from high imbalance between classes. In this article we present a preprocessing module designed to negate the discrepancy in training examples. The preprocessing module is based on standardization, the Synthetic Minority Oversampling Technique and undersampling. We show how each algorithm influences classification accuracy. Results indicate that the described module improves the overall Area Under Curve by up to 10% on the tested dataset. Furthermore, we propose other methods of dealing with imbalanced datasets in breast cancer classification.

  15. A data preprocessing strategy for metabolomics to reduce the mask effect in data analysis.

    Science.gov (United States)

    Yang, Jun; Zhao, Xinjie; Lu, Xin; Lin, Xiaohui; Xu, Guowang

    2015-01-01

    Highlights: (1) We developed a data preprocessing strategy to cope with missing values and mask effects in data analysis caused by the high variation of abundant metabolites. (2) A new method, 'x-VAST', was developed to amend the measurement deviation enlargement. (3) Applying the above strategy, several low-abundance masked differential metabolites were rescued. Metabolomics is a booming research field. Its success highly relies on the discovery of differential metabolites by comparing different data sets (for example, patients vs. controls). One of the challenges is that differences of the low-abundance metabolites between groups are often masked by the high variation of abundant metabolites. In order to address this challenge, a novel data preprocessing strategy consisting of three steps was proposed in this study. In step 1, a 'modified 80%' rule was used to reduce the effect of missing values; in step 2, unit-variance and Pareto scaling methods were used to reduce the mask effect from the abundant metabolites; in step 3, in order to fix the adverse effect of scaling, stability information of the variables, deduced from the intensity information and the class information, was used to assign suitable weights to the variables. When applied to an LC/MS-based metabolomics dataset from a chronic hepatitis B patient study and two simulated datasets, the mask effect was found to be partially eliminated and several new low-abundance differential metabolites were rescued.
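
    Steps 1 and 2 of the strategy can be roughly sketched as below; the 'modified 80%' rule is implemented here as 'keep a variable detected in at least 80% of the samples of at least one group', which is one common reading of that rule and may differ in detail from the authors' version, and the half-minimum imputation and synthetic data are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic peak table: 20 samples (10 per group) x 50 metabolites, NaN = missing.
X = rng.lognormal(mean=2.0, sigma=1.0, size=(20, 50))
X[rng.random(X.shape) < 0.3] = np.nan
groups = np.array([0] * 10 + [1] * 10)

# Step 1: modified 80% rule -- keep variables present in >=80% of the samples
# of at least one group.
keep = np.zeros(X.shape[1], dtype=bool)
for g in (0, 1):
    present = np.mean(~np.isnan(X[groups == g]), axis=0)
    keep |= present >= 0.8
X = X[:, keep]

# Remaining missing values replaced by half of the minimum detected value.
X = np.where(np.isnan(X), np.nanmin(X) / 2, X)

# Step 2: Pareto scaling -- centre, then divide by the square root of the
# standard deviation, which shrinks the dominance of abundant metabolites.
X_pareto = (X - X.mean(axis=0)) / np.sqrt(X.std(axis=0, ddof=1))
print(X_pareto.shape)
```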

  16. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio

    2013-02-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks direct the learning of models with several interrelated variables to be forecasted simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the posterior learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with a MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. The rate of correctly forecasting three fish species (anchovy, sardine and hake) simultaneously is doubled (from 17.3% to 29.5%) using the multi-dimensional approach, in comparison to mono-species models. The probability assessments also show a large improvement, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these differences are superior to the forecasting of species by pairs. © 2012 Elsevier Ltd.

  17. Data pre-processing for web log mining: Case study of commercial bank website usage analysis

    Directory of Open Access Journals (Sweden)

    Jozef Kapusta

    2013-01-01

    Full Text Available We use data cleaning, integration, reduction and data conversion methods in the pre-processing level of data analysis. Data processing techniques improve the overall quality of the patterns mined. The paper describes the use of standard pre-processing methods for preparing data of a commercial bank website, in the form of the log file obtained from the web server. Data cleaning, as the simplest step of data pre-processing, is non-trivial here, as the analysed content is highly specific. We had to deal with the problem of frequent changes of the content and even frequent changes of the structure. Regular changes in the structure make the use of the sitemap impossible. We present approaches for dealing with this problem: we were able to create the sitemap dynamically, based solely on the content of the log file. In this case study, we also examined just one part of the website instead of performing the standard analysis of an entire website, as we did not have access to all log files for security reasons. As a result, the traditional practices had to be adapted for this special case. Analysing just a small fraction of the website resulted in short session times for regular visitors. We were not able to use the recommended methods to determine the optimal value of the session time. Therefore, in this paper we propose new methods based on outlier identification for raising the accuracy of the session length.
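
    A bare-bones sketch of the session identification step, splitting each visitor's requests into sessions whenever the gap between consecutive requests exceeds a timeout, is shown below; the toy log records, the 30-minute default, and the absence of the outlier-based adjustment proposed in the paper are all simplifications.

```python
from datetime import datetime, timedelta
from collections import defaultdict

# Minimal parsed log: (client IP, timestamp, requested URL).
records = [
    ("10.0.0.1", "2013-03-01 10:00:05", "/index.html"),
    ("10.0.0.1", "2013-03-01 10:01:10", "/rates.html"),
    ("10.0.0.1", "2013-03-01 11:30:00", "/index.html"),   # new session (gap > timeout)
    ("10.0.0.2", "2013-03-01 10:00:30", "/contact.html"),
]

def sessionize(records, timeout=timedelta(minutes=30)):
    """Group requests per visitor into sessions separated by idle gaps."""
    by_user = defaultdict(list)
    for ip, ts, url in records:
        by_user[ip].append((datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"), url))
    sessions = []
    for ip, hits in by_user.items():
        hits.sort()
        current = [hits[0]]
        for prev, cur in zip(hits, hits[1:]):
            if cur[0] - prev[0] > timeout:
                sessions.append((ip, current))
                current = []
            current.append(cur)
        sessions.append((ip, current))
    return sessions

for ip, hits in sessionize(records):
    print(ip, [url for _, url in hits])
```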

  18. Net savings

    Energy Technology Data Exchange (ETDEWEB)

    Roche, P.

    2001-02-01

    The state of e-commerce in the Canadian upstream oil and natural gas sector is examined in an effort to discover the extent to which the .com economy has penetrated the marketplace. The overall assessment is that although the situation varies from producer to producer and process to process, a bustling digital marketplace in the Canadian oil business has yet to emerge. Nevertheless, there are several examples of companies using e-business tools to minimize technology staffing and to eliminate wasteful practices. Initiatives cited include streamlining of supply chains to cut handling costs, using application service providers to trim information technology budgets, and adopting electronic joint interest billing to save on printing, postage and re-entering data. Most notable efforts have been made by companies such as BXL Energy Limited and Genesis Exploration Limited, both of which are boosting efficiency on the inside by contracting out data storage and software applications. For example, BXL has replaced its microfilm log library occupying six cabinets, and totalling about 9,000 lbs., by a fibre optic line. All applications can now be run from a laptop which weighs three to four pounds. In a similar vein, Genesis Exploration started using application service providers (ASPs) to avoid the cost and hassle of buying and maintaining major software applications in-house. By accessing the ASPs, Genesis staff can run software without buying or installing it on their own computers. In yet another example of cutting information technology costs, Pengrowth Corporation has its network administration done remotely over the Internet by Northwest Digital Systems (NWD). As far as the industry at large is concerned, the answer appears to be in a digital marketplace specifically tailored to the upstream sector's unique profile. As a start, a study is underway by Deloitte Consulting to explore producer interest in joining or founding an upstream digital marketplace. The study

  19. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available ... children, parents, and public health professionals. More > Hand Hygiene Saves Lives (5:10) Recommend on Facebook Tweet Share Compartir Hand Hygiene Saves Lives Hand Hygiene Saves Lives Transcript [28 KB, 2 pages] High ...

  20. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available ... including, children, parents, and public health professionals. More > Hand Hygiene Saves Lives (5:10) Recommend on Facebook Tweet Share Compartir Hand Hygiene Saves Lives Hand Hygiene Saves Lives Transcript [28 KB, 2 pages] High ...

  1. Hand Hygiene Saves Lives

    Science.gov (United States)

    ... including, children, parents, and public health professionals. More > Hand Hygiene Saves Lives (5:10) Recommend on Facebook Tweet Share Compartir Hand Hygiene Saves Lives Hand Hygiene Saves Lives Transcript [28 KB, 2 pages] High ...

  2. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available ... TV Contact and Feedback Form Download Instructions Explorer: Right-click on the link, choose "Save target as…", save file in desired location. Firefox/Chrome: Right-click on the link, choose "Save link as…", ...

  3. Automated Pre-processing for NMR Assignments with Reduced Tedium

    Energy Technology Data Exchange (ETDEWEB)

    2004-05-11

    An important rate-limiting step in the resonance assignment process is accurate identification of resonance peaks in NMR spectra. NMR spectra are noisy. Hence, automatic peak-picking programs must navigate between the Scylla of reliable but incomplete picking, and the Charybdis of noisy but complete picking. Each of these extremes complicates the assignment process: incomplete peak-picking results in the loss of essential connectivities, while noisy picking conceals the true connectivities under a combinatorial explosion of false positives. Intermediate processing can simplify the assignment process by preferentially removing false peaks from noisy peak lists. This is accomplished by requiring consensus between multiple NMR experiments, exploiting a priori information about NMR spectra, and drawing on empirical statistical distributions of chemical shift extracted from the BioMagResBank. Experienced NMR practitioners currently apply many of these techniques "by hand", which is tedious, and may appear arbitrary to the novice. To increase efficiency, we have created a systematic and automated approach to this process, known as APART. Automated pre-processing has three main advantages: reduced tedium, standardization, and pedagogy. In the hands of experienced spectroscopists, the main advantage is reduced tedium (a rapid increase in the ratio of true peaks to false peaks with minimal effort). When a project is passed from hand to hand, the main advantage is standardization. APART automatically documents the peak filtering process by archiving its original recommendations, the accompanying justifications, and whether a user accepted or overrode a given filtering recommendation. In the hands of a novice, this tool can reduce the stumbling block of learning to differentiate between real peaks and noise, by providing real-time examples of how such decisions are made.

  4. Optimizing preprocessing and analysis pipelines for single-subject fMRI: 2. Interactions with ICA, PCA, task contrast and inter-subject heterogeneity.

    Science.gov (United States)

    Churchill, Nathan W; Yourganov, Grigori; Oder, Anita; Tam, Fred; Graham, Simon J; Strother, Stephen C

    2012-01-01

    A variety of preprocessing techniques are available to correct subject-dependant artifacts in fMRI, caused by head motion and physiological noise. Although it has been established that the chosen preprocessing steps (or "pipeline") may significantly affect fMRI results, it is not well understood how preprocessing choices interact with other parts of the fMRI experimental design. In this study, we examine how two experimental factors interact with preprocessing: between-subject heterogeneity, and strength of task contrast. Two levels of cognitive contrast were examined in an fMRI adaptation of the Trail-Making Test, with data from young, healthy adults. The importance of standard preprocessing with motion correction, physiological noise correction, motion parameter regression and temporal detrending were examined for the two task contrasts. We also tested subspace estimation using Principal Component Analysis (PCA), and Independent Component Analysis (ICA). Results were obtained for Penalized Discriminant Analysis, and model performance quantified with reproducibility (R) and prediction metrics (P). Simulation methods were also used to test for potential biases from individual-subject optimization. Our results demonstrate that (1) individual pipeline optimization is not significantly more biased than fixed preprocessing. In addition, (2) when applying a fixed pipeline across all subjects, the task contrast significantly affects pipeline performance; in particular, the effects of PCA and ICA models vary with contrast, and are not by themselves optimal preprocessing steps. Also, (3) selecting the optimal pipeline for each subject improves within-subject (P,R) and between-subject overlap, with the weaker cognitive contrast being more sensitive to pipeline optimization. These results demonstrate that sensitivity of fMRI results is influenced not only by preprocessing choices, but also by interactions with other experimental design factors. This paper outlines a

  5. Preprocessing and Quality Control Strategies for Illumina DASL Assay-Based Brain Gene Expression Studies with Semi-Degraded Samples.

    Science.gov (United States)

    Chow, Maggie L; Winn, Mary E; Li, Hai-Ri; April, Craig; Wynshaw-Boris, Anthony; Fan, Jian-Bing; Fu, Xiang-Dong; Courchesne, Eric; Schork, Nicholas J

    2012-01-01

    Available statistical preprocessing or quality control analysis tools for gene expression microarray datasets are known to greatly affect downstream data analysis, especially when degraded samples, unique tissue samples, or novel expression assays are used. It is therefore important to assess the validity and impact of the assumptions built in to preprocessing schemes for a dataset. We developed and assessed a data preprocessing strategy for use with the Illumina DASL-based gene expression assay with partially degraded postmortem prefrontal cortex samples. The samples were obtained from individuals with autism as part of an investigation of the pathogenic factors contributing to autism. Using statistical analysis methods and metrics such as those associated with multivariate distance matrix regression and mean inter-array correlation, we developed a DASL-based assay gene expression preprocessing pipeline to accommodate and detect problems with microarray-based gene expression values obtained with degraded brain samples. Key steps in the pipeline included outlier exclusion, data transformation and normalization, and batch effect and covariate corrections. Our goal was to produce a clean dataset for subsequent downstream differential expression analysis. We ultimately settled on available transformation and normalization algorithms in the R/Bioconductor package lumi based on an assessment of their use in various combinations. A log2-transformed, quantile-normalized, and batch and seizure-corrected procedure was likely the most appropriate for our data. We empirically tested different components of our proposed preprocessing strategy and believe that our results suggest that a preprocessing strategy that effectively identifies outliers, normalizes the data, and corrects for batch effects can be applied to all studies, even those pursued with degraded samples.
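
    The study settled on a log2 transform followed by quantile normalization via the R/Bioconductor package lumi; as a rough language-agnostic illustration of what quantile normalization does, a small NumPy sketch is given below. This is not the lumi implementation (and it ignores ties), just the basic algorithm on a toy matrix.

```python
import numpy as np

def quantile_normalize(X: np.ndarray) -> np.ndarray:
    """Force every column (sample) of X to share the same distribution."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)   # rank of each value per column
    mean_sorted = np.sort(X, axis=0).mean(axis=1)       # mean of each rank across samples
    return mean_sorted[ranks]

# Toy expression matrix: 5 probes x 3 arrays, log2-transformed first.
X = np.log2(np.array([[120, 340,  80],
                      [560, 900, 400],
                      [ 30,  70,  20],
                      [210, 500, 150],
                      [ 90, 260,  60]], dtype=float))
print(quantile_normalize(X))
```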

  6. Automated cleaning and pre-processing of immunoglobulin gene sequences from high-throughput sequencing

    Directory of Open Access Journals (Sweden)

    Miri eMichaeli

    2012-12-01

    Full Text Available High throughput sequencing (HTS) yields tens of thousands to millions of sequences that require a large amount of pre-processing work to clean various artifacts. Such cleaning cannot be performed manually. Existing programs are not suitable for immunoglobulin (Ig) genes, which are variable and often highly mutated. This paper describes Ig-HTS-Cleaner (Ig High Throughput Sequencing Cleaner), a program containing a simple cleaning procedure that successfully deals with pre-processing of Ig sequences derived from HTS, and Ig-Indel-Identifier (Ig Insertion – Deletion Identifier), a program for identifying legitimate and artifact insertions and/or deletions (indels). Our programs were designed for analyzing Ig gene sequences obtained by 454 sequencing, but they are applicable to all types of sequences and sequencing platforms. Ig-HTS-Cleaner and Ig-Indel-Identifier have been implemented in Java and saved as executable JAR files, supported on Linux and MS Windows. No special requirements are needed in order to run the programs, except for correctly constructing the input files as explained in the text. The programs' performance has been tested and validated on real and simulated data sets.

  7. Preprocessing for Optimization of Probabilistic-Logic Models for Sequence Analysis

    DEFF Research Database (Denmark)

    Christiansen, Henning; Lassen, Ole Torp

    2009-01-01

    ...and approximation are needed. The first steps are taken towards a methodology for optimizing such models by approximations using auxiliary models for preprocessing or splitting them into submodels. Evaluation of such approximating models is challenging as authoritative test data may be sparse. On the other hand, the original complex models may be used for generating artificial evaluation data by efficient sampling, which can be used in the evaluation, although it does not constitute a foolproof test procedure. These models and evaluation processes are illustrated in the PRISM system developed by other authors, and we...

  8. Analyzing ChIP-seq data: preprocessing, normalization, differential identification, and binding pattern characterization.

    Science.gov (United States)

    Taslim, Cenny; Huang, Kun; Huang, Tim; Lin, Shili

    2012-01-01

    Chromatin immunoprecipitation followed by sequencing (ChIP-seq) is a high-throughput antibody-based method to study genome-wide protein-DNA binding interactions. ChIP-seq technology allows scientist to obtain more accurate data providing genome-wide coverage with less starting material and in shorter time compared to older ChIP-chip experiments. Herein we describe a step-by-step guideline in analyzing ChIP-seq data including data preprocessing, nonlinear normalization to enable comparison between different samples and experiments, statistical-based method to identify differential binding sites using mixture modeling and local false discovery rates (fdrs), and binding pattern characterization. In addition, we provide a sample analysis of ChIP-seq data using the steps provided in the guideline.

  9. Effect of microaerobic fermentation in preprocessing fibrous lignocellulosic materials.

    Science.gov (United States)

    Alattar, Manar Arica; Green, Terrence R; Henry, Jordan; Gulca, Vitalie; Tizazu, Mikias; Bergstrom, Robby; Popa, Radu

    2012-06-01

    Amending soil with organic matter is common in agricultural and logging practices. Such amendments have benefits to soil fertility and crop yields. These benefits may be increased if material is preprocessed before introduction into soil. We analyzed the efficiency of microaerobic fermentation (MF), also referred to as Bokashi, in preprocessing fibrous lignocellulosic (FLC) organic materials using varying produce amendments and leachate treatments. Adding produce amendments increased leachate production and fermentation rates and decreased the biological oxygen demand of the leachate. Continuously draining leachate without returning it to the fermentors led to acidification and decreased concentrations of polysaccharides (PS) in leachates. PS fragmentation and the production of soluble metabolites and gases stabilized in fermentors in about 2-4 weeks. About 2 % of the carbon content was lost as CO(2). PS degradation rates, upon introduction of processed materials into soil, were similar to unfermented FLC. Our results indicate that MF is insufficient for adequate preprocessing of FLC material.

  10. Data Preprocessing in Cluster Analysis of Gene Expression

    Institute of Scientific and Technical Information of China (English)

    杨春梅; 万柏坤; 高晓峰

    2003-01-01

    Considering that the DNA microarray technology has generated explosive gene expression data and that it is urgent to analyse and to visualize such massive datasets with efficient methods, we investigate the data preprocessing methods used in cluster analysis, normalization or logarithm of the matrix, by using hierarchical clustering, principal component analysis (PCA) and self-organizing maps (SOMs). The results illustrate that when using the Euclidean distance as measuring metrics, logarithm of relative expression level is the best preprocessing method, while data preprocessed by normalization cannot attain the expected results because the data structure is ruined. If there are only a few principal components, the PCA is an effective method to extract the frame structure, while SOMs are more suitable for a specific structure.

  11. Evaluation of standard and advanced preprocessing methods for the univariate analysis of blood serum 1H-NMR spectra.

    Science.gov (United States)

    De Meyer, Tim; Sinnaeve, Davy; Van Gasse, Bjorn; Rietzschel, Ernst-R; De Buyzere, Marc L; Langlois, Michel R; Bekaert, Sofie; Martins, José C; Van Criekinge, Wim

    2010-10-01

    Proton nuclear magnetic resonance ((1)H-NMR)-based metabolomics enables the high-resolution and high-throughput assessment of a broad spectrum of metabolites in biofluids. Despite the straightforward character of the experimental methodology, the analysis of spectral profiles is rather complex, particularly due to the requirement of numerous data preprocessing steps. Here, we evaluate how several of the most common preprocessing procedures affect the subsequent univariate analyses of blood serum spectra, with a particular focus on how the standard methods perform compared to more advanced examples. Carr-Purcell-Meiboom-Gill 1D (1)H spectra were obtained for 240 serum samples from healthy subjects of the Asklepios study. We studied the impact of different preprocessing steps--integral (standard method) and probabilistic quotient normalization; no, equidistant (standard), and adaptive-intelligent binning; mean (standard) and maximum bin intensity data summation--on the resonance intensities of three different types of metabolites: triglycerides, glucose, and creatinine. The effects were evaluated by correlating the differently preprocessed NMR data with the independently measured metabolite concentrations. The analyses revealed that the standard methods performed inferiorly and that a combination of probabilistic quotient normalization after adaptive-intelligent binning and maximum intensity variable definition yielded the best overall results (triglycerides, R = 0.98; glucose, R = 0.76; creatinine, R = 0.70). Therefore, at least in the case of serum metabolomics, these or equivalent methods should be preferred above the standard preprocessing methods, particularly for univariate analyses. Additional optimization of the normalization procedure might further improve the analyses.
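
    Probabilistic quotient normalization, which outperformed the standard integral normalization here, can be sketched in a few lines of NumPy; this follows the generic published algorithm rather than the authors' exact code, and the binned spectra below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic binned spectra: 20 serum spectra x 200 bins, with different dilutions.
base = np.abs(rng.normal(1.0, 0.3, size=200))
dilution = rng.uniform(0.5, 2.0, size=(20, 1))
spectra = dilution * base + np.abs(rng.normal(0, 0.02, size=(20, 200)))

# 1) Integral normalization first, as usually recommended before PQN.
spectra_int = spectra / spectra.sum(axis=1, keepdims=True)

# 2) Reference spectrum = median spectrum across samples.
reference = np.median(spectra_int, axis=0)

# 3) Per-sample quotient = median of the bin-wise ratios to the reference.
quotients = np.median(spectra_int / reference, axis=1)

# 4) Divide each spectrum by its quotient.
spectra_pqn = spectra_int / quotients[:, None]
print(spectra_pqn.shape)
```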

  12. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available ... Explorer: Right-click on the link, choose "Save target as…", save file in desired location. Firefox/Chrome: ... Explorer: Right-click on the link, choose "Save target as…", save file in desired location. Firefox/Chrome: ...

  13. Preprocessing of A-scan GPR data based on energy features

    Science.gov (United States)

    Dogan, Mesut; Turhan-Sayan, Gonul

    2016-05-01

    There is an increasing demand for noninvasive real-time detection and classification of buried objects in various civil and military applications. The problem of detection and annihilation of landmines is particularly important due to strong safety concerns. The requirement for a fast real-time decision process is as important as the requirements for high detection rates and low false alarm rates. In this paper, we introduce and demonstrate a computationally simple, time-efficient, energy-based preprocessing approach that can be used in ground penetrating radar (GPR) applications to eliminate reflections from the air-ground boundary and to locate the buried objects, simultaneously, in one simple step. The instantaneous power signals, the total energy values and the cumulative energy curves are extracted from the A-scan GPR data. The cumulative energy curves, in particular, are shown to be useful for detecting the presence and location of buried objects in a fast and simple way while preserving the spectral content of the original A-scan data for further steps of physics-based target classification. The proposed method is demonstrated using the GPR data collected at the facilities of IPA Defense, Ankara, at outdoor test lanes. Cylindrically shaped plastic containers were buried in fine-medium sand to simulate buried landmines. These plastic containers were half-filled with ammonium nitrate including metal pins. Results of this pilot study are demonstrated to be highly promising and motivate further research on the use of energy-based preprocessing features in the landmine detection problem.
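
    The energy features described, the instantaneous power signal, the total energy and the cumulative energy curve of an A-scan, are simple to compute; the sketch below uses a synthetic A-scan and arbitrary sampling parameters, and is meant only to show the bookkeeping, not the detection thresholds used in the study.

```python
import numpy as np

fs = 10e9                        # assumed sampling rate of the A-scan (10 GS/s)
t = np.arange(0, 20e-9, 1 / fs)  # 20 ns trace

# Synthetic A-scan: strong air-ground reflection early, weaker target echo later.
ascan = (1.0 * np.exp(-((t - 3e-9) ** 2) / (0.3e-9) ** 2) * np.sin(2 * np.pi * 1.5e9 * t)
         + 0.3 * np.exp(-((t - 12e-9) ** 2) / (0.4e-9) ** 2) * np.sin(2 * np.pi * 1.5e9 * t))

power = ascan ** 2                            # instantaneous power signal
total_energy = power.sum() / fs               # total energy of the trace
cum_energy = np.cumsum(power) / power.sum()   # normalized cumulative energy curve

# A late, sharp rise of the cumulative curve (after the air-ground reflection
# has been accounted for) hints at a buried object around that time sample.
late = t > 6e-9
late_rise_time = t[late][np.argmax(np.gradient(cum_energy[late]))]
print(total_energy, late_rise_time)
```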

  14. River flow forecasting with Artificial Neural Networks using satellite observed precipitation pre-processed with flow length and travel time information: case study of the Ganges river basin

    Directory of Open Access Journals (Sweden)

    M. K. Akhtar

    2009-04-01

    Full Text Available This paper explores the use of flow length and travel time as a pre-processing step for incorporating spatial precipitation information into Artificial Neural Network (ANN) models used for river flow forecasting. Spatially distributed precipitation is commonly required when modelling large basins, and it is usually incorporated in distributed, physically-based hydrological modelling approaches. However, these modelling approaches are recognised to be quite complex and expensive, especially due to the collection of multiple inputs and parameters, which vary in space and time. On the other hand, ANN models for flow forecasting are frequently developed only with precipitation and discharge as inputs, usually without taking into consideration the spatial variability of precipitation. Full inclusion of spatially distributed inputs into ANN models still leads to a complex computational process that may not give acceptable results. Therefore, here we present an analysis of flow length and travel time as a basis for pre-processing remotely sensed (satellite) rainfall data. This pre-processed rainfall is used, together with local stream flow measurements of previous days, as input to ANN models. The case study for this modelling approach is the Ganges river basin. A comparative analysis of multiple ANN models with different hydrological pre-processing is presented. The ANN showed its ability to forecast discharges 3 days ahead with acceptable accuracy. Within this forecast horizon, the influence of the pre-processed rainfall is marginal because of the dominant influence of strongly auto-correlated discharge inputs. For forecast horizons of 7 to 10 days, the influence of the pre-processed rainfall is noticeable, although the overall model performance deteriorates. The incorporation of remotely sensed, spatially distributed precipitation information as a pre-processing step proved to be a promising alternative for setting up ANN models for river flow forecasting.
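
    One plausible reading of this pre-processing is to lump the satellite rainfall cells by their travel time to the gauge and feed one lagged series per travel-time class to the ANN; the sketch below illustrates that idea only and is not the authors' code:

        import numpy as np

        def lump_rainfall_by_travel_time(rainfall, travel_time_days):
            # rainfall: (n_timesteps, n_cells) satellite estimates
            # travel_time_days: (n_cells,) integer travel time of each cell to the outlet
            lagged_inputs = []
            for lag in np.unique(travel_time_days):
                cells = travel_time_days == lag
                series = rainfall[:, cells].sum(axis=1)     # basin rainfall for this lag class
                lagged_inputs.append(np.roll(series, lag))  # shift by the travel time (circular, for brevity)
            return np.stack(lagged_inputs, axis=1)          # one ANN input column per lag class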

  15. Image preprocessing study on KPCA-based face recognition

    Science.gov (United States)

    Li, Xuan; Li, Dehua

    2015-12-01

    Face recognition is an important biometric identification method that has attracted increasing attention owing to its natural, friendly, and convenient character. This paper studies a face recognition system comprising face detection, feature extraction, and recognition, examining the relevant theory and the key preprocessing technologies used in the face detection stage, and compares the recognition results obtained with different preprocessing methods using KPCA. We choose the YCbCr color space for skin segmentation and integral projection for face location. Face images are preprocessed with erosion and dilation (opening and closing operations) and an illumination compensation method, and then analyzed with a face recognition method based on kernel principal component analysis; experiments were carried out on a typical face database, with the algorithms implemented on the MATLAB platform. Experimental results show that, under certain conditions, the kernel extension of the PCA algorithm makes the extracted features represent the original image information better, because it uses a nonlinear feature extraction method, and thus yields a higher recognition rate. In the image preprocessing stage we found that different operations on the images can produce different results, and hence different recognition rates in the recognition stage. At the same time, in kernel principal component analysis the power of the polynomial kernel function affects the recognition result.
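
    A minimal OpenCV sketch of the skin segmentation and morphological clean-up stages mentioned above; the Cr/Cb thresholds are commonly cited values, not taken from the paper:

        import cv2
        import numpy as np

        def skin_mask(bgr):
            ycrcb = cv2.cvtColor(bgr, cv2.COLOR_BGR2YCrCb)
            mask = cv2.inRange(ycrcb, np.array([0, 133, 77]), np.array([255, 173, 127]))
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
            mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # opening removes speckle
            mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # closing fills small holes
            return mask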

  16. Pre-Processing Rules for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den

    2003-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.

  17. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.

  18. The minimal preprocessing pipelines for the Human Connectome Project.

    Science.gov (United States)

    Glasser, Matthew F; Sotiropoulos, Stamatios N; Wilson, J Anthony; Coalson, Timothy S; Fischl, Bruce; Andersson, Jesper L; Xu, Junqian; Jbabdi, Saad; Webster, Matthew; Polimeni, Jonathan R; Van Essen, David C; Jenkinson, Mark

    2013-10-15

    The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low level tasks, including spatial artifact/distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high quality data offered by the HCP. The final standard space makes use of a recently introduced CIFTI file format and the associated grayordinate spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP's acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements to the pipelines.

  19. OPSN: The IMS COMSYS 1 and 2 Data Preprocessing System.

    Science.gov (United States)

    Yu, John

    The Instructional Management System (IMS) developed by the Southwest Regional Laboratory (SWRL) processes student and teacher-generated data through the use of an optical scanner that produces a magnetic tape (Scan Tape) for input to IMS. A series of computer routines, OPSN, preprocesses the Scan Tape and prepares the data for transmission to the…

  20. An effective measured data preprocessing method in electrical impedance tomography.

    Science.gov (United States)

    Yu, Chenglong; Yue, Shihong; Wang, Jianpei; Wang, Huaxiang

    2014-01-01

    As an advanced process detection technology, electrical impedance tomography (EIT) has received wide attention and has been studied extensively in industrial fields. However, EIT techniques are greatly limited by low spatial resolution. This problem may result from incorrect preprocessing of the measured data and the lack of a general criterion to evaluate different preprocessing procedures. In this paper, an EIT data preprocessing method is proposed in which all measured data are rooted (raised to a fractional power), and it is evaluated by two indexes constructed from the rooted EIT measurements. By finding the optima of the two indexes, the proposed method can be applied to improve the spatial resolution of EIT imaging. For a theoretical model, the optimal rooting exponents for the two indexes range within [0.23, 0.33] and [0.22, 0.35], respectively. Moreover, the factors that affect the correctness of the proposed method are analyzed in general terms. Preprocessing of the measured data is necessary and helpful for any imaging process, so the proposed method can be used generally and widely in imaging. Experimental results validate the two proposed indexes.
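
    For illustration, the rooting preprocessing amounts to raising every measured value to a fractional power; a short numpy sketch (the default exponent is merely one value within the range reported above):

        import numpy as np

        def root_preprocess(measurements, r=0.28):
            m = np.asarray(measurements, dtype=float)
            return np.sign(m) * np.abs(m) ** r   # "rooting" the measured data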

  1. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.

  2. The precautionary savings motive and household savings

    NARCIS (Netherlands)

    Mastrogiacomo, Mauro; Alessie, Rob

    2014-01-01

    We quantified the relative importance of the precautionary motive in determining savings. Existing empirical evidence suggests that the impact of precautionary savings is small if one uses a subjective (based on self-reported expectations) measure of uncertainty about next year income. However, othe

  3. Columbus Saves: Saving Money in Ohio

    Science.gov (United States)

    Shockey, Susan

    2004-01-01

    The "Columbus Saves" educational program is a broad-based community coalition made up of more than 40 local organizations from the education, nonprofit, government, faith-based, and private sectors. Common goals of partners in reaching Columbus, Ohio's 1.5 million residents are to: (a) promote increased savings through education and…

  4. Save a life--save the world.

    Science.gov (United States)

    Fruh, Sharon; Jezek, Kenda

    2010-01-01

    The ancient Hebrew principle of Pikuach Nephesh, "He who saves a life saves the world entire," took precedence in the healing ministry of Jesus. Today, when Christian nurses persist in serving others, striving to achieve compassion and professional excellence, they too uphold the law of "preserving life."

  5. Image preprocessing for improving computational efficiency in implementation of restoration and superresolution algorithms.

    Science.gov (United States)

    Sundareshan, Malur K; Bhattacharjee, Supratik; Inampudi, Radhika; Pang, Ho-Yuen

    2002-12-10

    Computational complexity is a major impediment to the real-time implementation of image restoration and superresolution algorithms in many applications. Although powerful restoration algorithms have been developed within the past few years utilizing sophisticated mathematical machinery (based on statistical optimization and convex set theory), these algorithms are typically iterative in nature and require a sufficient number of iterations to be executed to achieve the desired resolution improvement that may be needed to meaningfully perform postprocessing image exploitation tasks in practice. Additionally, recent technological breakthroughs have facilitated novel sensor designs (focal plane arrays, for instance) that make it possible to capture megapixel imagery data at video frame rates. A major challenge in the processing of these large-format images is to complete the execution of the image processing steps within the frame capture times and to keep up with the output rate of the sensor so that all data captured by the sensor can be efficiently utilized. Consequently, development of novel methods that facilitate real-time implementation of image restoration and superresolution algorithms is of significant practical interest and is the primary focus of this study. The key to designing computationally efficient processing schemes lies in strategically introducing appropriate preprocessing steps together with the superresolution iterations to tailor optimized overall processing sequences for imagery data of specific formats. For substantiating this assertion, three distinct methods for tailoring a preprocessing filter and integrating it with the superresolution processing steps are outlined. These methods consist of a region-of-interest extraction scheme, a background-detail separation procedure, and a scene-derived information extraction step for implementing a set-theoretic restoration of the image that is less demanding in computation compared with the
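
    As a simple illustration of the first of the three methods, a region-of-interest extraction can restrict the iterative restoration to a small sub-image; the sketch below is a generic thresholded bounding box, not the authors' scheme:

        import numpy as np

        def extract_roi(image, threshold_fraction=0.2, pad=8):
            # keep only the bounding box of bright content so that the iterative
            # restoration runs on a small sub-image rather than the full frame
            mask = image > threshold_fraction * image.max()
            rows, cols = np.where(mask)
            if rows.size == 0:
                return image, (0, image.shape[0], 0, image.shape[1])
            r0, r1 = max(rows.min() - pad, 0), min(rows.max() + pad, image.shape[0])
            c0, c1 = max(cols.min() - pad, 0), min(cols.max() + pad, image.shape[1])
            return image[r0:r1, c0:c1], (r0, r1, c0, c1)   # sub-image plus its location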

  6. Image preprocessing for improving computational efficiency in implementation of restoration and superresolution algorithms

    Science.gov (United States)

    Sundareshan, Malur K.; Bhattacharjee, Supratik; Inampudi, Radhika; Pang, Ho-Yuen

    2002-12-01

    Computational complexity is a major impediment to the real-time implementation of image restoration and superresolution algorithms in many applications. Although powerful restoration algorithms have been developed within the past few years utilizing sophisticated mathematical machinery (based on statistical optimization and convex set theory), these algorithms are typically iterative in nature and require a sufficient number of iterations to be executed to achieve the desired resolution improvement that may be needed to meaningfully perform postprocessing image exploitation tasks in practice. Additionally, recent technological breakthroughs have facilitated novel sensor designs (focal plane arrays, for instance) that make it possible to capture megapixel imagery data at video frame rates. A major challenge in the processing of these large-format images is to complete the execution of the image processing steps within the frame capture times and to keep up with the output rate of the sensor so that all data captured by the sensor can be efficiently utilized. Consequently, development of novel methods that facilitate real-time implementation of image restoration and superresolution algorithms is of significant practical interest and is the primary focus of this study. The key to designing computationally efficient processing schemes lies in strategically introducing appropriate preprocessing steps together with the superresolution iterations to tailor optimized overall processing sequences for imagery data of specific formats. For substantiating this assertion, three distinct methods for tailoring a preprocessing filter and integrating it with the superresolution processing steps are outlined. These methods consist of a region-of-interest extraction scheme, a background-detail separation procedure, and a scene-derived information extraction step for implementing a set-theoretic restoration of the image that is less demanding in computation compared with the

  7. Remodeling measures for low-carbon, energy-saving technology of central air conditioning in hospitals

    Institute of Scientific and Technical Information of China (English)

    李静; 王金荣; 冯凯林; 郭善水; 邓泽江

    2016-01-01

    Through an analysis of building energy conservation at home and abroad, we conduct research on effective ways of managing, operating, and saving energy with hospital central air conditioning. With the rapid development of the national economy, building energy consumption accounts for a growing proportion of overall energy consumption, and building energy conservation occupies an increasing share of China's sustainable development. The operating efficiency and operating strategy of central air-conditioning equipment directly influence whether building energy-saving goals can be realized. After upgrading the hospital's central air conditioning with low-carbon, energy-saving technology and a building automation system, the hospital can save a considerable amount of money each year and reduce its operating costs, supporting a virtuous cycle and the long-term development of the hospital.

  8. ARC Code TI: Save

    Data.gov (United States)

    National Aeronautics and Space Administration — Save is a framework for implementing highly available network-accessible services. Save consists of a command-line utility and a small set of extensions for the...

  9. Data Preprocessing of Web Log Mining

    Institute of Scientific and Technical Information of China (English)

    何波; 涂飞; 程勇军

    2011-01-01

    Data preprocessing plays an essential role in the process of Web log mining. This paper analyses the main steps of data preprocessing and designs the key algorithms for three of them: user identification, session identification, and path completion. Experimental results show that the key algorithms are effective.
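
    As an illustration of the session identification step, a common heuristic starts a new session whenever a user is inactive for longer than a fixed timeout; the sketch below uses that heuristic, which is not necessarily the algorithm of the paper:

        from collections import defaultdict

        SESSION_TIMEOUT = 30 * 60  # seconds; a conventional choice

        def sessionize(log_entries):
            # log_entries: iterable of (user_id, timestamp, url) sorted by timestamp
            sessions = defaultdict(list)
            last_seen = {}
            for user, ts, url in log_entries:
                if user not in last_seen or ts - last_seen[user] > SESSION_TIMEOUT:
                    sessions[user].append([])      # start a new session after a long gap
                sessions[user][-1].append(url)
                last_seen[user] = ts
            return sessions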

  10. Research on pre-processing of QR Code

    Science.gov (United States)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR codes encode many kinds of information thanks to their advantages: large storage capacity, high reliability, omnidirectional ultra-high-speed reading, small printing size, and highly efficient representation of Chinese characters. In order to obtain a clearer binarized image from a complex background and to improve the recognition rate of QR codes, this paper studies pre-processing methods for QR codes (Quick Response Codes) and presents algorithms and results of image pre-processing for QR code recognition. The conventional approach is improved by modifying Sauvola's adaptive thresholding method for text recognition. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
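
    Sauvola-style adaptive binarization is available in scikit-image, which makes a minimal version of this preprocessing easy to sketch (the parameters are typical defaults, not the values used in the paper):

        import numpy as np
        from skimage.filters import threshold_sauvola

        def binarize_qr(gray, window_size=25, k=0.2):
            # local Sauvola threshold copes with uneven illumination better than a
            # single global threshold
            thresh = threshold_sauvola(gray, window_size=window_size, k=k)
            return (gray > thresh).astype(np.uint8) * 255   # binary image for the decoder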

  11. An Efficient and Configurable Preprocessing Algorithm to Improve Stability Analysis.

    Science.gov (United States)

    Sesia, Ilaria; Cantoni, Elena; Cernigliaro, Alice; Signorile, Giovanna; Fantino, Gianluca; Tavella, Patrizia

    2016-04-01

    The Allan variance (AVAR) is widely used to measure the stability of experimental time series. Specifically, AVAR is commonly used in space applications such as monitoring the clocks of the global navigation satellite systems (GNSSs). In these applications, the experimental data present some peculiar aspects which are not generally encountered when the measurements are carried out in a laboratory. Space clocks' data can in fact present outliers, jumps, and missing values, which corrupt the clock characterization. Therefore, an efficient preprocessing is fundamental to ensure a proper data analysis and improve the stability estimation performed with the AVAR or other similar variances. In this work, we propose a preprocessing algorithm and its implementation in a robust software code (in MATLAB language) able to deal with time series of experimental data affected by nonstationarities and missing data; our method is properly detecting and removing anomalous behaviors, hence making the subsequent stability analysis more reliable.
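
    The exact preprocessing is implemented in the authors' MATLAB code; as a simplified stand-in, median/MAD outlier rejection followed by linear gap filling can be sketched in a few lines:

        import numpy as np

        def clean_clock_data(t, x, mad_factor=5.0):
            # t, x: numpy arrays of (increasing) time tags and clock data
            x = np.asarray(x, dtype=float).copy()
            resid = x - np.nanmedian(x)
            mad = np.nanmedian(np.abs(resid)) or 1.0
            x[np.abs(resid) > mad_factor * 1.4826 * mad] = np.nan   # flag outliers
            bad = np.isnan(x)
            x[bad] = np.interp(t[bad], t[~bad], x[~bad])            # fill missing values
            return x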

  12. Adaptive fingerprint image enhancement with emphasis on preprocessing of data.

    Science.gov (United States)

    Bartůnek, Josef Ström; Nilsson, Mikael; Sällberg, Benny; Claesson, Ingvar

    2013-02-01

    This article proposes several improvements to an adaptive fingerprint enhancement method that is based on contextual filtering. The term adaptive implies that the parameters of the method are automatically adjusted based on the input fingerprint image. Five processing blocks comprise the adaptive fingerprint enhancement method, four of which are updated in our proposed system; hence, the proposed overall system is novel. The four updated processing blocks are: 1) preprocessing; 2) global analysis; 3) local analysis; and 4) matched filtering. In the preprocessing and local analysis blocks, a nonlinear dynamic range adjustment method is used. In the global analysis and matched filtering blocks, different forms of order-statistical filters are applied. These processing blocks yield an improved and new adaptive fingerprint image processing method. The performance of the updated processing blocks is presented in the evaluation part of this paper. The algorithm is evaluated against the NIST-developed NBIS software for fingerprint recognition on the FVC databases.

  13. ITSG-Grace2016 data preprocessing methodologies revisited: impact of using Level-1A data products

    Science.gov (United States)

    Klinger, Beate; Mayer-Gürr, Torsten

    2017-04-01

    For the ITSG-Grace2016 release, the gravity field recovery is based on the use of official GRACE (Gravity Recovery and Climate Experiment) Level-1B data products, generated by the Jet Propulsion Laboratory (JPL). Before gravity field recovery, the Level-1B instrument data are preprocessed. This data preprocessing step includes the combination of Level-1B star camera (SCA1B) and angular acceleration (ACC1B) data for an improved attitude determination (sensor fusion), instrument data screening and ACC1B data calibration. Based on a Level-1A test dataset, provided for individual months throughout the GRACE period by the Center for Space Research at the University of Texas at Austin (UTCSR), the impact of using Level-1A instead of Level-1B data products within the ITSG-Grace2016 processing chain is analyzed. We discuss (1) the attitude determination through an optimal combination of SCA1A and ACC1A data using our sensor fusion approach, (2) the impact of the new attitude product on temporal gravity field solutions, and (3) possible benefits of using Level-1A data for instrument data screening and calibration. As the GRACE mission is currently reaching its end of life, the presented work aims not only at a better understanding of GRACE science data to reduce the impact of possible error sources on the gravity field recovery, but also at preparing Level-1A data handling capabilities for the GRACE Follow-On mission.

  14. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    Science.gov (United States)

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  15. Effective Preprocessing Procedures Virtually Eliminate Distance-Dependent Motion Artifacts in Resting State FMRI.

    Science.gov (United States)

    Jo, Hang Joon; Gotts, Stephen J; Reynolds, Richard C; Bandettini, Peter A; Martin, Alex; Cox, Robert W; Saad, Ziad S

    2013-05-21

    Artifactual sources of resting-state (RS) FMRI can originate from head motion, physiology, and hardware. Of these sources, motion has received considerable attention and was found to induce corrupting effects by differentially biasing correlations between regions depending on their distance. Numerous corrective approaches have relied on the identification and censoring of high-motion time points and on the use of the brain-wide average time series as a nuisance regressor to which the data are orthogonalized (Global Signal Regression, GSReg). We first replicate the previously reported head-motion bias on correlation coefficients using data generously contributed by Power et al. (2012). We then show that while motion can be the source of artifact in correlations, the distance-dependent bias, taken to be a manifestation of the motion effect on correlation, is exacerbated by the use of GSReg. Put differently, correlation estimates obtained after GSReg are more susceptible to the presence of motion and, by extension, to the levels of censoring. More generally, the effect of motion on correlation estimates depends on the preprocessing steps leading to the correlation estimate, with certain approaches performing markedly worse than others. For this purpose, we consider various models for RS FMRI preprocessing and show that the WMeLOCAL denoising approach, a subset of ANATICOR discussed by Jo et al. (2010), results in minimal sensitivity to motion and thereby reduces the dependence of correlation results on censoring.
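
    For illustration, nuisance regression and motion censoring of the kind discussed above can be written compactly; this generic sketch is not the ANATICOR/WMeLOCAL implementation:

        import numpy as np

        def regress_out(ts, nuisance):
            # ts: (T, V) voxel time series; nuisance: (T, K) regressors such as motion
            # parameters, a global signal, or a local white-matter signal
            design = np.column_stack([np.ones(len(ts)), nuisance])
            beta, *_ = np.linalg.lstsq(design, ts, rcond=None)
            return ts - design @ beta                           # residuals used for correlations

        def censor(ts, framewise_displacement, threshold=0.2):
            return ts[framewise_displacement < threshold]       # drop high-motion volumes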

  16. Effective Preprocessing Procedures Virtually Eliminate Distance-Dependent Motion Artifacts in Resting State FMRI

    Directory of Open Access Journals (Sweden)

    Hang Joon Jo

    2013-01-01

    Full Text Available Artifactual sources of resting-state (RS) FMRI can originate from head motion, physiology, and hardware. Of these sources, motion has received considerable attention and was found to induce corrupting effects by differentially biasing correlations between regions depending on their distance. Numerous corrective approaches have relied on the identification and censoring of high-motion time points and the use of the brain-wide average time series as a nuisance regressor to which the data are orthogonalized (Global Signal Regression, GSReg). We replicate the previously reported head-motion bias on correlation coefficients and then show that while motion can be the source of artifact in correlations, the distance-dependent bias is exacerbated by GSReg. Put differently, correlation estimates obtained after GSReg are more susceptible to the presence of motion and by extension to the levels of censoring. More generally, the effect of motion on correlation estimates depends on the preprocessing steps leading to the correlation estimate, with certain approaches performing markedly worse than others. For this purpose, we consider various models for RS FMRI preprocessing and show that the local white matter regressor (WMeLOCAL), a subset of ANATICOR, results in minimal sensitivity to motion and reduces by extension the dependence of correlation results on censoring.

  17. Linguistic Preprocessing and Tagging for Problem Report Trend Analysis

    Science.gov (United States)

    Beil, Robert J.; Malin, Jane T.

    2012-01-01

    Mr. Robert Beil, Systems Engineer at Kennedy Space Center (KSC), requested the NASA Engineering and Safety Center (NESC) develop a prototype tool suite that combines complementary software technology used at Johnson Space Center (JSC) and KSC for problem report preprocessing and semantic tag extraction, to improve input to data mining and trend analysis. This document contains the outcome of the assessment and the Findings, Observations and NESC Recommendations.

  18. Research on Digital Watermark Using Pre-Processing Technology

    Institute of Scientific and Technical Information of China (English)

    Ru Guo-bao; Niu Hui-fang; Yang Rui; Sun Hong; Shi Hong-ling; Huang Tian-xi

    2003-01-01

    We have realized a watermark embedding system based on audio perceptual masking and put forward a watermark detection system using pre-processing technology. With this method, the watermark can be detected from the watermarked audio without the original audio. The results indicate that this embedding and detection method is robust: without affecting the hearing quality, it can resist attacks such as MPEG compression, filtering, and the addition of white noise.

  19. Biosignal data preprocessing: a voice pathology detection application

    Directory of Open Access Journals (Sweden)

    Genaro Daza Santacoloma

    2010-05-01

    Full Text Available A methodology for biosignal data preprocessing is presented. Experiments were mainly carried out with voice signals for automatically detecting pathologies. The proposed methodology is structured around three elements: outlier detection, normality verification, and distribution transformation. It improved classification performance when the basic assumptions about the data structure were met. This entailed a more accurate detection of voice pathologies and reduced the computational complexity of the classification algorithms. Classification performance improved by 15%.
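
    A minimal scipy sketch of the three elements (outlier detection, normality verification, distribution transformation); the thresholds and tests are illustrative choices, not necessarily those of the paper:

        import numpy as np
        from scipy import stats

        def preprocess_feature(x, z_cut=3.0):
            x = np.asarray(x, dtype=float)
            z = (x - x.mean()) / x.std()
            x = x[np.abs(z) < z_cut]              # 1) outlier rejection
            _, p_value = stats.shapiro(x)         # 2) normality verification
            if p_value < 0.05 and np.all(x > 0):
                x, _ = stats.boxcox(x)            # 3) distribution transformation
            return x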

  20. Integration of geometric modeling and advanced finite element preprocessing

    Science.gov (United States)

    Shephard, Mark S.; Finnigan, Peter M.

    1987-01-01

    The structure to a geometry based finite element preprocessing system is presented. The key features of the system are the use of geometric operators to support all geometric calculations required for analysis model generation, and the use of a hierarchic boundary based data structure for the major data sets within the system. The approach presented can support the finite element modeling procedures used today as well as the fully automated procedures under development.

  1. Impact of functional MRI data preprocessing pipeline on default-mode network detectability in patients with disorders of consciousness

    Directory of Open Access Journals (Sweden)

    Adrian Andronache

    2013-08-01

    Full Text Available An emerging application of resting-state functional MRI is the study of patients with disorders of consciousness (DoC), where integrity of default-mode network (DMN) activity is associated to the clinical level of preservation of consciousness. Due to the inherent inability to follow verbal instructions, arousal induced by scanning noise and postural pain, these patients tend to exhibit substantial levels of movement. This results in spurious, non-neural fluctuations of the blood-oxygen level-dependent (BOLD) signal, which impair the evaluation of residual functional connectivity. Here, the effect of data preprocessing choices on the detectability of the DMN was systematically evaluated in a representative cohort of 30 clinically and etiologically heterogeneous DoC patients and 33 healthy controls. Starting from a standard preprocessing pipeline, additional steps were gradually inserted, namely band-pass filtering, removal of co-variance with the movement vectors, removal of co-variance with the global brain parenchyma signal, rejection of realignment outlier volumes and ventricle masking. Both independent-component analysis (ICA) and seed-based analysis (SBA) were performed, and DMN detectability was assessed quantitatively as well as visually. The results of the present study strongly show that the detection of DMN activity in the sub-optimal fMRI series acquired on DoC patients is contingent on the use of adequate filtering steps. ICA and SBA are differently affected but give convergent findings for high-grade preprocessing. We propose that future studies in this area should adopt the described preprocessing procedures as a minimum standard to reduce the probability of wrongly inferring that DMN activity is absent.

  2. Impact of functional MRI data preprocessing pipeline on default-mode network detectability in patients with disorders of consciousness.

    Science.gov (United States)

    Andronache, Adrian; Rosazza, Cristina; Sattin, Davide; Leonardi, Matilde; D'Incerti, Ludovico; Minati, Ludovico

    2013-01-01

    An emerging application of resting-state functional MRI (rs-fMRI) is the study of patients with disorders of consciousness (DoC), where integrity of default-mode network (DMN) activity is associated to the clinical level of preservation of consciousness. Due to the inherent inability to follow verbal instructions, arousal induced by scanning noise and postural pain, these patients tend to exhibit substantial levels of movement. This results in spurious, non-neural fluctuations of the rs-fMRI signal, which impair the evaluation of residual functional connectivity. Here, the effect of data preprocessing choices on the detectability of the DMN was systematically evaluated in a representative cohort of 30 clinically and etiologically heterogeneous DoC patients and 33 healthy controls. Starting from a standard preprocessing pipeline, additional steps were gradually inserted, namely band-pass filtering (BPF), removal of co-variance with the movement vectors, removal of co-variance with the global brain parenchyma signal, rejection of realignment outlier volumes and ventricle masking. Both independent-component analysis (ICA) and seed-based analysis (SBA) were performed, and DMN detectability was assessed quantitatively as well as visually. The results of the present study strongly show that the detection of DMN activity in the sub-optimal fMRI series acquired on DoC patients is contingent on the use of adequate filtering steps. ICA and SBA are differently affected but give convergent findings for high-grade preprocessing. We propose that future studies in this area should adopt the described preprocessing procedures as a minimum standard to reduce the probability of wrongly inferring that DMN activity is absent.

  3. Review of feed forward neural network classification preprocessing techniques

    Science.gov (United States)

    Asadi, Roya; Kareem, Sameem Abdul

    2014-06-01

    The best feature of Feed Forward Neural Network (FFNN) classification models is that they learn the input data through their weights. Data preprocessing and pre-training are contributing factors in developing efficient techniques with low training time and high classification accuracy. In this study, we investigate and review powerful preprocessing functions for FFNN models. Currently, the weights are initialized at random, which is the main source of problems; even the latest techniques, such as multilayer auto-encoder networks, are unable to solve them. Weight Linear Analysis (WLA) is a combination of data preprocessing and pre-training that generates real weights from normalized input values. Using WLA, the FFNN model increases classification accuracy and improves training time in a single epoch, without any training cycle, gradient of the mean square error function, or updating of the weights. The results of the comparison and evaluation show that WLA is a powerful technique in the area of FFNN classification.

  4. A Survey on Preprocessing Methods for Web Usage Data

    Directory of Open Access Journals (Sweden)

    V.Chitraa

    2010-03-01

    Full Text Available The World Wide Web is a huge repository of web pages and links. It provides an abundance of information for Internet users. The growth of the web is tremendous, as approximately one million pages are added daily. Users' accesses are recorded in web logs. Because of the tremendous usage of the web, log files are growing at a fast rate and their size is becoming huge. Web data mining is the application of data mining techniques to web data. Web usage mining applies mining techniques to log data in order to extract the behavior of users, which is used in various applications such as personalized services, adaptive web sites, customer profiling, prefetching, and the creation of attractive web sites. Web usage mining consists of three phases: preprocessing, pattern discovery, and pattern analysis. Web log data are usually noisy and ambiguous, so preprocessing is an important process before mining. For discovering patterns, sessions have to be constructed efficiently. This paper reviews existing work on the preprocessing stage. A brief overview of various data mining techniques for discovering patterns, and of pattern analysis, is given. Finally, a glimpse of various applications of web usage mining is also presented.

  5. A Stereo Music Preprocessing Scheme for Cochlear Implant Users.

    Science.gov (United States)

    Buyens, Wim; van Dijk, Bas; Wouters, Jan; Moonen, Marc

    2015-10-01

    Listening to music is still one of the more challenging aspects of using a cochlear implant (CI) for most users. Simple musical structures, a clear rhythm/beat, and lyrics that are easy to follow are among the top factors contributing to music appreciation for CI users. Modifying the audio mix of complex music potentially improves music enjoyment in CI users. A stereo music preprocessing scheme is described in which vocals, drums, and bass are emphasized based on the representation of the harmonic and the percussive components in the input spectrogram, combined with the spatial allocation of instruments in typical stereo recordings. The scheme is assessed with postlingually deafened CI subjects (N = 7) using pop/rock music excerpts with different complexity levels. The scheme is capable of modifying relative instrument level settings, with the aim of improving music appreciation in CI users, and allows individual preference adjustments. The assessment with CI subjects confirms the preference for more emphasis on vocals, drums, and bass as offered by the preprocessing scheme, especially for songs with higher complexity. The stereo music preprocessing scheme has the potential to improve music enjoyment in CI users by modifying the audio mix in widespread (stereo) music recordings. Since music enjoyment in CI users is generally poor, this scheme can assist the music listening experience of CI users as a training or rehabilitation tool.

  6. Simultaneous data pre-processing and SVM classification model selection based on a parallel genetic algorithm applied to spectroscopic data of olive oils.

    Science.gov (United States)

    Devos, Olivier; Downey, Gerard; Duponchel, Ludovic

    2014-04-01

    Classification is an important task in chemometrics. For several years now, support vector machines (SVMs) have proven to be powerful for infrared spectral data classification. However such methods require optimisation of parameters in order to control the risk of overfitting and the complexity of the boundary. Furthermore, it is established that the prediction ability of classification models can be improved using pre-processing in order to remove unwanted variance in the spectra. In this paper we propose a new methodology based on genetic algorithm (GA) for the simultaneous optimisation of SVM parameters and pre-processing (GENOPT-SVM). The method has been tested for the discrimination of the geographical origin of Italian olive oil (Ligurian and non-Ligurian) on the basis of near infrared (NIR) or mid infrared (FTIR) spectra. Different classification models (PLS-DA, SVM with mean centre data, GENOPT-SVM) have been tested and statistically compared using McNemar's statistical test. For the two datasets, SVM with optimised pre-processing give models with higher accuracy than the one obtained with PLS-DA on pre-processed data. In the case of the NIR dataset, most of this accuracy improvement (86.3% compared with 82.8% for PLS-DA) occurred using only a single pre-processing step. For the FTIR dataset, three optimised pre-processing steps are required to obtain SVM model with significant accuracy improvement (82.2%) compared to the one obtained with PLS-DA (78.6%). Furthermore, this study demonstrates that even SVM models have to be developed on the basis of well-corrected spectral data in order to obtain higher classification rates.
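
    The paper couples a genetic algorithm to the SVM model selection; as a much simpler stand-in, the same joint search space (pre-processing choice plus SVM hyperparameters) can be explored with a random search in scikit-learn, sketched below:

        import numpy as np
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import FunctionTransformer, StandardScaler
        from sklearn.svm import SVC

        PREPROC = {
            "none": FunctionTransformer(),                  # identity transform
            "mean_centre": StandardScaler(with_std=False),
            "autoscale": StandardScaler(),
        }

        def random_search(X, y, n_iter=50, seed=0):
            rng = np.random.default_rng(seed)
            best_score, best_config = -np.inf, None
            for _ in range(n_iter):
                name = list(PREPROC)[rng.integers(len(PREPROC))]
                C, gamma = 10.0 ** rng.uniform(-2, 3), 10.0 ** rng.uniform(-4, 1)
                model = make_pipeline(PREPROC[name], SVC(C=C, gamma=gamma))
                score = cross_val_score(model, X, y, cv=5).mean()
                if score > best_score:
                    best_score, best_config = score, (name, C, gamma)
            return best_score, best_config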

  7. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    Energy Technology Data Exchange (ETDEWEB)

    Palit, Mousumi [Department of Electronics and Telecommunication Engineering, Central Calcutta Polytechnic, Kolkata 700014 (India); Tudu, Bipan, E-mail: bt@iee.jusl.ac.in [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Bhattacharyya, Nabarun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Dutta, Ankur; Dutta, Pallab Kumar [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Jana, Arun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Bandyopadhyay, Rajib [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Chatterjee, Anutosh [Department of Electronics and Communication Engineering, Heritage Institute of Technology, Kolkata 700107 (India)

    2010-08-18

    In an electronic tongue, preprocessing of the raw data precedes pattern analysis, and the choice of an appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored, and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  8. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea.

    Science.gov (United States)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-08-18

    In an electronic tongue, preprocessing of the raw data precedes pattern analysis, and the choice of an appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored, and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  9. UNMIXING-BASED DENOISING AS A PRE-PROCESSING STEP FOR CORAL REEF ANALYSIS

    Directory of Open Access Journals (Sweden)

    D. Cerra

    2017-05-01

    Full Text Available Coral reefs, among the world's most biodiverse and productive submerged habitats, have faced several mass bleaching events due to climate change during the past 35 years. In the course of this century, global warming and ocean acidification are expected to cause corals to become increasingly rare on reef systems. This will result in a sharp decrease in the biodiversity of reef communities and carbonate reef structures. Coral reefs may be mapped, characterized and monitored through remote sensing. Hyperspectral images in particular excel at coral monitoring, as their very rich spectral information yields a strong discrimination power to characterize a target of interest and to separate healthy corals from bleached ones. Being submerged habitats, coral reef systems are difficult to analyse in airborne or satellite images, as the relevant information is conveyed in bands in the blue range, which exhibit a lower signal-to-noise ratio (SNR) with respect to other spectral ranges; furthermore, water absorbs most of the incident solar radiation, further decreasing the SNR. Derivative features, which are important in coral analysis, are greatly affected by the noise present in the relevant spectral bands, justifying the need for new denoising techniques able to preserve local spatial and spectral features. In this paper, Unmixing-based Denoising (UBD) is used to enable analysis of a hyperspectral image acquired over a coral reef system in the Red Sea based on derivative features. UBD reconstructs the dataset pixelwise with reduced noise effects, by forcing each spectrum to a linear combination of other reference spectra, exploiting the high dimensionality of hyperspectral datasets. Results show clear enhancements with respect to traditional denoising methods based on spatial and spectral smoothing, facilitating the coral detection task.

  10. Unmixing-Based Denoising as a Pre-Processing Step for Coral Reef Analysis

    Science.gov (United States)

    Cerra, D.; Traganos, D.; Gege, P.; Reinartz, P.

    2017-05-01

    Coral reefs, among the world's most biodiverse and productive submerged habitats, have faced several mass bleaching events due to climate change during the past 35 years. In the course of this century, global warming and ocean acidification are expected to cause corals to become increasingly rare on reef systems. This will result in a sharp decrease in the biodiversity of reef communities and carbonate reef structures. Coral reefs may be mapped, characterized and monitored through remote sensing. Hyperspectral images in particular excel at coral monitoring, as their very rich spectral information yields a strong discrimination power to characterize a target of interest and to separate healthy corals from bleached ones. Being submerged habitats, coral reef systems are difficult to analyse in airborne or satellite images, as the relevant information is conveyed in bands in the blue range, which exhibit a lower signal-to-noise ratio (SNR) with respect to other spectral ranges; furthermore, water absorbs most of the incident solar radiation, further decreasing the SNR. Derivative features, which are important in coral analysis, are greatly affected by the noise present in the relevant spectral bands, justifying the need for new denoising techniques able to preserve local spatial and spectral features. In this paper, Unmixing-based Denoising (UBD) is used to enable analysis of a hyperspectral image acquired over a coral reef system in the Red Sea based on derivative features. UBD reconstructs the dataset pixelwise with reduced noise effects, by forcing each spectrum to a linear combination of other reference spectra, exploiting the high dimensionality of hyperspectral datasets. Results show clear enhancements with respect to traditional denoising methods based on spatial and spectral smoothing, facilitating the coral detection task.
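
    At its core, the reconstruction step forces each pixel spectrum onto a non-negative combination of reference spectra; a schematic sketch follows (the actual UBD method also selects the reference spectra from the image itself, which is omitted here):

        import numpy as np
        from scipy.optimize import nnls

        def ubd_reconstruct(pixels, references):
            # pixels: (n_pixels, n_bands); references: (n_refs, n_bands)
            denoised = np.empty(pixels.shape, dtype=float)
            for i, spectrum in enumerate(pixels):
                abundances, _ = nnls(references.T, spectrum)   # non-negative least squares fit
                denoised[i] = references.T @ abundances        # noise-reduced reconstruction
            return denoised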

  11. A data preprocessing strategy for metabolomics to reduce the mask effect in data analysis

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2015-02-01

    Full Text Available Metabolomics is a booming research field. Its success relies heavily on the discovery of differential metabolites by comparing different data sets (for example, patients vs. controls). One of the challenges is that differences in the low-abundance metabolites between groups are often masked by the high variation of the abundant metabolites. In order to address this challenge, a novel data preprocessing strategy consisting of three steps was proposed in this study. In step 1, a 'modified 80%' rule was used to reduce the effect of missing values; in step 2, unit-variance and Pareto scaling methods were used to reduce the masking effect from the abundant metabolites. In step 3, in order to correct the adverse effect of scaling, stability information of the variables, deduced from the intensity information and the class information, was used to assign suitable weights to the variables. When applied to an LC/MS-based metabolomics dataset from a chronic hepatitis B patient study and two simulated datasets, the masking effect was found to be partially eliminated and several new low-abundance differential metabolites were rescued.
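
    For illustration, the first two steps can be sketched directly in numpy; the exact "modified 80%" rule and the final stability weighting are described only qualitatively above, so this is one plausible reading rather than the authors' implementation:

        import numpy as np

        def modified_80_rule(X, groups, threshold=0.8):
            # keep a variable if it is detected (non-missing) in at least 80% of the
            # samples of at least one group; X: (n_samples, n_variables),
            # groups: array of group labels per sample
            keep = np.zeros(X.shape[1], dtype=bool)
            for g in np.unique(groups):
                detected = ~np.isnan(X[groups == g])
                keep |= detected.mean(axis=0) >= threshold
            return X[:, keep]

        def pareto_scale(X):
            # mean-centre, then divide by the square root of the standard deviation
            return (X - np.nanmean(X, axis=0)) / np.sqrt(np.nanstd(X, axis=0))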

  12. Preprocessing Techniques for High-Efficiency Data Compression in Wireless Multimedia Sensor Networks

    Directory of Open Access Journals (Sweden)

    Junho Park

    2015-01-01

    Full Text Available We have proposed preprocessing techniques for high-efficiency data compression in wireless multimedia sensor networks. To do this, we analyzed the characteristics of multimedia data in the environment of wireless multimedia sensor networks. The proposed techniques consider the characteristics of the sensed multimedia data to perform a first-stage preprocessing that deletes the low-priority bits which do not affect image quality. A second-stage preprocessing is then performed on the remaining high-priority bits. By performing these two-stage preprocessing techniques, it is possible to greatly reduce the multimedia data size. To show the superiority of our techniques, we simulated an existing multimedia data compression scheme with and without our preprocessing techniques. Our experimental results show that the proposed techniques increase the compression ratio while reducing compression operations compared to the existing compression scheme without preprocessing.
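
    The first-stage preprocessing (deleting low-priority bits) is essentially bit-plane truncation; a minimal sketch for 8-bit samples, with the number of dropped bits as an assumed parameter:

        import numpy as np

        def drop_low_priority_bits(samples, n_bits=2):
            # zero the n_bits least-significant bits; they barely affect perceived
            # quality but make the data much more compressible
            mask = np.uint8((0xFF << n_bits) & 0xFF)
            return np.asarray(samples, dtype=np.uint8) & mask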

  13. SAVED LEAVE BONUS

    CERN Multimedia

    Division des ressources humaines

    2000-01-01

    Staff members participating in the RSL programme are entitled to one additional day of saved leave for each full period of 20 days remaining in their saved leave account on 31 December 1999. Allowing some time for all concerned to make sure that their periods of leave taken in 1999 are properly registered, HR Division will proceed with the crediting of the appropriate number of days in the saved leave accounts from 25 January 2000. Human Resources Division, Tel. 73359

  14. Saving for Future

    Institute of Scientific and Technical Information of China (English)

    蔡文莎

    2009-01-01

    Read the short passage below and answer the questions according to the requirements that follow it. (Note the word-count requirements after the questions.) Learning to save money when you're young is an important lesson. All good lessons and habits begin early, and saving is a skill that everyone needs. Many people--adults included--do not have a good sense of saving for the long run.

  15. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  16. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  17. PathoQC: Computationally Efficient Read Preprocessing and Quality Control for High-Throughput Sequencing Data Sets.

    Science.gov (United States)

    Hong, Changjin; Manimaran, Solaiappan; Johnson, William Evan

    2014-01-01

    Quality control and read preprocessing are critical steps in the analysis of data sets generated from high-throughput genomic screens. In the most extreme cases, improper preprocessing can negatively affect downstream analyses and may lead to incorrect biological conclusions. Here, we present PathoQC, a streamlined toolkit that seamlessly combines the benefits of several popular quality control software approaches for preprocessing next-generation sequencing data. PathoQC provides a variety of quality control options appropriate for most high-throughput sequencing applications. PathoQC is primarily developed as a module in the PathoScope software suite for metagenomic analysis. However, PathoQC is also available as an open-source Python module that can run as a stand-alone application or can be easily integrated into any bioinformatics workflow. PathoQC achieves high performance by supporting parallel computation and is an effective tool that removes technical sequencing artifacts and facilitates robust downstream analysis. The PathoQC software package is available at http://sourceforge.net/projects/PathoScope/.

  18. The Illusory Effects of Saving Incentives on Saving

    OpenAIRE

    Engen, Eric M.; William G. Gale; John Karl Scholz

    1996-01-01

    The authors evaluate research on how tax-based saving incentives (IRAs and 401(k)s) affect saving. Previous research overstates the impact of the incentives on saving by failing to account for several issues: households with saving incentives have stronger tastes for saving than others; saving incentives have interacted with debt, nonfinancial assets, financial markets, and pensions; and saving incentives represent pretax balances, whereas taxable accounts represent posttax balances. Accounti...

  19. Energy Savings Lifetimes and Persistence

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Ian M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schiller, Steven R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, Annika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Billingsley, Megan A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Goldman, Charles A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schwartz, Lisa C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-02-01

    This technical brief explains the concepts of energy savings lifetimes and savings persistence and discusses how program administrators use these factors to calculate savings for efficiency measures, programs and portfolios. Savings lifetime is the length of time that one or more energy efficiency measures or activities save energy, and savings persistence is the change in savings throughout the functional life of a given efficiency measure or activity. Savings lifetimes are essential for assessing the lifecycle benefits and cost effectiveness of efficiency activities and for forecasting loads in resource planning. The brief also provides estimates of savings lifetimes derived from a national collection of costs and savings for electric efficiency programs and portfolios.

  20. Preprocessing and parameterizing bioimpedance spectroscopy measurements by singular value decomposition.

    Science.gov (United States)

    Nejadgholi, Isar; Caytak, Herschel; Bolic, Miodrag; Batkin, Izmail; Shirmohammadi, Shervin

    2015-05-01

    In several applications of bioimpedance spectroscopy, the measured spectrum is parameterized by being fitted into the Cole equation. However, the extracted Cole parameters seem to be inconsistent from one measurement session to another, which leads to a high standard deviation of extracted parameters. This inconsistency is modeled with a source of random variations added to the voltage measurement carried out in the time domain. These random variations may originate from biological variations that are irrelevant to the evidence that we are investigating. Yet, they affect the voltage measured by using a bioimpedance device, based on which magnitude and phase of impedance are calculated. By means of simulated data, we showed that Cole parameters are highly affected by this type of variation. We further showed that singular value decomposition (SVD) is an effective tool for parameterizing bioimpedance measurements, which results in more consistent parameters than Cole parameters. We propose to apply SVD as a preprocessing method to reconstruct denoised bioimpedance measurements. In order to evaluate the method, we calculated the relative difference between parameters extracted from noisy and clean simulated bioimpedance spectra. Both mean and standard deviation of this relative difference are shown to effectively decrease when Cole parameters are extracted from preprocessed data in comparison to being extracted from raw measurements. We evaluated the performance of the proposed method in distinguishing three arm positions, for a set of experiments including eight subjects. It is shown that Cole parameters of different positions are not distinguishable when extracted from raw measurements. However, one arm position can be distinguished based on SVD scores. Moreover, all three positions are shown to be distinguished by two parameters, R0/R∞ and Fc, when Cole parameters are extracted from preprocessed measurements. These results suggest that SVD could be considered as an
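
    A minimal sketch of the SVD-based reconstruction, assuming the measurements of several sessions are stacked into a matrix; the number of retained components is an illustrative choice, not a value from the paper:

        import numpy as np

        def svd_denoise(measurements, n_components=2):
            # measurements: (n_sessions, n_frequencies) magnitude or phase spectra
            U, s, Vt = np.linalg.svd(measurements, full_matrices=False)
            s[n_components:] = 0                 # discard low-variance components
            return (U * s) @ Vt                  # reconstructed, denoised measurements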

  1. Optimizing preprocessing and analysis pipelines for single-subject fMRI. I. Standard temporal motion and physiological noise correction methods.

    Science.gov (United States)

    Churchill, Nathan W; Oder, Anita; Abdi, Hervé; Tam, Fred; Lee, Wayne; Thomas, Christopher; Ween, Jon E; Graham, Simon J; Strother, Stephen C

    2012-03-01

    Subject-specific artifacts caused by head motion and physiological noise are major confounds in BOLD fMRI analyses. However, there is little consensus on the optimal choice of data preprocessing steps to minimize these effects. To evaluate the effects of various preprocessing strategies, we present a framework which comprises a combination of (1) nonparametric testing including reproducibility and prediction metrics of the data-driven NPAIRS framework (Strother et al. [2002]: NeuroImage 15:747-771), and (2) intersubject comparison of SPM effects, using DISTATIS (a three-way version of metric multidimensional scaling; Abdi et al. [2009]: NeuroImage 45:89-95). It is shown that the quality of brain activation maps may be significantly limited by sub-optimal choices of data preprocessing steps (or "pipeline") in a clinical task-design, an fMRI adaptation of the widely used Trail-Making Test. The relative importance of motion correction, physiological noise correction, motion parameter regression, and temporal detrending was examined for fMRI data acquired in young, healthy adults. Analysis performance and the quality of activation maps were evaluated based on Penalized Discriminant Analysis (PDA). The relative importance of different preprocessing steps was assessed by (1) a nonparametric Friedman rank test for fixed sets of preprocessing steps, applied to all subjects; and (2) evaluating pipelines chosen specifically for each subject. Results demonstrate that preprocessing choices have significant, but subject-dependent effects, and that individually-optimized pipelines may significantly improve the reproducibility of fMRI results over fixed pipelines. This was demonstrated by the detection of a significant interaction with motion parameter regression and physiological noise correction, even though the range of subject head motion was small across the group (≪ 1 voxel). Optimizing pipelines on an individual-subject basis also revealed brain activation patterns

  2. Contour extraction of echocardiographic images based on pre-processing

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, Zinah Rajab; Rahmat, Rahmita Wirza; Abdullah, Lili Nurliyana [Department of Multimedia, Faculty of Computer Science and Information Technology, Department of Computer and Communication Systems Engineering, Faculty of Engineering University Putra Malaysia 43400 Serdang, Selangor (Malaysia); Zamrin, D M [Department of Surgery, Faculty of Medicine, National University of Malaysia, 56000 Cheras, Kuala Lumpur (Malaysia); Saripan, M Iqbal

    2011-02-15

    In this work we present a technique to extract heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, in order to reduce heavy noise and obtain better image quality. To do so, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images. After applying these techniques, heart boundaries and valve movement can be detected reliably with traditional edge detection methods.
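
    A minimal sketch of this kind of pipeline (median filtering, morphological opening/closing, contrast adjustment, then a traditional edge detector), written here with OpenCV; the kernel sizes and Canny thresholds are assumptions, not the authors' settings.

        import cv2

        def preprocess_echo(image_gray):
            """Denoise, smooth and contrast-adjust an echocardiographic frame
            before running a standard edge detector."""
            denoised = cv2.medianBlur(image_gray, 5)                     # reduce speckle-like noise
            kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
            opened = cv2.morphologyEx(denoised, cv2.MORPH_OPEN, kernel)  # remove small bright specks
            closed = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, kernel)   # bridge small gaps in walls
            equalized = cv2.equalizeHist(closed)                         # enhance low contrast
            return cv2.Canny(equalized, 50, 150)                         # traditional edge detection

        # edges = preprocess_echo(cv2.imread("echo_frame.png", cv2.IMREAD_GRAYSCALE))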

  3. Effects of preprocessing Landsat MSS data on derived features

    Science.gov (United States)

    Parris, T. M.; Cicone, R. C.

    1983-01-01

    Important to the use of multitemporal Landsat MSS data for earth resources monitoring, such as agricultural inventories, is the ability to minimize the effects of varying atmospheric and satellite viewing conditions while extracting physically meaningful features from the data. In general, approaches to the preprocessing problem have been derived from either physical or statistical models. This paper compares three proposed algorithms: XSTAR haze correction, Color Normalization, and Multiple Acquisition Mean Level Adjustment. These techniques represent physical, statistical, and hybrid physical-statistical models, respectively. The comparisons are made in the context of three feature extraction techniques: the Tasseled Cap, the Cate Color Cube, and the Normalized Difference.
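
    For the Normalized Difference feature mentioned above, the usual band-ratio computation can be sketched with NumPy as follows; the choice of bands (near-infrared and red, as for NDVI) and the small epsilon are assumptions.

        import numpy as np

        def normalized_difference(nir_band, red_band, eps=1e-6):
            """Normalized difference feature (e.g., NDVI) from two co-registered bands."""
            nir = nir_band.astype(np.float64)
            red = red_band.astype(np.float64)
            return (nir - red) / (nir + red + eps)   # eps avoids division by zero over dark pixels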

  4. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  5. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  6. Polish households’ saving strategies

    Directory of Open Access Journals (Sweden)

    Paulina Anioła

    2013-03-01

    Full Text Available The article presents the results of a classification of households' saving strategy types. A cluster analysis method, based on households' saving portfolios, was used for the classification. Six types of strategies were distinguished: low risk, conservative, very passive, very conservative, diversification and aggressive.

  7. Potential energy savings

    DEFF Research Database (Denmark)

    Schultz, Jørgen Munthe

    1996-01-01

    This chapter describes the chosen methods for estimating the potential energy savings if ordinary window glazing is exchanged with aerogel glazing as well as commercial low-energy glazings.

  8. Energy saving report; Energispareredegoerelse

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-05-01

    Making energy consumption in Denmark more efficient by means of energy savings is connected to the Danish government's overall National Growth Strategy. A reduced consumption of energy as a result of economically attractive energy saving initiatives leads to reduced energy costs and consequently an improvement of Denmark's competitive power. Furthermore, reducing the energy consumption decreases the vulnerability towards rising energy prices, and the security of supplies is increased. Finally, energy conservation serves as a means of reducing environmental impact from the energy sector. Energy consumption is not damaging to the environment in itself; however, to the extent that energy consumption causes negative impact on the environment, e.g. through discharge of environmentally damaging substances generated during energy production, a reduction of the energy consumption will benefit the environment. Energy saving in itself does not lead to a decrease in CO{sub 2} emission in Denmark unless it is accompanied by an adjustment of CO{sub 2} quotas on production of electricity. The Danish Government emphasizes that the energy saving efforts are cost-effective both for society and for the consumers. The energy saving report contains an updated projection of the Danish energy consumption and an evaluation of the impacts of the energy saving efforts. The impacts of more effective end use of energy are described, and an account is given of the barriers preventing the realization of a number of potentially economically attractive energy saving initiatives. Finally, the energy saving report presents a number of proposals for new energy saving initiatives. (BA)

  9. Data pre-processing for quantification in tomography and radiography with a digital flat panel detector

    Science.gov (United States)

    Rinkel, Jean; Gerfault, Laurent; Estève, François; Dinten, Jean-Marc

    2006-03-01

    In order to obtain accurate quantitative results, flat panel detectors require specific calibration and correction of acquisitions. The main artefacts are due to bad pixels, variations of photodiode characteristics and inhomogeneity of the X-ray sensitivity of the scintillator layer. Other limitations for quantification are the non-linearity of the detector, due to charge trapping in the transistors, and the scattering generated inside the detector, called detector scattering. Based on physical models of artefact generation, this paper presents a unified framework for the calibration and correction of these artefacts. The following specific algorithms have been developed to correct them. A new method for correction of deviation from linearity is based on the comparison between experimental and simulated data. A method of detector scattering correction is performed in two steps: off-line characterization of detector scattering by considering its spatial distribution through a convolution model, and on-line correction based on a deconvolution approach. Radiographic results on an anthropomorphic thorax phantom imaged with a flat panel detector, which converts X-rays into visible light using a scintillator coupled to an amorphous silicon transistor frame for photon-to-electron conversion, demonstrate that experimental X-ray attenuation images are significantly improved qualitatively and quantitatively by applying the non-linearity correction and the detector scattering correction. Results obtained on tomographic reconstructions from pre-processed acquisitions of the phantom are in very good agreement with the expected attenuation coefficient values obtained with a multi-slice CT scanner. Thus, this paper demonstrates the efficiency of the proposed pre-processing steps for accurate quantification in radiography and tomography.
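
    As a toy illustration of the two-step idea above (off-line characterization of detector scattering as a convolution kernel, on-line correction by deconvolution), the sketch below applies a regularized Wiener-style FFT deconvolution; the kernel handling and regularization constant are assumptions, not the calibration procedure of the paper.

        import numpy as np

        def deconvolve_scatter(image, scatter_kernel, reg=1e-3):
            """Remove detector scattering modelled as convolution with a known kernel,
            using regularized (Wiener-style) deconvolution in the Fourier domain."""
            kernel = np.zeros_like(image, dtype=np.float64)
            kh, kw = scatter_kernel.shape
            kernel[:kh, :kw] = scatter_kernel
            kernel = np.roll(kernel, (-(kh // 2), -(kw // 2)), axis=(0, 1))  # center the kernel at (0, 0)
            H = np.fft.fft2(kernel)
            G = np.fft.fft2(image.astype(np.float64))
            F = G * np.conj(H) / (np.abs(H) ** 2 + reg)                      # Wiener-regularized inverse filter
            return np.real(np.fft.ifft2(F))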

  10. Simple and Effective Way for Data Preprocessing Selection Based on Design of Experiments.

    Science.gov (United States)

    Gerretzen, Jan; Szymańska, Ewa; Jansen, Jeroen J; Bart, Jacob; van Manen, Henk-Jan; van den Heuvel, Edwin R; Buydens, Lutgarde M C

    2015-12-15

    The selection of optimal preprocessing is among the main bottlenecks in chemometric data analysis. Preprocessing currently is a burden, since a multitude of different preprocessing methods is available for, e.g., baseline correction, smoothing, and alignment, but it is not clear beforehand which method(s) should be used for which data set. The process of preprocessing selection is often limited to trial-and-error and is therefore considered somewhat subjective. In this paper, we present a novel, simple, and effective approach for preprocessing selection. The defining feature of this approach is a design of experiments. On the basis of the design, model performance of a few well-chosen preprocessing methods, and combinations thereof (called strategies) is evaluated. Interpretation of the main effects and interactions subsequently enables the selection of an optimal preprocessing strategy. The presented approach is applied to eight different spectroscopic data sets, covering both calibration and classification challenges. We show that the approach is able to select a preprocessing strategy which improves model performance by at least 50% compared to the raw data; in most cases, it leads to a strategy very close to the true optimum. Our approach makes preprocessing selection fast, insightful, and objective.
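
    The design-of-experiments idea can be mimicked on a small scale by scoring a factorial set of candidate preprocessing combinations with cross-validated PLS performance (the paper additionally interprets main effects and interactions rather than brute-forcing the search); the candidate steps, window length, component count and data shapes below are illustrative assumptions.

        import itertools
        import numpy as np
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_score

        def candidate_strategies():
            """Tiny 2x2 factorial design: baseline correction on/off x smoothing on/off."""
            baseline = [lambda X: X, lambda X: X - X.min(axis=1, keepdims=True)]
            smoothing = [lambda X: X, lambda X: savgol_filter(X, 11, 2, axis=1)]
            return list(itertools.product(baseline, smoothing))

        def select_preprocessing(X, y, n_components=5):
            """Return the (score, strategy) pair with the best cross-validated PLS score."""
            best = None
            for base, smooth in candidate_strategies():
                Xp = smooth(base(X))
                score = cross_val_score(PLSRegression(n_components=n_components),
                                        Xp, y, cv=5, scoring="neg_mean_squared_error").mean()
                if best is None or score > best[0]:
                    best = (score, (base, smooth))
            return best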

  11. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  12. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  13. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  14. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  15. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  16. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  17. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  18. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  19. Realized Cost Savings 2016

    Data.gov (United States)

    Department of Veterans Affairs — This dataset is provided as a requirement of OMB’s Integrated Data Collection (IDC) and links to VA’s Realized Cost Savings and Avoidances data in JSON format. Cost...

  20. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  1. The Pre-Processing of Images Technique for the Materia

    Directory of Open Access Journals (Sweden)

    Yevgeniy P. Putyatin

    2016-08-01

    Full Text Available Image processing analysis is one of the most powerful tools in various research fields, especially in material/polymer science. Therefore, in the present article an attempt has been made to study a pre-processing technique for images of material samples taken with a Scanning Electron Microscope (SEM). First we prepared material samples with coir fibre (natural) and its polymer composite; after that the image analysis was performed by the SEM technique, and then the said pre-processing studies were conducted. The results presented here were found to be satisfactory and are also in good agreement with our earlier work and that of other workers in the same field.

  2. A Gender Recognition Approach with an Embedded Preprocessing

    Directory of Open Access Journals (Sweden)

    Md. Mostafijur Rahman

    2015-05-01

    Full Text Available Gender recognition from facial images has become an important problem in the present world. It is one of the main problems of computer vision and research on it is ongoing. Though several techniques have been proposed, most of them focus on facial images taken in controlled situations. Problems arise when the classification is performed in uncontrolled conditions with, for example, a high rate of noise or a lack of illumination. To overcome these problems, we propose a new gender recognition framework which first preprocesses and enhances the input images using Adaptive Gamma Correction with Weighting Distribution. We used the Labeled Faces in the Wild (LFW) database for our experiments, which contains real-life images taken in uncontrolled conditions. For measuring the performance of our proposed method, we have used the confusion matrix, precision, recall, F-measure, True Positive Rate (TPR), and False Positive Rate (FPR). In every case, our proposed framework performs better than other existing state-of-the-art techniques.

  3. Constant-overhead secure computation of Boolean circuits using preprocessing

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Zakarias, S.

    2013-01-01

    We present a protocol for securely computing a Boolean circuit C in the presence of a dishonest and malicious majority. The protocol is unconditionally secure, assuming a preprocessing functionality that is not given the inputs. For a large number of players the work for each player is the same as computing the circuit in the clear, up to a constant factor. Our protocol is the first to obtain these properties for Boolean circuits. On the technical side, we develop new homomorphic authentication schemes based on asymptotically good codes with an additional multiplication property. We also show a new algorithm for verifying the product of Boolean matrices in quadratic time with exponentially small error probability, where previous methods only achieved constant error.
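
    The quadratic-time verification of a Boolean matrix product mentioned above can be illustrated with a classic Freivalds-style randomized check over GF(2) (shown here only as a generic sketch, not the authors' new algorithm); repeating the test k times drives the error probability below 2^-k.

        import numpy as np

        def verify_boolean_product(A, B, C, trials=40, seed=None):
            """Randomized check that A @ B == C over GF(2), in O(trials * n^2) time,
            for 0/1 integer matrices. A false claim survives one trial with probability <= 1/2."""
            rng = np.random.default_rng(seed)
            m = C.shape[1]
            for _ in range(trials):
                r = rng.integers(0, 2, size=m)                 # random 0/1 vector
                left = (A @ ((B @ r) % 2)) % 2                 # A(Br) mod 2, two matrix-vector products
                right = (C @ r) % 2                            # Cr mod 2
                if not np.array_equal(left, right):
                    return False                               # definitely not equal
            return True                                        # equal with high probability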

  4. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user / developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data are not relevant and would only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases the computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  5. Constant-Overhead Secure Computation of Boolean Circuits using Preprocessing

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Zakarias, Sarah Nouhad Haddad

    We present a protocol for securely computing a Boolean circuit $C$ in the presence of a dishonest and malicious majority. The protocol is unconditionally secure, assuming access to a preprocessing functionality that is not given the inputs to compute on. For a large number of players the work done by each player is the same as the work needed to compute the circuit in the clear, up to a constant factor. Our protocol is the first to obtain these properties for Boolean circuits. On the technical side, we develop new homomorphic authentication schemes based on asymptotically good codes with an additional multiplication property. We also show a new algorithm for verifying the product of Boolean matrices in quadratic time with exponentially small error probability, where previous methods would only give a constant error.

  6. Digital soil mapping: strategy for data pre-processing

    Directory of Open Access Journals (Sweden)

    Alexandre ten Caten

    2012-08-01

    Full Text Available The region of greatest variability on soil maps is along the edges of their polygons, causing disagreement among pedologists about the appropriate description of soil classes at these locations. The objective of this work was to propose a strategy for data pre-processing applied to digital soil mapping (DSM). Soil polygons on a training map were shrunk by 100 and 160 m. This strategy prevented the use of covariates located near the edge of the soil classes for the Decision Tree (DT) models. Three DT models, derived from eight predictive covariates related to relief and organism factors sampled on the original polygons of a soil map and on polygons shrunk by 100 and 160 m, were used to predict soil classes. The DT model derived from observations 160 m away from the edge of the polygons on the original map is less complex and has a better predictive performance.
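
    The polygon-shrinking step can be sketched with a negative buffer, as is commonly done with shapely; the 100 m distance mirrors the paper's setup, while the assumption of a metric coordinate system and the toy polygon are illustrative only.

        from shapely.geometry import Polygon

        def shrink_polygon(polygon, distance_m=100.0):
            """Erode a soil-map polygon inward so that training samples near its edge
            are excluded; returns an empty geometry if the polygon collapses."""
            return polygon.buffer(-distance_m)

        # Hypothetical square polygon with 500 m sides (coordinates in metres)
        square = Polygon([(0, 0), (500, 0), (500, 500), (0, 500)])
        core = shrink_polygon(square, 100.0)   # 300 m x 300 m interior region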

  7. Real-Time Rendering of Teeth with No Preprocessing

    DEFF Research Database (Denmark)

    Larsen, Christian Thode; Frisvad, Jeppe Revall; Jensen, Peter Dahl Ejby

    2012-01-01

    We present a technique for real-time rendering of teeth with no need for computational or artistic preprocessing. Teeth constitute a translucent material consisting of several layers; a highly scattering material (dentine) beneath a semitransparent layer (enamel) with a transparent coating (saliva). In this study we examine how light interacts with this multilayered structure. In the past, rendering of teeth has mostly been done using image-based texturing or volumetric scans. We work with surface scans and have therefore developed a simple way of estimating layer thicknesses. We use scattering properties based on measurements reported in the optics literature, and we compare rendered results qualitatively to images of ceramic teeth created by denturists.

  8. Sparse and Unique Nonnegative Matrix Factorization Through Data Preprocessing

    CERN Document Server

    Gillis, Nicolas

    2012-01-01

    Nonnegative matrix factorization (NMF) has become a very popular technique in machine learning because it automatically extracts meaningful features through a sparse and part-based representation. However, NMF has the drawback of being highly ill-posed, that is, there typically exist many different but equivalent factorizations. In this paper, we introduce a completely new way of obtaining more well-posed NMF problems whose solutions are sparser. Our technique is based on the preprocessing of the nonnegative input data matrix, and relies on the theory of M-matrices and the geometric interpretation of NMF. This approach provably leads to optimal and sparse solutions under the separability assumption of Donoho and Stodden (NIPS, 2003), and, for rank-three matrices, makes the number of exact factorizations finite. We illustrate the effectiveness of our technique on several image datasets.

  9. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, focus is put on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels, as well as methods for data preprocessing are covered.

  10. Preprocessing in a Tiered Sensor Network for Habitat Monitoring

    Directory of Open Access Journals (Sweden)

    Hanbiao Wang

    2003-03-01

    Full Text Available We investigate task decomposition and collaboration in a two-tiered sensor network for habitat monitoring. The system recognizes and localizes a specified type of birdcall. The system has a few powerful macronodes in the first tier, and many less powerful micronodes in the second tier. Each macronode combines data collected by multiple micronodes for target classification and localization. We describe two types of lightweight preprocessing which significantly reduce data transmission from micronodes to macronodes. Micronodes classify events according to their cross-zero rates and discard irrelevant events. Data about events of interest are reduced and compressed before being transmitted to macronodes for target localization. Preliminary experiments illustrate the effectiveness of event filtering and data reduction at micronodes.
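
    A micronode-side filter of the kind described, classifying events by their cross-zero (zero-crossing) rate and discarding irrelevant ones, could look like the sketch below; the acceptance band and function names are assumptions.

        import numpy as np

        def zero_crossing_rate(segment):
            """Fraction of consecutive samples whose signs differ (after removing the DC offset)."""
            signs = np.sign(segment - np.mean(segment))
            return np.count_nonzero(np.diff(signs)) / (len(segment) - 1)

        def is_candidate_birdcall(segment, low=0.05, high=0.35):
            """Keep only segments whose zero-crossing rate falls in the expected band;
            everything else is discarded at the micronode and never transmitted."""
            zcr = zero_crossing_rate(segment)
            return low <= zcr <= high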

  11. Data acquisition and preprocessing techniques for remote sensing field research

    Science.gov (United States)

    Biehl, L. L.; Robinson, B. F.

    1983-01-01

    A crops and soils data base has been developed at Purdue University's Laboratory for Applications of Remote Sensing using spectral and agronomic measurements made by several government and university researchers. The data are being used to (1) quantitatively determine the relationships of spectral and agronomic characteristics of crops and soils, (2) define future sensor systems, and (3) develop advanced data analysis techniques. Researchers follow defined data acquisition and preprocessing techniques to provide fully annotated and calibrated sets of spectral, agronomic, and meteorological data. These procedures enable the researcher to combine his data with that acquired by other researchers for remote sensing research. The key elements or requirements for developing a field research data base of spectral data that can be transported across sites and years are appropriate experiment design, accurate spectral data calibration, defined field procedures, and thorough experiment documentation.

  12. Radar image preprocessing. [of SEASAT-A SAR data

    Science.gov (United States)

    Frost, V. S.; Stiles, J. A.; Holtzman, J. C.; Held, D. N.

    1980-01-01

    Standard image processing techniques are not applicable to radar images because of the coherent nature of the sensor. Therefore there is a need to develop preprocessing techniques for radar images which will then allow these standard methods to be applied. A random field model for radar image data is developed. This model describes the image data as the result of a multiplicative-convolved process. Standard techniques, those based on additive noise and homomorphic processing, are not directly applicable to this class of sensor data. Therefore, a minimum mean square error (MMSE) filter was designed to treat this class of sensor data. The resulting filter was implemented in an adaptive format to account for changes in local statistics and edges. A radar image processing technique which provides the MMSE estimate inside homogeneous areas and tends to preserve edge structure was the result of this study. Digitally correlated Seasat-A synthetic aperture radar (SAR) imagery was used to test the technique.
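
    As a hedged sketch of an adaptive, local-statistics MMSE speckle filter in the same spirit (essentially a Lee-type filter, not the exact filter derived in the paper), assuming a multiplicative noise model with a known coefficient of variation:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def adaptive_mmse_filter(image, window=7, noise_cv=0.25):
            """Lee-type local MMSE filter for multiplicative (speckle) noise.
            noise_cv is the assumed coefficient of variation of the speckle."""
            img = image.astype(np.float64)
            local_mean = uniform_filter(img, size=window)
            local_sq_mean = uniform_filter(img * img, size=window)
            local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)
            noise_var = (noise_cv * local_mean) ** 2
            gain = np.maximum(local_var - noise_var, 0.0) / np.maximum(local_var, 1e-12)
            return local_mean + gain * (img - local_mean)   # MMSE estimate; ~mean in homogeneous areas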

  13. Preprocessing Solar Images while Preserving their Latent Structure

    CERN Document Server

    Stein, Nathan M; Kashyap, Vinay L

    2015-01-01

    Telescopes such as the Atmospheric Imaging Assembly aboard the Solar Dynamics Observatory, a NASA satellite, collect massive streams of high resolution images of the Sun through multiple wavelength filters. Reconstructing pixel-by-pixel thermal properties based on these images can be framed as an ill-posed inverse problem with Poisson noise, but this reconstruction is computationally expensive and there is disagreement among researchers about what regularization or prior assumptions are most appropriate. This article presents an image segmentation framework for preprocessing such images in order to reduce the data volume while preserving as much thermal information as possible for later downstream analyses. The resulting segmented images reflect thermal properties but do not depend on solving the ill-posed inverse problem. This allows users to avoid the Poisson inverse problem altogether or to tackle it on each of $\sim$10 segments rather than on each of $\sim$10$^7$ pixels, reducing computing time by a facto...

  14. Prediction of speech intelligibility based on an auditory preprocessing model

    DEFF Research Database (Denmark)

    Christiansen, Claus Forup Corlin; Pedersen, Michael Syskind; Dau, Torsten

    2010-01-01

    Classical speech intelligibility models, such as the speech transmission index (STI) and the speech intelligibility index (SII), are based on calculations on the physical acoustic signals. The present study predicts speech intelligibility by combining a psychoacoustically validated model of auditory preprocessing [Dau et al., 1997. J. Acoust. Soc. Am. 102, 2892-2905] with a simple central stage that describes the similarity of the test signal with the corresponding reference signal at a level of the internal representation of the signals. The model was compared with previous approaches, whereby a speech in noise experiment was used for training and an ideal binary mask experiment was used for evaluation. All three models were able to capture the trends in the speech in noise training data well, but the proposed model provides a better prediction of the binary mask test data, particularly when the binary

  15. The Technology Steps of Saving Electricity of the lighting System in Intelligent Buildings%智能大厦照明系统的节能技术措施

    Institute of Scientific and Technical Information of China (English)

    戴瑜兴; 汪鲁才

    2000-01-01

    Based on the requirements of the lighting system in the intelligent building, this paper discusses technical measures for saving electricity in the lighting system, considering the selection of light sources, luminaires and lighting control equipment, illuminance standards, the choice of lighting mode, the arrangement of lighting circuits, and the lighting control system.

  16. A new approach to pre-processing digital image for wavelet-based watermark

    Science.gov (United States)

    Agreste, Santa; Andaloro, Guido

    2008-11-01

    The growth of the Internet has increased the phenomenon of digital piracy of multimedia objects such as software, images, video, audio and text. Therefore it is strategic to identify and develop stable, low-computational-cost methods and numerical algorithms that provide a solution to these problems. We describe a digital watermarking algorithm for color image protection and authenticity: robust, not blind, and wavelet-based. The use of the Discrete Wavelet Transform is motivated by its good time-frequency features and a good match with Human Visual System directives. These two combined elements are important for building an invisible and robust watermark. Moreover, our algorithm can work with any image, thanks to an image pre-processing step that includes resizing techniques adapting the original image size to the wavelet transform. The watermark signal is calculated in correlation with the image features and statistical properties. In the detection step we apply a re-synchronization between the original and watermarked image according to the Neyman-Pearson statistical criterion. Experimentation on a large set of different images has shown the method to be resistant against geometric, filtering, and StirMark attacks with a low rate of false alarms.

  17. Saving, dependency and development.

    Science.gov (United States)

    Kelley, A C; Schmidt, R M

    1996-01-01

    This study examines the impact of dependency on savings in 65 less developed countries (LDCs) and 23 developed countries over time and cross-sectionally since 1960. The study tests a modified Leff model and the Mason life-cycle framework. Empirical estimates address potential simultaneity between savings and output growth. The price indices of Summers and Heston are used so that each country's national accounts are converted from nominal into purchasing-power variables. This eliminates the problems of using exchange rates, which vary systematically by level of development, rather than a "true" index of purchasing power. Savings (S/Y) is the percentage share of gross national saving in gross domestic product. Ygr is the growth of per capita income. Y/N gr is the growth in the per capita gross domestic product. Analysis is based on ordinary least squares (OLS) and two-stage least squares techniques, treatment for heteroscedasticity, aggregation periods, several definitions of savings, different country samples, and aged dependency and youth dependency. Findings support the Mason variable-growth life-cycle framework, which shows that changes in demographic factors accounted for a large part of savings. The relationships in the modified Leff-type model were weak, with the exception of the mildly negative youth and elderly dependency impact in the 1980s. The rate of growth of youth dependency was negative and significant in all cross-sections for the full sample, all panel estimates for both LDCs and the full sample, and in the 1980s for LDCs. In the OLS model, life-cycle effects were weaker, but direct dependency effects were stronger. S/Y over time became slightly more sensitive to changes in life-cycle impacts but less sensitive to youth dependency. Demography's impact on savings over time is attributed to the increase in the pace of youth dependency decline and secondarily to its increasing sensitivity to life-cycle effects.

  18. Next Step for STEP

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Claire [CTSI; Bremner, Brenda [CTSI

    2013-08-09

    The Siletz Tribal Energy Program (STEP), housed in the Tribe’s Planning Department, will hire a data entry coordinator to collect, enter, analyze and store all the current and future energy efficiency and renewable energy data pertaining to administrative structures the tribe owns and operates and for homes in which tribal members live. The proposed data entry coordinator will conduct an energy options analysis in collaboration with the rest of the Siletz Tribal Energy Program and Planning Department staff. An energy options analysis will result in a thorough understanding of tribal energy resources and consumption, if energy efficiency and conservation measures being implemented are having the desired effect, analysis of tribal energy loads (current and future energy consumption), and evaluation of local and commercial energy supply options. A literature search will also be conducted. In order to educate additional tribal members about renewable energy, we will send four tribal members to be trained to install and maintain solar panels, solar hot water heaters, wind turbines and/or micro-hydro.

  19. Next Step for STEP

    Energy Technology Data Exchange (ETDEWEB)

    Wood, Claire [CTSI; Bremner, Brenda [CTSI

    2013-08-09

    The Siletz Tribal Energy Program (STEP), housed in the Tribe’s Planning Department, will hire a data entry coordinator to collect, enter, analyze and store all the current and future energy efficiency and renewable energy data pertaining to administrative structures the tribe owns and operates and for homes in which tribal members live. The proposed data entry coordinator will conduct an energy options analysis in collaboration with the rest of the Siletz Tribal Energy Program and Planning Department staff. An energy options analysis will result in a thorough understanding of tribal energy resources and consumption, if energy efficiency and conservation measures being implemented are having the desired effect, analysis of tribal energy loads (current and future energy consumption), and evaluation of local and commercial energy supply options. A literature search will also be conducted. In order to educate additional tribal members about renewable energy, we will send four tribal members to be trained to install and maintain solar panels, solar hot water heaters, wind turbines and/or micro-hydro.

  20. Performance of Pre-processing Schemes with Imperfect Channel State Information

    DEFF Research Database (Denmark)

    Christensen, Søren Skovgaard; Kyritsi, Persa; De Carvalho, Elisabeth

    2006-01-01

    Pre-processing techniques have several benefits when the CSI is perfect. In this work we investigate three linear pre-processing filters, assuming imperfect CSI caused by noise degradation and channel temporal variation. Results indicate that the LMMSE filter achieves the lowest BER and the high

  1. Ensemble preprocessing of near-infrared (NIR) spectra for multivariate calibration.

    Science.gov (United States)

    Xu, Lu; Zhou, Yan-Ping; Tang, Li-Juan; Wu, Hai-Long; Jiang, Jian-Hui; Shen, Guo-Li; Yu, Ru-Qin

    2008-06-01

    Preprocessing of raw near-infrared (NIR) spectral data is indispensable in multivariate calibration when the measured spectra are subject to significant noise, baselines and other undesirable factors. However, due to the lack of sufficient prior information and an incomplete knowledge of the raw data, NIR spectra preprocessing in multivariate calibration is still trial and error. How to select a proper method depends largely on both the nature of the data and the expertise and experience of the practitioners. This might limit the applications of multivariate calibration in many fields where researchers are not very familiar with the characteristics of many preprocessing methods unique to chemometrics and have difficulties selecting the most suitable methods. Another problem is that many preprocessing methods, when used alone, might degrade the data in certain aspects or lose some useful information while improving certain qualities of the data. In order to tackle these problems, this paper proposes a new concept of data preprocessing, the ensemble preprocessing method, in which partial least squares (PLS) models built on differently preprocessed data are combined by Monte Carlo cross validation (MCCV) stacked regression. Little or no prior information about the data and little expertise are required. Moreover, fusion of complementary information obtained by different preprocessing methods often leads to a more stable and accurate calibration model. The investigation of two real data sets has demonstrated the advantages of the proposed method.
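
    A compact sketch of the ensemble idea: fit one PLS model per preprocessing method and combine their predictions. Here the combination weights come from a simple non-negative least-squares fit on held-out predictions, standing in for the paper's MCCV stacked regression; the preprocessing candidates, data shapes and component count are assumptions.

        import numpy as np
        from scipy.optimize import nnls
        from scipy.signal import savgol_filter
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import train_test_split

        PREPROCESSORS = {
            "raw": lambda X: X,
            "first_derivative": lambda X: savgol_filter(X, 11, 2, deriv=1, axis=1),
            "snv": lambda X: (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True),
        }

        def ensemble_pls(X, y, n_components=5, seed=0):
            """Fit one PLS model per preprocessing method and learn non-negative
            combination weights on a held-out split (a stand-in for MCCV stacking)."""
            X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=seed)
            models, val_preds = {}, []
            for name, prep in PREPROCESSORS.items():
                model = PLSRegression(n_components=n_components).fit(prep(X_tr), y_tr)
                models[name] = model
                val_preds.append(model.predict(prep(X_val)).ravel())
            weights, _ = nnls(np.column_stack(val_preds), y_val)   # stacking weights
            return models, weights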

  2. Automatic selection of preprocessing methods for improving predictions on mass spectrometry protein profiles.

    Science.gov (United States)

    Pelikan, Richard C; Hauskrecht, Milos

    2010-11-13

    Mass spectrometry proteomic profiling has potential to be a useful clinical screening tool. One obstacle is providing a standardized method for preprocessing the noisy raw data. We have developed a system for automatically determining a set of preprocessing methods among several candidates. Our system's automated nature relieves the analyst of the need to be knowledgeable about which methods to use on any given dataset. Each stage of preprocessing is approached with many competing methods. We introduce metrics which are used to balance each method's attempts to correct noise versus preserving valuable discriminative information. We demonstrate the benefit of our preprocessing system on several SELDI and MALDI mass spectrometry datasets. Downstream classification is improved when using our system to preprocess the data.

  3. Automatic Preprocessing of Tidal Gravity Observation Data%重力固体潮观测数据的自动化预处理

    Institute of Scientific and Technical Information of China (English)

    许闯; 罗志才; 林旭; 周波阳

    2013-01-01

    The preprocessing of tidal gravity observation data is very important for obtaining high-quality tidal harmonic analysis results. The preprocessing methods for tidal gravity observation data are studied systematically: an average filtering method and a wavelet filtering method for downsampling the original observations are given in the paper, as well as a linear interpolation method and a cubic spline interpolation method for processing data interruptions. Automatic preprocessing software for tidal gravity observation data (APTsoft) has been developed, which can automatically calibrate and correct abnormal data such as spikes, steps and interruptions. Finally, the experimental results show that the preprocessing methods and APTsoft are very effective, and that APTsoft can be applied to the automatic preprocessing of tidal gravity observation data.
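
    Two of the preprocessing steps named above, downsampling by block averaging and filling short interruptions with cubic-spline interpolation, can be sketched as follows; the sampling interval, gap handling and function names are assumptions, not the APTsoft implementation.

        import numpy as np
        from scipy.interpolate import CubicSpline

        def block_average(samples, factor=60):
            """Downsample a 1 Hz gravity record to 1-minute means (average filtering)."""
            n = (len(samples) // factor) * factor
            return samples[:n].reshape(-1, factor).mean(axis=1)

        def fill_gaps(times, values):
            """Replace NaN samples (interruptions) by cubic-spline interpolation
            through the valid samples."""
            valid = ~np.isnan(values)
            spline = CubicSpline(times[valid], values[valid])
            filled = values.copy()
            filled[~valid] = spline(times[~valid])
            return filled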

  4. Joint Preprocesser-Based Detectors for One-Way and Two-Way Cooperative Communication Networks

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2014-05-01

    Efficient receiver designs for cooperative communication networks are becoming increasingly important. In previous work, cooperative networks communicated with the use of L relays. As the receiver is constrained, channel shortening and reduced-rank techniques were employed to design the preprocessing matrix that reduces the length of the received vector from L to U. In the first part of the work, a receiver structure is proposed which combines our proposed threshold selection criteria with the joint iterative optimization (JIO) algorithm that is based on the mean square error (MSE). Our receiver assists in determining the optimal U. Furthermore, this receiver provides the freedom to choose U for each frame depending on the tolerable difference allowed for MSE. Our study and simulation results show that by choosing an appropriate threshold, it is possible to gain in terms of complexity savings while having no or minimal effect on the BER performance of the system. Furthermore, the effect of channel estimation on the performance of the cooperative system is investigated. In the second part of the work, a joint preprocessor-based detector for cooperative communication networks is proposed for one-way and two-way relaying. This joint preprocessor-based detector operates on the principles of minimizing the symbol error rate (SER) instead of minimizing MSE. For a realistic assessment, pilot symbols are used to estimate the channel. From our simulations, it can be observed that our proposed detector achieves the same SER performance as that of the maximum likelihood (ML) detector with all participating relays. Additionally, our detector outperforms selection combining (SC), channel shortening (CS) scheme and reduced-rank techniques when using the same U. Finally, our proposed scheme has the lowest computational complexity.

  5. ASAP: an environment for automated preprocessing of sequencing data

    Directory of Open Access Journals (Sweden)

    Torstenson Eric S

    2013-01-01

    Full Text Available Abstract Background Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing these data can significantly delay downstream analysis and increase the possibility of human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process in its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  6. Breast image pre-processing for mammographic tissue segmentation.

    Science.gov (United States)

    He, Wenda; Hogg, Peter; Juette, Arne; Denton, Erika R E; Zwiggelaar, Reyer

    2015-12-01

    During mammographic image acquisition, a compression paddle is used to even the breast thickness in order to obtain optimal image quality. Clinical observation has indicated that some mammograms may exhibit abrupt intensity change and low visibility of tissue structures in the breast peripheral areas. Such appearance discrepancies can affect image interpretation and may not be desirable for computer aided mammography, leading to incorrect diagnosis and/or detection which can have a negative impact on sensitivity and specificity of screening mammography. This paper describes a novel mammographic image pre-processing method to improve image quality for analysis. An image selection process is incorporated to better target problematic images. The processed images show improved mammographic appearances not only in the breast periphery but also across the mammograms. Mammographic segmentation and risk/density classification were performed to facilitate a quantitative and qualitative evaluation. When using the processed images, the results indicated more anatomically correct segmentation in tissue specific areas, and subsequently better classification accuracies were achieved. Visual assessments were conducted in a clinical environment to determine the quality of the processed images and the resultant segmentation. The developed method has shown promising results. It is expected to be useful in early breast cancer detection, risk-stratified screening, and aiding radiologists in the process of decision making prior to surgery and/or treatment.

  7. Adaptive preprocessing algorithms of corneal topography in polar coordinate system

    Institute of Scientific and Technical Information of China (English)

    郭雁文

    2014-01-01

    New adaptive preprocessing algorithms based on the polar coordinate system were put forward to obtain high-precision corneal topography calculation results. Adaptive algorithms for locating the concentric circle center were created to accurately capture the circle center of the original Placido-based image, expand the image into a matrix centered on the circle center, and convert the matrix into the polar coordinate system with the circle center as the pole. Adaptive image smoothing followed, and the characteristics of the useful circles were extracted via horizontal edge detection, based on the fact that useful circles appear as approximately horizontal lines while noise signals appear as vertical lines or lines at other angles. Effective combinations of different morphological operators were designed to remedy data loss caused by noise disturbances and to obtain a complete circle-edge image, satisfying the requirements of precise calculation of the follow-up parameters. The experimental data show that the algorithms meet the requirements of practical detection, with less data loss, higher data accuracy and easier availability.
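
    The cartesian-to-polar unwrapping around the detected ring center, which turns concentric circles into approximately horizontal lines, can be sketched with SciPy's coordinate mapping; the output size, interpolation order and function name are assumptions.

        import numpy as np
        from scipy.ndimage import map_coordinates

        def to_polar(image, center, n_radii=256, n_angles=360):
            """Resample an image into polar coordinates (rows = radius, cols = angle)
            around the detected center of the Placido rings."""
            cy, cx = center
            max_r = min(cy, cx, image.shape[0] - 1 - cy, image.shape[1] - 1 - cx)
            radii = np.linspace(0, max_r, n_radii)
            angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
            r, a = np.meshgrid(radii, angles, indexing="ij")
            rows = cy + r * np.sin(a)
            cols = cx + r * np.cos(a)
            return map_coordinates(image.astype(np.float64), [rows, cols], order=1)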

  8. Software for Preprocessing Data from Rocket-Engine Tests

    Science.gov (United States)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.

  9. Visualisation and pre-processing of peptide microarray data.

    Science.gov (United States)

    Reilly, Marie; Valentini, Davide

    2009-01-01

    The data files produced by digitising peptide microarray images contain detailed information on the location, feature, response parameters and quality of each spot on each array. In this chapter, we will describe how such peptide microarray data can be read into the R statistical package and pre-processed in preparation for subsequent comparative or predictive analysis. We illustrate how the information in the data can be visualised using images and graphical displays that highlight the main features, enabling the quality of the data to be assessed and invalid data points to be identified and excluded. The log-ratio of the foreground to background signal is used as a response index. Negative control responses serve as a reference against which "detectable" responses can be defined, and slides incubated with only buffer and secondary antibody help identify false-positive responses from peptides. For peptides that have a detectable response on at least one subarray, and no false-positive response, we use linear mixed models to remove artefacts due to the arrays and their architecture. The resulting normalized responses provide the input data for further analysis.

  10. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  11. Saving Malta's music memory

    OpenAIRE

    Sant, Toni

    2013-01-01

    Maltese music is being lost. Along with it Malta loses its culture, way of life, and memories. Dr Toni Sant is trying to change this trend through the Malta Music Memory Project (M3P) http://www.um.edu.mt/think/saving-maltas-music-memory-2/

  12. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  13. Saving Outweighs Substituting

    Institute of Scientific and Technical Information of China (English)

    Sophia

    2007-01-01

    The energy crisis has become a great challenge to the whole world. As the vehicle population keeps soaring in China, effective countermeasures should be taken in a timely manner to deal with the global energy crisis. There are two ways out: one is to substitute and the other is to save.

  14. Hand Hygiene Saves Lives

    Medline Plus

    Full Text Available

  15. Another Step Closer to Artificial Blood

    Science.gov (United States)

    Synthetic product could save lives on battlefield ... SATURDAY, Dec. 3, 2016 (HealthDay News) -- Artificial blood stored as a powder could one day revolutionize ...

  16. 数据挖掘中的数据预处理%Data Preprocessing in Data Mining

    Institute of Scientific and Technical Information of China (English)

    刘明吉; 王秀峰; 黄亚楼

    2000-01-01

    Data Mining (DM) is a new hot research point in the database area. Because real-world data are not ideal, it is necessary to do some data preprocessing to meet the requirements of DM algorithms. In this paper, we discuss the procedure of data preprocessing and present the work of data preprocessing in detail. We also discuss the methods and technologies used in data preprocessing.

  17. QR码图像预处理方案研究%Research of QR Code Image Preprocessing Scheme

    Institute of Scientific and Technical Information of China (English)

    李筱楠; 郑华; 刘会杰

    2016-01-01

    Image preprocessing is an important step in the QR code decoding process. Building on the traditional recognition scheme, this paper proposes a practical image preprocessing method for QR code recognition: the captured image is filtered and binarized, the position detection patterns are used to locate the QR code and estimate its distortion angle, and a perspective transformation corrects the geometric distortion of the image. Experimental results demonstrate that the proposed approach can overcome the influence of noise, uneven illumination and geometric distortion, and thus significantly increases the recognition rate of the QR code.
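
    A simplified version of this pipeline (filtering, binarization, then perspective correction from four located corner points) might look like the OpenCV sketch below; the corner points are assumed to come from a separate finder-pattern detector, and the blur size, threshold block size and output size are assumptions.

        import cv2
        import numpy as np

        def preprocess_qr(image_gray, corners, out_size=330):
            """Filter and binarize a captured image, then rectify the QR region
            given four detected corner points (clockwise from top-left)."""
            blurred = cv2.GaussianBlur(image_gray, (3, 3), 0)            # suppress sensor noise
            binary = cv2.adaptiveThreshold(blurred, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                           cv2.THRESH_BINARY, 31, 2)     # robust to uneven lighting
            dst = np.float32([[0, 0], [out_size, 0], [out_size, out_size], [0, out_size]])
            M = cv2.getPerspectiveTransform(np.float32(corners), dst)    # undo geometric distortion
            return cv2.warpPerspective(binary, M, (out_size, out_size))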

  18. REMINDER Saved Leave Scheme (SLS) : Transfer of leave to saved leave accounts

    CERN Document Server

    HR Division

    2002-01-01

    Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days'*) annual and compensatory leave (excluding saved leave accumulated in accordance with the provisions of Administrative Circular No. 22B) can be transferred to the saved leave account at the end of the leave year (30 September). We remind you that, since last year, unused leave of all those taking part in the saved leave scheme at the closure of the leave-year accounts is transferred automatically to the saved leave account on that date. Therefore, staff members have no administrative steps to take. In addition, the transfer, which eliminates the risk of omitting to request leave transfers and rules out calculation errors in transfer requests, will be clearly shown in the list of leave transactions that can be consulted in EDH from October 2002 onwards. Furthermore, this automatic leave transfer optimizes staff members' chances of benefiting from a saved leave bonus provided that they are still participants in the schem...

  19. Save More Tomorrow: Using Behavioral Economics to Increase Employee Saving.

    Science.gov (United States)

    Thaler, Richard H.; Benartzi, Shlomo

    2004-01-01

    As firms switch from defined-benefit plans to defined-contribution plans, employees bear more responsibility for making decisions about how much to save. The employees who fail to join the plan or who participate at a very low level appear to be saving at less than the predicted life cycle savings rates. Behavioral explanations for this behavior…

  20. Savings, subgoals, and reference points

    Directory of Open Access Journals (Sweden)

    Helen Colby

    2013-01-01

    Full Text Available Decision makers often save money for a specific goal by forgoing discretionary consumption and instead putting the money toward the savings goal. We hypothesized that reference points can be exploited to enhance this type of saving. In two hypothetical scenario studies, subjects made judgments of their likelihood to forgo a small expenditure in order to put the money toward the savings goal. In Experiment 1, judgments were higher if the savings goal was presented as composed of weekly subgoals (e.g., save $60 per week to buy a $180 iPod). Experiment 2 replicated this finding and demonstrated that the subgoal manipulation increased judgments of likelihood to save money only when the money saved from the foregone consumption would allow the decision maker to meet the weekly subgoal exactly (not under- or overshoot it). These results suggest a reference point mechanism and point to ways that behavioral decision research can be harnessed to improve economic behaviors.

  1. Saving water through global trade

    NARCIS (Netherlands)

    Chapagain, A.K.; Hoekstra, A.Y.; Savenije, H.H.G.

    2005-01-01

    Many nations save domestic water resources by importing water-intensive products and exporting commodities that are less water intensive. National water saving through the import of a product can imply saving water at a global level if the flow is from sites with high to sites with low water product

  2. Saving water through global trade

    NARCIS (Netherlands)

    Chapagain, Ashok; Hoekstra, Arjen Ysbert; Savenije, H.H.G.

    2005-01-01

    Many nations save domestic water resources by importing water-intensive products and exporting commodities that are less water intensive. National water saving through the import of a product can imply saving water at a global level if the flow is from sites with high to sites with low water

  3. Social Capital and Savings Behavior

    DEFF Research Database (Denmark)

    Newman, Carol; Tarp, Finn; Khai, Luu Duc

    In this paper, we analyze household savings in rural Vietnam paying particular attention to the factors that determine the proportion of savings held as formal deposits. Our aim is to explore the extent to which social capital can play a role in promoting formal savings behavior. Social capital...

  4. The Effects of Pre-processing Strategies for Pediatric Cochlear Implant Recipients

    Science.gov (United States)

    Rakszawski, Bernadette; Wright, Rose; Cadieux, Jamie H.; Davidson, Lisa S.; Brenner, Christine

    2016-01-01

    Background Cochlear implants (CIs) have been shown to improve children’s speech recognition over traditional amplification when severe to profound sensorineural hearing loss is present. Despite improvements, understanding speech at low-level intensities or in the presence of background noise remains difficult. In an effort to improve speech understanding in challenging environments, Cochlear Ltd. offers pre-processing strategies that apply various algorithms prior to mapping the signal to the internal array. Two of these strategies include Autosensitivity Control™ (ASC) and Adaptive Dynamic Range Optimization (ADRO®). Based on previous research, the manufacturer’s default pre-processing strategy for pediatrics’ everyday programs combines ASC+ADRO®. Purpose The purpose of this study is to compare pediatric speech perception performance across various pre-processing strategies while applying a specific programming protocol utilizing increased threshold (T) levels to ensure access to very low-level sounds. Research Design This was a prospective, cross-sectional, observational study. Participants completed speech perception tasks in four pre-processing conditions: no pre-processing, ADRO®, ASC, ASC+ADRO®. Study Sample Eleven pediatric Cochlear Ltd. cochlear implant users were recruited: six bilateral, one unilateral, and four bimodal. Intervention Four programs, with the participants’ everyday map, were loaded into the processor with different pre-processing strategies applied in each of the four positions: no pre-processing, ADRO®, ASC, and ASC+ADRO®. Data Collection and Analysis Participants repeated CNC words presented at 50 and 70 dB SPL in quiet and HINT sentences presented adaptively with competing R-Space noise at 60 and 70 dB SPL. Each measure was completed as participants listened with each of the four pre-processing strategies listed above. Test order and condition were randomized. A repeated-measures analysis of variance (ANOVA) was used to

  5. Locomotive energy savings possibilities

    Directory of Open Access Journals (Sweden)

    Leonas Povilas LINGAITIS

    2009-01-01

    Full Text Available The economic indicators of electrodynamic braking have not been properly estimated. Vehicles with alternative power trains are a transitional stage in the development of pollution-free vehicles. With these aspects in mind, the article investigates conventional hybrid drives and their control systems. Equations that allow the effectiveness of regenerative braking to be evaluated for different variants of hybrid drive are given. Different types of locomotive energy-saving power systems that use regenerative braking energy in hybrid traction vehicles are presented, together with circuit diagrams and curves of the electrical parameters.

  6. Water Saving for Development

    Science.gov (United States)

    Zacharias, Ierotheos

    2013-04-01

    The project "Water Saving for Development (WaS4D)" is financed by European Territorial Cooperational Programme, Greece-Italy 2007-2013, and aims at developing issues on water saving related to improvement of individual behaviors and implementing innovative actions and facilities in order to harmonize policies and start concrete actions for a sustainable water management, making also people and stakeholders awake to water as a vital resource, strategic for quality of life and territory competitiveness. Drinkable water saving culture & behavior, limited water resources, water supply optimization, water resources and demand management, water e-service & educational e-tools are the key words of WaS4D. In this frame the project objectives are: • Definition of water need for domestic and other than domestic purposes: regional and territorial hydro-balance; • promotion of locally available resources not currently being used - water recycling or reuse and rainwater harvesting; • scientific data implementation into Informative Territorial System and publication of geo-referred maps into the institutional web sites, to share information for water protection; • participated review of the regulatory framework for the promotion of water-efficient devices and practices by means of the definition of Action Plans, with defined targets up to brief (2015) and medium (2020) term; • building up water e-services, front-office for all the water issues in building agricultural, industrial and touristic sectors, to share information, procedures and instruments for the water management; • creation and publication of a user friendly software, a game, to promote sustainability for houses also addressed to young people; • creation of water info point into physical spaces called "Water House" to promote education, training, events and new advisory services to assist professionals involved in water uses and consumers; • implementation of participatory approach & networking for a

  7. Saving Woodcut Culture

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Organizations across China have come together to save an ancient New Year tradition: "Door god, door god, rides the red horse, stands by the door and watches the house; door god, door god, carries a knife, keeps the ghosts away from the room." The "door god" in this popular Spring Festival rhyme is the main theme of China's woodcut New Year art, a tradition that has been passed down for generations. As the Spring Festival, also known as

  8. Learning to save lives!

    CERN Document Server

    2003-01-01

    They're all around you and watch over you, but you won't be aware of them unless you look closely at their office doors. There are 308 of them and they have all been given 12 hours of training with the CERN Fire Brigade. Who are they? Quite simply, those who could one day save your life at work, the CERN first-aiders. First-aiders are recruited on a volunteer basis. "Training is in groups of 10 to 12 people and a lot of emphasis is placed on the practical to ensure that they remember the life-saving techniques we show them", explains Patrick Berlinghi, a CERN first-aid instructor from TIS Division. He is looking forward to the arrival of four new instructors, which will bring the total number to twelve (eleven firemen and one member of the Medical Service). "The new instructors were trained at CERN from 16 to 24 May by Marie-Christine Boucher Da Ros (a member of the Commission Pédagogie de l'Observatoire National Français du Secourisme, the education commission of France's national first-aid body). This in...

  9. Research on a License Plate Image Preprocessing Method Based on VC++

    Institute of Scientific and Technical Information of China (English)

    李德峰; 丁玉飞; 邱细亚

    2011-01-01

    The preprocessing of license plate images is one of the key components of a license plate recognition system. After briefly describing the characteristics of license plate images affected by environmental factors, this paper systematically describes each step of image preprocessing in the license plate recognition system, including grayscale conversion, median filtering, gray-level stretching, Sobel-operator gradient sharpening, binarization and license plate tilt correction. An image preprocessing scheme is proposed, and software developed in VC++ is used to verify the experimental results of the various stages, confirming that the scheme achieves good preprocessing performance.
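
    The paper implements these steps in VC++; as a rough analogue only, the same chain of operations can be sketched with Python and OpenCV. Parameter values below are illustrative assumptions, not those of the original software.

        import cv2
        import numpy as np

        def preprocess_plate(image_bgr):
            gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)          # grayscale conversion
            filtered = cv2.medianBlur(gray, 3)                           # median filtering
            stretched = cv2.normalize(filtered, None, 0, 255,            # gray-level stretching
                                      cv2.NORM_MINMAX)
            # Sobel gradient sharpening: emphasize the vertical strokes of the characters.
            grad_x = cv2.Sobel(stretched, cv2.CV_16S, 1, 0, ksize=3)
            edges = cv2.convertScaleAbs(grad_x)
            # Otsu binarization separates characters and plate edges from the background.
            _, binary = cv2.threshold(edges, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
            return binary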

  10. Parallelized LEDAPS method for Remote Sensing Preprocessing Based on MPI

    Institute of Scientific and Technical Information of China (English)

    Xionghua CHEN; Xu ZHANG; Ying GUO; Yong MA; Yanchen YANG

    2013-01-01

    Based on Landsat imagery, the Landsat Ecosystem Disturbance Adaptive Processing System (LEDAPS) uses a radiation change detection method for image processing and offers surface reflectance products for ecosystem carbon sequestration and carbon stock studies. With the accumulation of massive remote sensing data, especially Landsat imagery, the traditional serial LEDAPS processing chain has a long processing cycle, which causes many difficulties in practical applications. To address this problem, this paper designs a high-performance parallel LEDAPS processing method based on MPI. The design aims not only to improve calculation speed and save computing time, but also to balance the load among flexibly extended computing nodes. Results show that the highest speedup of the parallelized LEDAPS reaches 7.37 when the number of MPI processes is 8. The approach effectively improves the ability of LEDAPS to handle massive remote sensing data and shortens the forest carbon stock calculation cycle based on remote sensing images.
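
    The abstract gives no implementation details, so the following is only a minimal mpi4py sketch of the general data-parallel pattern: a list of Landsat scenes is split among MPI ranks and each rank processes its share independently. process_scene and the file names are hypothetical placeholders for the actual LEDAPS chain.

        from mpi4py import MPI

        def process_scene(path):
            # placeholder for radiometric correction / surface-reflectance retrieval
            print(f"processing {path}")

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        if rank == 0:
            scenes = [f"scene_{i:03d}.tif" for i in range(64)]   # hypothetical file list
            chunks = [scenes[i::size] for i in range(size)]       # simple static load balancing
        else:
            chunks = None

        my_scenes = comm.scatter(chunks, root=0)
        for s in my_scenes:
            process_scene(s)
        comm.Barrier()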

  11. Examination of Speed Contribution of Parallelization for Several Fingerprint Pre-Processing Algorithms

    Directory of Open Access Journals (Sweden)

    GORGUNOGLU, S.

    2014-05-01

    Full Text Available In the analysis of minutiae-based fingerprint systems, fingerprints need to be pre-processed. The pre-processing is carried out to enhance the quality of the fingerprint and to obtain more accurate minutiae points. Reducing the pre-processing time is important for identification and verification in real-time systems, and especially for databases holding information on large numbers of fingerprints. Parallel processing and parallel CPU computing can be considered as the distribution of processes over a multi-core processor, which is done by using parallel programming techniques. Reducing the execution time is the main objective in parallel processing. In this study, the pre-processing of a minutiae-based fingerprint system is implemented by parallel processing on multi-core computers using OpenMP and on a graphics processor using CUDA to improve the execution time. The execution times and speedup ratios are compared with those of a single-core processor. The results show that by using parallel processing, the execution time is substantially improved. The improvement ratios obtained for different pre-processing algorithms allowed us to make suggestions on the more suitable approaches for parallelization.
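
    The study parallelizes with OpenMP on multi-core CPUs and with CUDA on GPUs; as a loose analogue only, the sketch below distributes a per-fingerprint enhancement function over CPU cores with Python's multiprocessing and reports the resulting speedup against serial execution. enhance is a placeholder workload, not the actual enhancement chain.

        import time
        import numpy as np
        from multiprocessing import Pool

        def enhance(fp):
            # placeholder workload: repeated smoothing of the fingerprint image
            for _ in range(50):
                fp = (fp + np.roll(fp, 1, axis=0) + np.roll(fp, 1, axis=1)) / 3.0
            return fp

        if __name__ == "__main__":
            prints = [np.random.rand(256, 256) for _ in range(32)]

            t0 = time.time()
            serial = [enhance(p) for p in prints]
            t_serial = time.time() - t0

            t0 = time.time()
            with Pool() as pool:
                parallel = pool.map(enhance, prints)
            t_parallel = time.time() - t0

            print(f"speedup: {t_serial / t_parallel:.2f}x")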

  12. Reinforcement Learning and Savings Behavior

    OpenAIRE

    Laibson, David I.; Choi, James J.; Madrian, Brigitte; Metrick, Andrew

    2007-01-01

    We show that individual investors over-extrapolate from their personal experience when making savings decisions. Investors who experience particularly rewarding outcomes from saving in their 401(k)—a high average and/or low variance return—increase their 401(k) savings rate more than investors who have less rewarding experiences with saving. This finding is not driven by aggregate time-series shocks, income effects, rational learning about investing skill, investor fixed effects, or time-...

  13. Reinforcement Learning and Savings Behavior.

    Science.gov (United States)

    Choi, James J; Laibson, David; Madrian, Brigitte C; Metrick, Andrew

    2009-12-01

    We show that individual investors over-extrapolate from their personal experience when making savings decisions. Investors who experience particularly rewarding outcomes from saving in their 401(k)-a high average and/or low variance return-increase their 401(k) savings rate more than investors who have less rewarding experiences with saving. This finding is not driven by aggregate time-series shocks, income effects, rational learning about investing skill, investor fixed effects, or time-varying investor-level heterogeneity that is correlated with portfolio allocations to stock, bond, and cash asset classes. We discuss implications for the equity premium puzzle and interventions aimed at improving household financial outcomes.

  14. Hand Hygiene Saves Lives

    Medline Plus

  15. Hand Hygiene Saves Lives

    Medline Plus

  16. Hand Hygiene Saves Lives

    Medline Plus

  17. Study on preprocessing of surface defect images of cold steel strip

    Directory of Open Access Journals (Sweden)

    Xiaoye GE

    2016-06-01

    Full Text Available Image preprocessing is an important part of digital image processing and a prerequisite for the detection of surface defects on cold steel strip. Factors including the complicated on-site environment and the distortion of the optical system cause image degradation, which directly affects the feature extraction and classification of the images. Aiming at these problems, a method combining an adaptive median filter and a homomorphic filter is proposed to preprocess the images. The adaptive median filter is effective for image denoising, and the Gaussian homomorphic filter reliably removes the non-uniform illumination of the images. Finally, the original and preprocessed images and their features are analyzed and compared. The results show that this method can improve image quality effectively.
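
    A small numpy/SciPy sketch of the two-stage idea described above: median filtering for denoising followed by a Gaussian homomorphic filter to suppress non-uniform illumination. SciPy's fixed-window median_filter stands in for the adaptive median filter, and the gains and cutoff are illustrative values rather than the paper's settings.

        import numpy as np
        from scipy.ndimage import median_filter

        def homomorphic_gaussian(img, gamma_low=0.5, gamma_high=1.8, d0=30.0):
            """img: 2-D float array in [0, 255]; returns an illumination-corrected image."""
            rows, cols = img.shape
            log_img = np.log1p(img)                       # multiplicative model becomes additive
            F = np.fft.fftshift(np.fft.fft2(log_img))
            u = np.arange(rows) - rows / 2
            v = np.arange(cols) - cols / 2
            D2 = u[:, None] ** 2 + v[None, :] ** 2
            # Gaussian high-emphasis transfer function: attenuates low frequencies
            # (illumination) and boosts high frequencies (reflectance / defect detail).
            H = (gamma_high - gamma_low) * (1 - np.exp(-D2 / (2 * d0 ** 2))) + gamma_low
            filtered = np.real(np.fft.ifft2(np.fft.ifftshift(H * F)))
            return np.expm1(filtered)

        def preprocess_strip(img):
            denoised = median_filter(img.astype(float), size=3)   # stand-in for the adaptive median filter
            return homomorphic_gaussian(denoised)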

  18. Prescription Program Provides Significant Savings

    Science.gov (United States)

    Rowan, James M.

    2010-01-01

    Most school districts today are looking for ways to save money without decreasing services to their staff. Retired pharmacist Tim Sylvester, a lifelong resident of Alpena Public Schools in Alpena, Michigan, presented the district with a pharmaceuticals plan that would save the district money without raising employee co-pays for prescriptions. The…

  19. Social Capital and Savings Behaviour

    DEFF Research Database (Denmark)

    Newman, Carol; Tarp, Finn; Van Den Broeck, Katleen

    organizations increases the proportion of liquid assets held in the form of deposits that yield a return. Our results imply that targeting information on the benefits of deposit saving through formal networks or groups would be effective in increasing the number of households that save at grassroots level....

  20. Saving Money Through Energy Conservation.

    Science.gov (United States)

    Presley, Michael H.; And Others

    This publication is an introduction to personal energy conservation. The first chapter presents a rationale for conserving energy and points out that private citizens control about one third of this country's energy consumption. Chapters two and three show how to save money by saving energy. Chapter two discusses energy conservation methods in the…

  1. Risk and Savings: a Taxonomy

    NARCIS (Netherlands)

    Gunning, Jan Willem

    2008-01-01

    Risk may induce precautionary saving but it can also reduce saving. The theoretical literature recognizes both possibilities, but favors a positive effect (both for developed and developing countries); the empirical literature is divided, reporting (small) positive effects for developed economies an

  2. Government Policy, Saving and Investment.

    Science.gov (United States)

    Eisner, Robert

    1983-01-01

    Several arguments that government policy--income redistribution and support of the poor, higher marginal income taxes, and social security--has depressed saving are found wanting. Also hard to sustain is the argument that investment demand has been depressed by tax policy. Current government policy will not improve saving and investment. (RM)

  3. Parallelizing flow-accumulation calculations on graphics processing units—From iterative DEM preprocessing algorithm to recursive multiple-flow-direction algorithm

    Science.gov (United States)

    Qin, Cheng-Zhi; Zhan, Lijun

    2012-06-01

    As one of the important tasks in digital terrain analysis, the calculation of flow accumulations from gridded digital elevation models (DEMs) usually involves two steps in a real application: (1) using an iterative DEM preprocessing algorithm to remove the depressions and flat areas commonly contained in real DEMs, and (2) using a recursive flow-direction algorithm to calculate the flow accumulation for every cell in the DEM. Because both algorithms are computationally intensive, quick calculation of the flow accumulations from a DEM (especially for a large area) presents a practical challenge to personal computer (PC) users. In recent years, rapid increases in hardware capacity of the graphics processing units (GPUs) provided in modern PCs have made it possible to meet this challenge in a PC environment. Parallel computing on GPUs using a compute-unified-device-architecture (CUDA) programming model has been explored to speed up the execution of the single-flow-direction algorithm (SFD). However, the parallel implementation on a GPU of the multiple-flow-direction (MFD) algorithm, which generally performs better than the SFD algorithm, has not been reported. Moreover, GPU-based parallelization of the DEM preprocessing step in the flow-accumulation calculations has not been addressed. This paper proposes a parallel approach to calculate flow accumulations (including both iterative DEM preprocessing and a recursive MFD algorithm) on a CUDA-compatible GPU. For the parallelization of an MFD algorithm (MFD-md), two different parallelization strategies using a GPU are explored. The first parallelization strategy, which has been used in the existing parallel SFD algorithm on GPU, has the problem of computing redundancy. Therefore, we designed a parallelization strategy based on graph theory. The application results show that the proposed parallel approach to calculate flow accumulations on a GPU performs much faster than either sequential algorithms or other parallel GPU

  4. Boosting model performance and interpretation by entangling preprocessing selection and variable selection.

    Science.gov (United States)

    Gerretzen, Jan; Szymańska, Ewa; Bart, Jacob; Davies, Antony N; van Manen, Henk-Jan; van den Heuvel, Edwin R; Jansen, Jeroen J; Buydens, Lutgarde M C

    2016-09-28

    The aim of data preprocessing is to remove data artifacts, such as a baseline, scatter effects or noise, and to enhance the contextually relevant information. Many preprocessing methods exist to deliver one or more of these benefits, but which method or combination of methods should be used for the specific data being analyzed is difficult to select. Recently, we have shown that a preprocessing selection approach based on Design of Experiments (DoE) enables correct selection of highly appropriate preprocessing strategies within reasonable time frames. In that approach, the focus was solely on improving the predictive performance of the chemometric model. This is, however, only one of the two relevant criteria in modeling: interpretation of the model results can be just as important. Variable selection is often used to achieve such interpretation. Data artifacts, however, may hamper proper variable selection by masking the true relevant variables. The choice of preprocessing therefore has a huge impact on the outcome of variable selection methods and may thus hamper an objective interpretation of the final model. To enhance such objective interpretation, we here integrate variable selection into the preprocessing selection approach that is based on DoE. We show that the entanglement of preprocessing selection and variable selection not only improves the interpretation, but also the predictive performance of the model. This is achieved by analyzing several experimental data sets of which the true relevant variables are available as prior knowledge. We show that a selection of variables is provided that complies more with the true informative variables compared to individual optimization of both model aspects. Importantly, the approach presented in this work is generic. Different types of models (e.g. PCR, PLS, …) can be incorporated into it, as well as different variable selection methods and different preprocessing methods, according to the taste and experience of
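
    The paper couples preprocessing selection (chosen via Design of Experiments) with variable selection. As a loose illustration of entangling the two choices rather than optimizing them separately, a scikit-learn pipeline can expose both to a single cross-validated search; the synthetic data, candidate preprocessings and model below are placeholders, not the authors' DoE procedure.

        import numpy as np
        from sklearn.pipeline import Pipeline
        from sklearn.preprocessing import StandardScaler, FunctionTransformer
        from sklearn.feature_selection import SelectKBest, f_regression
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import GridSearchCV

        X = np.random.rand(60, 200)               # 60 synthetic spectra with 200 variables
        y = X[:, 10] + 0.5 * X[:, 50] + 0.05 * np.random.randn(60)

        pipe = Pipeline([
            ("prep", StandardScaler()),            # placeholder preprocessing step
            ("select", SelectKBest(score_func=f_regression)),
            ("model", PLSRegression(n_components=2)),
        ])

        grid = {
            "prep": [StandardScaler(), FunctionTransformer(np.log1p)],  # candidate preprocessings
            "select__k": [10, 25, 50],                                  # candidate variable subsets
        }

        # Preprocessing and variable selection are searched jointly, inside cross-validation.
        search = GridSearchCV(pipe, grid, cv=5, scoring="neg_mean_squared_error")
        search.fit(X, y)
        print(search.best_params_)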

  5. Genetic Algorithm for Optimization: Preprocessing with n Dimensional Bisection and Error Estimation

    Science.gov (United States)

    Sen, S. K.; Shaykhian, Gholam Ali

    2006-01-01

    A knowledge of the appropriate values of the parameters of a genetic algorithm (GA) such as the population size, the shrunk search space containing the solution, crossover and mutation probabilities is not available a priori for a general optimization problem. Recommended here is a polynomial-time preprocessing scheme that includes an n-dimensional bisection and that determines the foregoing parameters before deciding upon an appropriate GA for all problems of similar nature and type. Such a preprocessing is not only fast but also enables us to get the global optimal solution and its reasonably narrow error bounds with a high degree of confidence.
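
    The abstract describes a polynomial-time preprocessing scheme that includes an n-dimensional bisection; the following is only a simplified, assumption-laden illustration of that idea: repeatedly bisect the search box along every dimension, keep the sub-box whose centre scores best, and hand the shrunken box to the GA as its search space.

        import itertools
        import numpy as np

        def bisect_search_space(f, lower, upper, iterations=5):
            """Shrink the n-dimensional box [lower, upper] around a promising region of f."""
            lower, upper = np.asarray(lower, float), np.asarray(upper, float)
            for _ in range(iterations):
                best_box, best_val = None, np.inf
                # 2^n sub-boxes obtained by halving each dimension
                for corner in itertools.product((0, 1), repeat=len(lower)):
                    mid = (lower + upper) / 2
                    lo = np.where(np.array(corner) == 0, lower, mid)
                    hi = np.where(np.array(corner) == 0, mid, upper)
                    val = f((lo + hi) / 2)          # score the sub-box by its centre point
                    if val < best_val:
                        best_box, best_val = (lo, hi), val
                lower, upper = best_box
            return lower, upper

        # Example: localize the minimum of a simple quadratic before running a GA.
        lo, hi = bisect_search_space(lambda x: np.sum((x - 0.3) ** 2), [-5, -5], [5, 5])
        print(lo, hi)

    A greedy centre-based rule like this can of course miss the optimum in general; it is meant to show the mechanics of shrinking the search space, not to reproduce the authors' error bounds.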

  6. Performance of Pre-processing Schemes with Imperfect Channel State Information

    DEFF Research Database (Denmark)

    Christensen, Søren Skovgaard; Kyritsi, Persa; De Carvalho, Elisabeth

    2006-01-01

    Pre-processing techniques have several benefits when the CSI is perfect. In this work we investigate three linear pre-processing filters, assuming imperfect CSI caused by noise degradation and channel temporal variation. Results indicate that the LMMSE filter achieves the lowest BER and the highest SINR when the CSI is perfect, whereas the simple matched filter may be a good choice when the CSI is imperfect. Additionally, the results give insight into the inherent trade-off between robustness against CSI imperfections and spatial focusing ability.

  7. ACTS (Advanced Communications Technology Satellite) Propagation Experiment: Preprocessing Software User's Manual

    Science.gov (United States)

    Crane, Robert K.; Wang, Xuhe; Westenhaver, David

    1996-01-01

    The preprocessing software manual describes the Actspp program, originally developed to observe and diagnose Advanced Communications Technology Satellite (ACTS) propagation terminal/receiver problems. However, it has been quite useful for automating the preprocessing functions needed to convert the terminal output to useful attenuation estimates. Before the data are acceptable for archival, the individual receiver system must be calibrated and the power level shifts caused by ranging tone modulation must be removed. Actspp provides three output files: the daylog, the diurnal coefficient file, and the file that contains calibration information.

  8. Data acquisition, preprocessing and analysis for the Virginia Tech OLYMPUS experiment

    Science.gov (United States)

    Remaklus, P. Will

    1991-01-01

    Virginia Tech is conducting a slant path propagation experiment using the 12, 20, and 30 GHz OLYMPUS beacons. Beacon signal measurements are made using separate terminals for each frequency. In addition, short baseline diversity measurements are collected through a mobile 20 GHz terminal. Data collection is performed with a custom data acquisition and control system. Raw data are preprocessed to remove equipment biases and discontinuities prior to analysis. Preprocessed data are then statistically analyzed to investigate parameters such as frequency scaling, fade slope and duration, and scintillation intensity.

  9. Preprocessing of Tandem Mass Spectrometric Data Based on Decision Tree Classification

    Institute of Scientific and Technical Information of China (English)

    Jing-Fen Zhang; Si-Min He; Jin-Jin Cai; Xing-Jun Cao; Rui-Xiang Sun; Yan Fu; Rong Zeng; Wen Gao

    2005-01-01

    In this study, we present a preprocessing method for quadrupole time-of-flight (Q-TOF) tandem mass spectra to increase the accuracy of database searching for peptide (protein) identification. Based on the natural isotopic information inherent in tandem mass spectra, we construct a decision tree after feature selection to classify the noise and ion peaks in tandem spectra. Furthermore, we recognize overlapping peaks to find the monoisotopic masses of ions for the following identification process. The experimental results show that this preprocessing method increases the search speed and the reliability of peptide identification.
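
    A hedged sketch of the classification step: a decision tree trained on simple isotope-derived features to separate ion peaks from noise peaks. The feature definitions and the tiny labelled training set are placeholders for illustration, not the paper's actual features or data.

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier

        # Each row: [intensity ratio to the +1 Da neighbour, mass error of the isotope spacing,
        #            local signal-to-noise estimate]; label 1 = ion peak, 0 = noise.
        X_train = np.array([
            [0.55, 0.002, 8.0], [0.60, 0.004, 12.0], [0.48, 0.003, 6.5],   # ion peaks
            [0.05, 0.090, 1.2], [0.10, 0.120, 0.9], [0.02, 0.200, 1.5],    # noise peaks
        ])
        y_train = np.array([1, 1, 1, 0, 0, 0])

        tree = DecisionTreeClassifier(max_depth=3, random_state=0)
        tree.fit(X_train, y_train)

        candidate_peaks = np.array([[0.50, 0.005, 7.0], [0.03, 0.150, 1.0]])
        print(tree.predict(candidate_peaks))   # expected: keep the first, discard the second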

  10. Influence of Hemp Fibers Pre-processing on Low Density Polyethylene Matrix Composites Properties

    Science.gov (United States)

    Kukle, S.; Vidzickis, R.; Zelca, Z.; Belakova, D.; Kajaks, J.

    2016-04-01

    In the present research, LLDPE matrix composites reinforced with short hemp fibres, with fibre contents in the range from 30 to 50 wt% and subjected to four different pre-processing technologies, were produced, and properties such as tensile strength and elongation at break, tensile modulus, melt flow index, microhardness and water absorption dynamics were investigated. Capillary viscosimetry was used to evaluate fluidity, and the melt flow index (MFI) was evaluated for all variants. The MFI of fibres from two of the pre-processing variants was high enough to allow the hemp fibre content to be increased from 30 to 50 wt% with only a moderate increase in water sorption capability.

  11. Saving gas project

    Energy Technology Data Exchange (ETDEWEB)

    Vasques, Maria Anunciacao S. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Garantizado, Maria Auxiliadora G. [CONCREMAT Engenharia, Rio de Janeiro, RJ (Brazil)

    2009-12-19

    The work presented was implemented in the municipalities around the construction of the Urucu-Coari-Manaus pipeline (Engineering/IETEG-IENOR), motivated by the constant release of workers as the work reaches its finishing stages and approaches completion. The Saving Gas project aims to guide the workforce, their families and the surrounding communities toward small businesses and solidarity cooperatives within the potential of the site. The project is developed through workshops on entrepreneurship, tourism, use, reuse and recycling of products, horticulture, agroecology, agribusiness (solidarity cooperativism) and forestry. It was executed in two phases: the first, called the 'pilot' phase, from 12/12/2007 to 27/03/2008 in sections A and B1, in the municipality of Coari, and section B2 in Caapiranga; the second phase ran from 30/06 to 27/09/08 in section B1, in the municipalities of Codajas and Anori, and section B2 in Iranduba, Manacapuru and Anama. The workshops were held in state and municipal schools and administered by the Institute of Social and Environmental Amazon (ISAM), which provided a team of coordinators, teachers, experts and masters working from nineteen to twenty-two hours to implement the project. (author)

  12. Conversation on data mining strategies in LC-MS untargeted metabolomics: pre-processing and pre-treatment steps

    CSIR Research Space (South Africa)

    Tugizimana, F

    2016-11-01

    Full Text Available .: +27-11-559-2401 Academic Editor: Peter Karp Received: 15 September 2016; Accepted: 27 October 2016; Published: 3 November 2016 Abstract: Untargeted metabolomic studies generate information-rich, high-dimensional, and complex datasets that remain... [3–5]. However, the realization of a holistic coverage of the whole metabolome, in a given biological system, is still currently not feasible (at least with a single method) at the metabolite extraction [6–8] and analytical [2,9,10] levels...

  13. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps.

    Science.gov (United States)

    Varikuti, Deepthi P; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T; Eickhoff, Simon B

    2016-08-22

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that gray matter masking improved the reliability of connectivity estimates, whereas denoising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources.

  14. REMINDER Saved Leave Scheme (SLS) : Simplified procedure for the transfer of leave to saved leave accounts

    CERN Multimedia

    HR Division

    2001-01-01

    As part of the process of streamlining procedures, the HR and AS Divisions have jointly developed a system whereby annual and compensatory leave will henceforth be automatically transferred to saved leave accounts. Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days' annual and compensatory leave can be transferred to the saved leave account at the end of the leave year. Previously, every person taking part in the scheme was individually issued with a form for the purposes of requesting the transfer of leave to the leave account, and the transfer then had to be done manually by HR Division. To streamline the procedure, unused leave of all those taking part in the saved leave scheme at the closure of the leave-year accounts will henceforth be transferred automatically to the saved leave account on that date. This simplification is in the interest of all parties concerned. This automatic transfer procedure has a number of advantages for participants in the SLS scheme. First, staff members will no longer have to take any administrative steps. Secondly, the new proced...

  15. A Real-Time Embedded System for Stereo Vision Preprocessing Using an FPGA

    DEFF Research Database (Denmark)

    Kjær-Nielsen, Anders; Jensen, Lars Baunegaard With; Sørensen, Anders Stengaard

    2008-01-01

    In this paper a low level vision processing node for use in existing IEEE 1394 camera setups is presented. The processing node is a small embedded system, that utilizes an FPGA to perform stereo vision preprocessing at rates limited by the bandwidth of IEEE 1394a (400Mbit). The system is used...

  16. Evaluation of Microarray Preprocessing Algorithms Based on Concordance with RT-PCR in Clinical Samples

    DEFF Research Database (Denmark)

    Hansen, Kasper Lage; Szallasi, Zoltan Imre; Eklund, Aron Charles

    2009-01-01

    evaluated consistency using the Pearson correlation between measurements obtained on the two platforms. Also, we introduce the log-ratio discrepancy as a more relevant measure of discordance between gene expression platforms. Of nine preprocessing algorithms tested, PLIER+16 produced expression values...

  17. Scene matching based on non-linear pre-processing on reference image and sensed image

    Institute of Scientific and Technical Information of China (English)

    Zhong Sheng; Zhang Tianxu; Sang Nong

    2005-01-01

    To solve the heterogeneous image scene matching problem, a non-linear pre-processing method for the original images before intensity-based correlation is proposed. The result shows that the proper matching probability is raised greatly. Especially for the low S/N image pairs, the effect is more remarkable.

  18. A New Endmember Preprocessing Method for the Hyperspectral Unmixing of Imagery Containing Marine Oil Spills

    Directory of Open Access Journals (Sweden)

    Can Cui

    2017-09-01

    Full Text Available The current methods that use hyperspectral remote sensing imagery to extract and monitor marine oil spills are quite popular. However, the automatic extraction of endmembers from hyperspectral imagery remains a challenge. This paper proposes a data field-spectral preprocessing (DSPP algorithm for endmember extraction. The method first derives a set of extreme points from the data field of an image. At the same time, it identifies a set of spectrally pure points in the spectral space. Finally, the preprocessing algorithm fuses the data field with the spectral calculation to generate a new subset of endmember candidates for the following endmember extraction. The processing time is greatly shortened by directly using endmember extraction algorithms. The proposed algorithm provides accurate endmember detection, including the detection of anomalous endmembers. Therefore, it has a greater accuracy, stronger noise resistance, and is less time-consuming. Using both synthetic hyperspectral images and real airborne hyperspectral images, we utilized the proposed preprocessing algorithm in combination with several endmember extraction algorithms to compare the proposed algorithm with the existing endmember extraction preprocessing algorithms. The experimental results show that the proposed method can effectively extract marine oil spill data.

  19. affyPara-a Bioconductor Package for Parallelized Preprocessing Algorithms of Affymetrix Microarray Data.

    Science.gov (United States)

    Schmidberger, Markus; Vicedo, Esmeralda; Mansmann, Ulrich

    2009-07-22

    Microarray data repositories as well as large clinical applications of gene expression allow several hundreds of microarrays to be analysed at one time. The preprocessing of large amounts of microarrays is still a challenge. The algorithms are limited by the available computer hardware. For example, building classification or prognostic rules from large microarray sets will be very time consuming. Here, preprocessing has to be a part of the cross-validation and resampling strategy which is necessary to estimate the rule's prediction quality honestly. This paper proposes the new Bioconductor package affyPara for parallelized preprocessing of Affymetrix microarray data. Partition of data can be applied on arrays, and parallelization of algorithms is a straightforward consequence. The partition of data and distribution to several nodes solves the main memory problems and accelerates preprocessing by up to a factor of 20 for 200 or more arrays. affyPara is a free and open source package, under the GPL license, available from the Bioconductor project at www.bioconductor.org. A user guide and examples are provided with the package.

  20. Pre-processing filter design at transmitters for IBI mitigation in an OFDM system

    Institute of Scientific and Technical Information of China (English)

    Xia Wang; Lei Wang

    2013-01-01

    In order to meet the demands for high transmission rates and high service quality in broadband wireless communication systems, orthogonal frequency division multiplexing (OFDM) has been adopted in some standards. However, the inter-block interference (IBI) and inter-carrier interference (ICI) in an OFDM system degrade the performance. To mitigate IBI and ICI, some pre-processing approaches based on full channel state information (CSI) have been proposed, which improve the system performance. A pre-processing filter based on partial CSI at the transmitter is designed and investigated. The filter coefficients are obtained by optimization, the symbol error rate (SER) is tested, and the computational complexity of the proposed scheme is analyzed. Computer simulation results show that the proposed pre-processing filter can effectively mitigate IBI and ICI and improve the performance. Compared with pre-processing approaches at the transmitter based on full CSI, the proposed scheme has high spectral efficiency, limited CSI feedback and low computational complexity.

  1. Saving energy from waste water

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Pearce

    1999-03-01

    This paper gives details of energy savings from wastewater in the laundry industry by recycling the water from the last rinse to the first wash, and recovering heat from hot water that is too dirty to recycle. The cost savings achieved at the laundry operated by the Royal London Hospital, and improvements in the steam supply system with water from steam traps collected and returned to the boiler house are reported. Case studies are presented involving energy savings in the textile industry where effluent from the washing stage is recycled to the scouring stage, and in the distillery industry involving recovery of heat from hot water for process preheating. (uk)

  2. Saving for Development: How Latin America and the Caribbean Can Save More and Better

    National Research Council Canada - National Science Library

    Inter-American Development Bank; Tomás Serebrisky; Eduardo Cavallo

    2016-01-01

    Why should people - and economies - save? This book on the savings problem in Latin America and the Caribbean suggests that, while saving to survive the bad times is important, saving to thrive in the good times is what really counts...

  3. Inter-Rater Reliability of Preprocessing EEG Data: Impact of Subjective Artifact Removal on Associative Memory Task ERP Results

    Directory of Open Access Journals (Sweden)

    Steven D. Shirk

    2017-06-01

    Full Text Available The processing of EEG data routinely involves subjective removal of artifacts during a preprocessing stage. Preprocessing inter-rater reliability (IRR) and how differences in preprocessing may affect outcomes of primary event-related potential (ERP) analyses have not been previously assessed. Three raters independently preprocessed EEG data of 16 cognitively healthy adult participants (ages 18–39 years) who performed a memory task. Using intraclass correlations (ICCs), IRR was assessed for Early-frontal, Late-frontal, and Parietal Old/new memory effect contrasts across eight regions of interest (ROIs). IRR was good to excellent for all ROIs; 22 of 26 ICCs were above 0.80. Raters were highly consistent in preprocessing across ROIs, although the frontal pole ROI (ICC range 0.60–0.90) showed less consistency. Old/new parietal effects had the highest ICCs with the lowest variability. Rater preprocessing differences did not alter primary ERP results. IRR for EEG preprocessing was good to excellent, and subjective rater removal of EEG artifacts did not alter primary memory-task ERP results. The findings provide preliminary support for the robustness of cognitive/memory task-related ERP results against significant inter-rater preprocessing variability and suggest the reliability of EEG for assessing cognitive-neurophysiological processes when multiple preprocessors are involved.

  4. Predictive modeling of colorectal cancer using a dedicated pre-processing pipeline on routine electronic medical records

    NARCIS (Netherlands)

    Kop, Reinier; Hoogendoorn, Mark; Teije, Annette Ten; Büchner, Frederike L; Slottje, Pauline; Moons, Leon M G; Numans, Mattijs E

    2016-01-01

    Over the past years, research utilizing routine care data extracted from Electronic Medical Records (EMRs) has increased tremendously. Yet there are no straightforward, standardized strategies for pre-processing these data. We propose a dedicated medical pre-processing pipeline aimed at taking on

  5. Reproducible cancer biomarker discovery in SELDI-TOF MS using different pre-processing algorithms.

    Directory of Open Access Journals (Sweden)

    Jinfeng Zou

    Full Text Available BACKGROUND: There has been much interest in differentiating diseased and normal samples using biomarkers derived from mass spectrometry (MS) studies. However, biomarker identification for specific diseases has been hindered by irreproducibility. Specifically, a peak profile extracted from a dataset for biomarker identification depends on a data pre-processing algorithm. Until now, no widely accepted agreement has been reached. RESULTS: In this paper, we investigated the consistency of biomarker identification using differentially expressed (DE) peaks from peak profiles produced by three widely used average spectrum-dependent pre-processing algorithms based on SELDI-TOF MS data for prostate and breast cancers. Our results revealed two important factors that affect the consistency of DE peak identification using different algorithms. One factor is that some DE peaks selected from one peak profile were not detected as peaks in other profiles, and the second factor is that the statistical power of identifying DE peaks in large peak profiles with many peaks may be low due to the large scale of the tests and small number of samples. Furthermore, we demonstrated that the DE peak detection power in large profiles could be improved by the stratified false discovery rate (FDR) control approach and that the reproducibility of DE peak detection could thereby be increased. CONCLUSIONS: Comparing and evaluating pre-processing algorithms in terms of reproducibility can elucidate the relationship among different algorithms and also help in selecting a pre-processing algorithm. The DE peaks selected from small peak profiles with few peaks for a dataset tend to be reproducibly detected in large peak profiles, which suggests that a suitable pre-processing algorithm should be able to produce peaks sufficient for identifying useful and reproducible biomarkers.
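
    One common realization of stratified FDR control, mentioned above as a way to recover detection power in large peak profiles, is to apply the Benjamini-Hochberg procedure within each stratum of peaks separately rather than over the whole profile. The sketch below assumes that form; the stratification criterion and the p-values are purely illustrative.

        import numpy as np

        def benjamini_hochberg(pvals, alpha=0.05):
            """Return a boolean mask of rejected hypotheses at FDR level alpha."""
            p = np.asarray(pvals)
            order = np.argsort(p)
            thresh = alpha * np.arange(1, len(p) + 1) / len(p)
            passed = p[order] <= thresh
            k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
            mask = np.zeros(len(p), dtype=bool)
            mask[order[:k]] = True
            return mask

        def stratified_fdr(pvals, strata, alpha=0.05):
            # Apply BH within each stratum, then merge the rejection decisions.
            pvals, strata = np.asarray(pvals), np.asarray(strata)
            rejected = np.zeros(len(pvals), dtype=bool)
            for s in np.unique(strata):
                idx = np.where(strata == s)[0]
                rejected[idx] = benjamini_hochberg(pvals[idx], alpha)
            return rejected

        pvals = np.random.uniform(size=500)
        strata = np.repeat([0, 1], 250)           # e.g., low- vs high-intensity peaks
        print(stratified_fdr(pvals, strata).sum(), "peaks declared differentially expressed")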

  6. Data preprocessing methods of FT-NIR spectral data for the classification cooking oil

    Science.gov (United States)

    Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli

    2014-12-01

    This work describes data pre-processing methods for FT-NIR spectroscopy datasets of cooking oil and its quality parameters using chemometric methods. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometric modelling. Hence, this work investigates the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling and a single scaling process with Standard Normal Variate (SNV). The combinations of these scaling methods have an impact on exploratory analysis and classification via Principal Component Analysis (PCA) plots. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectra recorded in absorbance mode over the range 4000 cm-1 to 14000 cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method, with the number of samples in each class kept equal to 2/3 of the class with the minimum number of samples. The t-statistic was then employed as a variable selection method to determine which variables are significant for the classification models. The data pre-processing was evaluated by examining the modified silhouette width (mSW), the PCA plots and the percentage correctly classified (%CC). The results show that different pre-processing strategies lead to substantial differences in model performance; the effects of the several pre-processing options, i.e., row scaling, column standardisation and single scaling with Standard Normal Variate, are indicated by mSW and %CC. With a two-PC model, all five classifiers gave high %CC except Quadratic Distance Analysis.
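
    Two of the pre-treatments named above, a Savitzky-Golay derivative followed by SNV row scaling, can be sketched in a few lines of numpy/SciPy. The window length, polynomial order and derivative order are illustrative choices, not the settings used in the work.

        import numpy as np
        from scipy.signal import savgol_filter

        def snv(spectra):
            """Standard Normal Variate: centre and scale each spectrum (row) individually."""
            mean = spectra.mean(axis=1, keepdims=True)
            std = spectra.std(axis=1, keepdims=True)
            return (spectra - mean) / std

        def preprocess_nir(spectra):
            # First-derivative Savitzky-Golay filter along the wavenumber axis, then SNV.
            deriv = savgol_filter(spectra, window_length=15, polyorder=2, deriv=1, axis=1)
            return snv(deriv)

        spectra = np.random.rand(40, 1000)   # 40 synthetic spectra with 1000 variables
        print(preprocess_nir(spectra).shape)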

  7. Value of Distributed Preprocessing of Biomass Feedstocks to a Bioenergy Industry

    Energy Technology Data Exchange (ETDEWEB)

    Christopher T Wright

    2006-07-01

    Biomass preprocessing is one of the primary operations in the feedstock assembly system and the front-end of a biorefinery. Its purpose is to chop, grind, or otherwise format the biomass into a suitable feedstock for conversion to ethanol and other bioproducts. Many variables such as equipment cost and efficiency, and feedstock moisture content, particle size, bulk density, compressibility, and flowability affect the location and implementation of this unit operation. Previous conceptual designs show this operation to be located at the front-end of the biorefinery. However, data are presented that show distributed preprocessing at the field-side or in a fixed preprocessing facility can provide significant cost benefits by producing a higher value feedstock with improved handling, transporting, and merchandising potential. In addition, data supporting the preferential deconstruction of feedstock materials due to their bio-composite structure identify the potential for significant improvements in equipment efficiencies and compositional quality upgrades. These data are collected from full-scale low and high capacity hammermill grinders with various screen sizes. Multiple feedstock varieties with a range of moisture values were used in the preprocessing tests. The comparative values of the different grinding configurations, feedstock varieties, and moisture levels are assessed through post-grinding analysis of the different particle fractions separated with a medium-scale forage particle separator and a Rototap separator. The results show that distributed preprocessing produces a material that has bulk flowable properties and fractionation benefits that can improve the ease of transporting, handling and conveying the material to the biorefinery and improve the biochemical and thermochemical conversion processes.

  8. How to Save Money on Infant Formula

    Science.gov (United States)

    ... medlineplus.gov/ency/patientinstructions/000805.htm How to Save Money on Infant Formula To use the sharing features ... several months. Here are some ways you can save money on infant formula . Money-Saving Ideas Here are ...

  9. The German SAVE survey: documentation and methodology

    OpenAIRE

    Schunk, Daniel

    2007-01-01

    The purpose of this document is to describe methodological details of the German SAVE survey and to provide users of SAVE with all necessary information for working with the publicly available SAVE dataset.

  10. A comprehensive analysis about the influence of low-level preprocessing techniques on mass spectrometry data for sample classification.

    Science.gov (United States)

    López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Fernández-Riverola, Florentino

    2014-01-01

    Matrix-Assisted Laser Desorption Ionisation Time-of-Flight (MALDI-TOF) is one of the high-throughput mass spectrometry technologies able to produce data requiring extensive preprocessing before subsequent analyses. In this context, several low-level preprocessing techniques have been successfully developed for different tasks, including baseline correction, smoothing, normalisation, peak detection and peak alignment. In this work, we present a systematic comparison of different software packages aiding in the compulsory preprocessing of MALDI-TOF data. In order to guarantee the validity of our study, we test multiple configurations of each preprocessing technique that are subsequently used to train a set of classifiers whose performance (kappa and accuracy) provides us with accurate information for the final comparison. Results from experiments show the real impact of preprocessing techniques on classification, evidencing that MassSpecWavelet provides the best performance and Support Vector Machines (SVM) are one of the most accurate classifiers.

  11. Essay on Saving and Consumption

    Directory of Open Access Journals (Sweden)

    Fabris Nikola

    2015-09-01

    Full Text Available Consumption and saving decisions are at the heart of both short- and long-term macroeconomic analyses. Since the global crisis outbreak, one of the main issues for indebted countries has been whether to pursue a policy which promotes saving or to try to induce economic growth by increasing consumption. Consensus has not been reached on this issue, which is based on an old debate of whether a country should pursue a policy of Keynesianism or monetarism.

  12. A two-step rectification algorithm for airborne linear images with POS data

    Institute of Scientific and Technical Information of China (English)

    TUO Hong-ya; LIU Yun-cai

    2005-01-01

    Rectification for airborne linear images is an indispensable preprocessing step. This paper presents in detail a two-step rectification algorithm. The first step is to establish the model of direct georeference position using the data provided by the Positioning and Orientation System (POS) and obtain the mathematical relationships between the image points and ground reference points. The second step is to apply polynomial distortion model and Bilinear Interpolation to get the final precise rectified images.In this step, a reference image is required and some ground control points (GCPs) are selected. Experiments showed that the final rectified images are satisfactory, and that our two-step rectification algorithm is very effective.

  13. Linear algebra step by step

    CERN Document Server

    Singh, Kuldeep

    2013-01-01

    Linear algebra is a fundamental area of mathematics, and is arguably the most powerful mathematical tool ever developed. It is a core topic of study within fields as diverse as: business, economics, engineering, physics, computer science, ecology, sociology, demography and genetics. For an example of linear algebra at work, one needs to look no further than the Google search engine, which relies upon linear algebra to rank the results of a search with respect to relevance. The strength of the text is in the large number of examples and the step-by-step explanation of each topic as it is introduced. It is compiled in a way that allows distance learning, with explicit solutions to set problems freely available online. The miscellaneous exercises at the end of each chapter comprise questions from past exam papers from various universities, helping to reinforce the reader's confidence. Also included, generally at the beginning of sections, are short historical biographies of the leading players in the field of lin...

  14. Influence of data preprocessing on the quantitative determination of nutrient content in poultry manure by near infrared spectroscopy.

    Science.gov (United States)

    Chen, L J; Xing, L; Han, L J

    2010-01-01

    With increasing concern over potential pollution from farm wastes, there is a need for rapid and robust methods that can analyze livestock manure nutrient content. The near infrared spectroscopy (NIRS) method was used to determine nutrient content in diverse poultry manure samples (n=91). Various standard preprocessing methods (derivatives, multiplicative scatter correction, Savitzky-Golay smoothing, and standard normal variate) were applied to reduce systemic noise in the data. In addition, a new preprocessing method known as direct orthogonal signal correction (DOSC) was tested. Calibration models for ammonium nitrogen, total potassium, total nitrogen, and total phosphorus were developed with the partial least squares (PLS) method. The results showed that all the preprocessing methods improved prediction results compared with no preprocessing. Compared with the other preprocessing methods, the DOSC method gave the best results. The DOSC method achieved moderately successful prediction for ammonium nitrogen, total nitrogen, and total phosphorus. However, none of the preprocessing methods provided reliable prediction for total potassium. This indicates that the DOSC method, especially combined with other preprocessing methods, needs further study to allow a more complete predictive analysis of manure nutrient content.

  15. Strain Data Acquisition and Preprocessing System for Aircraft Structure

    Institute of Scientific and Technical Information of China (English)

    薛军; 纪敦; 李猛; 吴志超

    2009-01-01

    The design and construction of an in-flight strain data acquisition and preprocessing system for a fatigue-critical location of an aircraft structure are described. The hardware platform of the system is built with CompactRIO technology. Through FPGA development, and by adopting the ideas of file subdivision and breakpoint (interrupt) protection, the software automatically completes strain data acquisition and preprocessing. To save storage space in the airborne strain data recording equipment, the software reduces the strain data to valid peaks and valleys and fills them into a frequency matrix. The system has completed more than 200 hours of research test flights, and the results show that it is simple and effective.
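
    A rough sketch of the peak/valley reduction described above: keep only the significant turning points of the strain time series (using a small hysteresis threshold to drop insignificant reversals), then accumulate successive transitions into a frequency matrix. The threshold, bin edges and the synthetic signal are illustrative assumptions, not values from the actual system.

        import numpy as np

        def peaks_and_valleys(signal, threshold=5.0):
            """Return the sequence of significant turning points of a 1-D signal."""
            turning = [signal[0]]
            for x in signal[1:]:
                if abs(x - turning[-1]) < threshold:
                    continue                          # ignore small fluctuations
                if len(turning) >= 2 and (x - turning[-1]) * (turning[-1] - turning[-2]) > 0:
                    turning[-1] = x                   # same direction: extend the current excursion
                else:
                    turning.append(x)                 # direction reversal: new peak or valley
            return np.array(turning)

        def frequency_matrix(turning, bins=8, lo=-200.0, hi=200.0):
            """Count successive (from, to) strain transitions into a bins x bins matrix."""
            edges = np.linspace(lo, hi, bins + 1)
            idx = np.clip(np.digitize(turning, edges) - 1, 0, bins - 1)
            matrix = np.zeros((bins, bins), dtype=int)
            for a, b in zip(idx[:-1], idx[1:]):
                matrix[a, b] += 1
            return matrix

        strain = 100 * np.sin(np.linspace(0, 20 * np.pi, 5000)) + np.random.randn(5000)
        print(frequency_matrix(peaks_and_valleys(strain)))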

  16. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines.

    Science.gov (United States)

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J; Raboso, Mariano

    2015-06-17

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on a Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation (based on a Gaussian Mixture Model (GMM) to separate the person from the background), masking (to reduce the dimensions of the images) and binarization (to reduce the size of each image). An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.

  17. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Lara del Val

    2015-06-01

    Full Text Available Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on a Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation (based on a Gaussian Mixture Model (GMM) to separate the person from the background), masking (to reduce the dimensions of the images) and binarization (to reduce the size of each image). An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.

  18. Pre-Processing for Video Coding with Rate-Distortion Optimization Decision

    Institute of Scientific and Technical Information of China (English)

    QI Yi; HUANG Yong-gui; QI Hong-gang

    2006-01-01

    This paper proposes an adaptive video pre-processing algorithm for video coding. The algorithm works on the original image before intra- or inter-prediction. It adopts a Gaussian filter to remove noise and insignificant features in the video frames. Edge detection and restoration then follow, to restore edges that were excessively filtered out. Rate-Distortion Optimization (RDO) is employed to decide adaptively whether a processed block or an unprocessed block is coded into the bit-stream, for more efficient coding. Our experimental results show that the algorithm achieves good coding performance in both subjective and objective terms. In addition, the proposed pre-processing algorithm is transparent to the decoder, and thus can be compliant with any video coding standard without modifying the decoder.
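
    A rate-distortion-optimized pre-processing decision of the kind described can be sketched as follows: smooth a block with a Gaussian filter and keep whichever version (processed or unprocessed) has the lower cost J = D + lambda*R. The distortion and rate proxies and the value of lambda below are illustrative assumptions, not the paper's encoder model.

        # Minimal sketch of an RDO-style pre-processing decision on a single block.
        import numpy as np
        from scipy.ndimage import gaussian_filter

        def rd_cost(block, reference, lam=0.1):
            distortion = np.mean((block - reference) ** 2)           # MSE relative to the original block
            rate_proxy = np.sum(np.abs(np.diff(block, axis=0))) + \
                         np.sum(np.abs(np.diff(block, axis=1)))      # crude activity-based rate proxy
            return distortion + lam * rate_proxy

        original = np.random.rand(16, 16)                            # toy 16x16 block
        smoothed = gaussian_filter(original, sigma=1.0)              # Gaussian pre-filtering

        # Keep whichever version has the lower rate-distortion cost
        chosen = smoothed if rd_cost(smoothed, original) < rd_cost(original, original) else original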

  19. Preprocessing for Lung and Heart Image Segmentation Using an Anisotropic Diffusion Filter

    Directory of Open Access Journals (Sweden)

    A. T. A Prawira Kusuma

    2015-12-01

    Full Text Available This paper proposes a preprocessing technique for a lung segmentation scheme using an Anisotropic Diffusion filter. The aim is to improve the accuracy, sensitivity and specificity of the segmentation results. This method was chosen for its edge-preserving ability: while smoothing, it can blur noise yet maintain the edges of objects in the image. Such characteristics are needed when filtering medical images, where the boundary between an organ and the background is not very clear. The segmentation itself is done with K-means Clustering and Active Contours to segment the lungs. The segmentation results were validated using the Receiver Operating Characteristic (ROC) and showed increased accuracy, sensitivity and specificity when compared with the results of segmentation in the previous paper, in which the preprocessing method used was a Gaussian Lowpass filter.
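
    The edge-preserving smoothing referred to above is commonly implemented as Perona-Malik anisotropic diffusion; a minimal sketch follows. The conduction function, kappa and the iteration count are illustrative assumptions, not the values used in the paper.

        # Minimal sketch of Perona-Malik anisotropic diffusion (edge-preserving smoothing).
        import numpy as np

        def anisotropic_diffusion(img, n_iter=10, kappa=30.0, gamma=0.2):
            img = img.astype(float).copy()
            for _ in range(n_iter):
                # Finite differences in the four cardinal directions
                dn = np.roll(img, -1, axis=0) - img
                ds = np.roll(img, 1, axis=0) - img
                de = np.roll(img, -1, axis=1) - img
                dw = np.roll(img, 1, axis=1) - img
                # Edge-stopping conduction coefficients (Perona-Malik exponential form)
                cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
                ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
                img += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
            return img

        smoothed = anisotropic_diffusion(np.random.rand(64, 64) * 255)  # toy image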

  20. Analog preprocessing in an SNS 2-micrometer low-noise CMOS folding ADC

    Science.gov (United States)

    Carr, Richard D.

    1994-12-01

    Significant research in high-performance analog-to-digital converters (ADCs) has been directed at retaining part of the high-speed flash ADC architecture while reducing the total number of comparators in the circuit. The symmetrical number system (SNS) can be used to preprocess the analog input signal, reducing the number of comparators and thus reducing the chip area and power consumption of the ADC. This thesis examines a Very Large Scale Integrated (VLSI) design for a folding circuit for an SNS analog preprocessing architecture in a 9-bit folding ADC with a total of 23 comparators. The analog folding circuit layout uses the Orbit 2-micrometer CMOS N-well double-metal, double-poly low-noise analog process. The effects of SPICE level 2 parameter tolerances during fabrication on the operation of the folding circuit are investigated numerically. The frequency response of the circuit is also quantified. An Application Specific Integrated Circuit (ASIC) is designed.

  1. Radar signal pre-processing to suppress surface bounce and multipath

    Science.gov (United States)

    Paglieroni, David W; Mast, Jeffrey E; Beer, N. Reginald

    2013-12-31

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes the return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicate the presence of a subsurface object.

  2. Preprocessing, classification modeling and feature selection using flow injection electrospray mass spectrometry metabolite fingerprint data.

    Science.gov (United States)

    Enot, David P; Lin, Wanchang; Beckmann, Manfred; Parker, David; Overy, David P; Draper, John

    2008-01-01

    Metabolome analysis by flow injection electrospray mass spectrometry (FIE-MS) fingerprinting generates measurements relating to large numbers of m/z signals. Such data sets often exhibit high variance with a paucity of replicates, thus providing a challenge for data mining. We describe data preprocessing and modeling methods that have proved reliable in projects involving samples from a range of organisms. The protocols interact with software resources specifically for metabolomics provided in a Web-accessible data analysis package FIEmspro (http://users.aber.ac.uk/jhd) written in the R environment and requiring a moderate knowledge of R command-line usage. Specific emphasis is placed on describing the outcome of modeling experiments using FIE-MS data that require further preprocessing to improve quality. The salient features of both poor and robust (i.e., highly generalizable) multivariate models are outlined together with advice on validating classifiers and avoiding false discovery when seeking explanatory variables.

  3. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    Science.gov (United States)

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano

    2015-01-01

    Drawing on the results of an acoustic biometric system based on an MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation—based on a Gaussian Mixture Model (GMM) to separate the person from the background, masking—to reduce the dimensions of images—and binarization—to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392

  4. A Hybrid System based on Multi-Agent System in the Data Preprocessing Stage

    CERN Document Server

    Kularbphettong, Kobkul; Meesad, Phayung

    2010-01-01

    We describe the usage of a multi-agent system (MAS) in the data preprocessing stage of an ongoing project called e-Wedding. The aim of this project is to utilize MAS and various approaches, such as Web services, ontology, and data mining techniques, in e-Business, in order to improve the responsiveness and efficiency of systems and to extract customer behavior models for wedding businesses. In this paper, however, we propose and implement a multi-agent system, based on JADE, that deals only with the data preprocessing stage, specifically with techniques for handling missing values. JADE is quite easy to learn and use. Moreover, it supports many agent approaches such as agent communication, protocols, behaviors and ontology. This framework has been implemented and evaluated in a simple but realistic setting. The results, though still preliminary, are quite promising.

  5. Input data preprocessing method for exchange rate forecasting via neural network

    Directory of Open Access Journals (Sweden)

    Antić Dragan S.

    2014-01-01

    Full Text Available The aim of this paper is to present a method for neural network input parameter selection and preprocessing. The purpose of this network is to forecast foreign exchange rates using artificial intelligence. Two data sets are formed for two different economic systems. Each system is represented by six categories with 70 economic parameters which are used in the analysis. Reduction of these parameters within each category was performed using the principal component analysis method. Component interdependencies are established and relations between them are formed. The newly formed relations were used to create the input vectors of a neural network. A multilayer feed-forward neural network was formed and trained using batch training. Finally, simulation results are presented and it is concluded that the proposed input data preparation method is an effective way of preprocessing neural network data. [Project of the Ministry of Science of the Republic of Serbia, no. TR 35005, no. III 43007 and no. III 44006]
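
    A rough sketch of the described input preparation, reducing each category of indicators with principal component analysis and feeding the retained components to a feed-forward network, is given below. The category sizes, number of retained components and network shape are illustrative assumptions, not the paper's configuration.

        # Minimal sketch: per-category PCA reduction feeding a feed-forward network.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        # Six categories of economic indicators (toy data), plus a target exchange rate series
        categories = [rng.normal(size=(500, 12)) for _ in range(6)]
        target = rng.normal(size=500)

        # Keep the leading principal components of each category and stack them as inputs
        components = [PCA(n_components=2).fit_transform(cat) for cat in categories]
        X = np.hstack(components)

        model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        model.fit(X, target)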

  6. The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements

    Science.gov (United States)

    Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry; Watkins, Michael; Yuan, Dah-Ning

    2013-01-01

    The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides and the role of orbit determination in the bootstrapping strategy. Simulation results that validate the bootstrapping strategy will be presented, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.

  7. The impact of data preprocessing in traumatic brain injury detection using functional magnetic resonance imaging.

    Science.gov (United States)

    Vergara, Victor M; Damaraju, Eswar; Mayer, Andrew B; Miller, Robyn; Cetin, Mustafa S; Calhoun, Vince

    2015-01-01

    Traumatic brain injury (TBI) can adversely affect a person's thinking, memory, personality and behavior. For this reason new and better biomarkers are being investigated. Resting state functional network connectivity (rsFNC) derived from functional magnetic resonance imaging (fMRI) is emerging as a possible biomarker. One of the main concerns with this technique is the appropriateness of methods used to correct for subject movement. In this work we used 50 mild TBI patients and matched healthy controls to explore the outcomes obtained from different fMRI data preprocessing strategies. Results suggest that correction for motion variance before spatial smoothing is the best alternative. With this preprocessing option, a significant group difference was found between the cerebellum and the supplementary motor area/paracentral lobule. In this case the mTBI group exhibits an increase in rsFNC.

  8. KONFIG and REKONFIG: Two interactive preprocessing programs for the Navy/NASA Engine Program (NNEP)

    Science.gov (United States)

    Fishbach, L. H.

    1981-01-01

    The NNEP is a computer program that is currently being used to simulate the thermodynamic cycle performance of almost all types of turbine engines by many government, industry, and university personnel. The NNEP uses arrays of input data to set up the engine simulation and component matching method as well as to describe the characteristics of the components. A preprocessing program (KONFIG) is described in which the user at a terminal on a time-shared computer can interactively prepare the arrays of data required. It is intended to make it easier for the occasional or new user to operate NNEP. Another preprocessing program (REKONFIG), in which the user can modify the component specifications of a previously configured NNEP dataset, is also described. It is intended to aid in preparing data for parametric studies and/or studies of similar engines such as mixed-flow turbofans, turboshafts, etc.

  9. b-Bit Minwise Hashing in Practice: Large-Scale Batch and Online Learning and Using GPUs for Fast Preprocessing with Simple Hash Functions

    CERN Document Server

    Li, Ping; Konig, Arnd Christian

    2012-01-01

    In this paper, we study several critical issues which must be tackled before one can apply b-bit minwise hashing to the volumes of data often used in industrial applications, especially in the context of search. 1. (b-bit) Minwise hashing requires an expensive preprocessing step that computes k (e.g., 500) minimal values after applying the corresponding permutations to each data vector. We developed a parallelization scheme using GPUs and observed that the preprocessing time can be reduced by a factor of 20-80 and becomes substantially smaller than the data loading time. 2. One major advantage of b-bit minwise hashing is that it can substantially reduce the amount of memory required for batch learning. However, as online algorithms become increasingly popular for large-scale learning in the context of search, it is not clear whether b-bit minwise hashing yields significant improvements for them. This paper demonstrates that b-bit minwise hashing provides an effective data size/dimension reduction scheme and hence it can d...
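
    The preprocessing step discussed above, computing k minwise hashes per data vector and keeping only the lowest b bits of each, can be sketched as follows. The hash family (random affine maps modulo a large prime) and the parameter values are common practical stand-ins, used here as assumptions; the authors' GPU parallelization is not shown.

        # Minimal sketch of b-bit minwise hashing preprocessing for sparse binary vectors
        # represented as sets of element ids (assumed < 2^31).
        import numpy as np

        def b_bit_minwise(sets, k=500, b=1, seed=0):
            rng = np.random.default_rng(seed)
            prime = 2_147_483_647                        # Mersenne prime 2^31 - 1
            a = rng.integers(1, prime, size=k, dtype=np.int64)
            c = rng.integers(0, prime, size=k, dtype=np.int64)
            out = np.empty((len(sets), k), dtype=np.uint32)
            for i, s in enumerate(sets):
                elems = np.fromiter(s, dtype=np.int64)
                hashed = (a[:, None] * elems[None, :] + c[:, None]) % prime   # (k, |set|)
                out[i] = hashed.min(axis=1) & ((1 << b) - 1)                  # keep lowest b bits
            return out

        sketches = b_bit_minwise([{1, 5, 9}, {1, 5, 42}], k=64, b=1)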

  10. Land 3D-Seismic Data: Preprocessing Quality Control Utilizing Survey Design Specifications, Noise Properties, Normal Moveout, First Breaks, and Offset

    Institute of Scientific and Technical Information of China (English)

    Abdelmoneam Raef

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on a cost-effective and robust quality control and assurance (QC/QA) workflow for 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data.

  11. Land 3D-seismic data: Preprocessing quality control utilizing survey design specifications, noise properties, normal moveout, first breaks, and offset

    Science.gov (United States)

    Raef, A.

    2009-01-01

    The recent proliferation of the 3D reflection seismic method into the near-surface area of geophysical applications, especially in response to the emergence of the need to comprehensively characterize and monitor near-surface carbon dioxide sequestration in shallow saline aquifers around the world, justifies the emphasis on cost-effective and robust quality control and assurance (QC/QA) workflow of 3D seismic data preprocessing that is suitable for near-surface applications. The main purpose of our seismic data preprocessing QC is to enable the use of appropriate header information, data that are free of noise-dominated traces, and/or flawed vertical stacking in subsequent processing steps. In this article, I provide an account of utilizing survey design specifications, noise properties, first breaks, and normal moveout for rapid and thorough graphical QC/QA diagnostics, which are easy to apply and efficient in the diagnosis of inconsistencies. A correlated vibroseis time-lapse 3D-seismic data set from a CO2-flood monitoring survey is used for demonstrating QC diagnostics. An important by-product of the QC workflow is establishing the number of layers for a refraction statics model in a data-driven graphical manner that capitalizes on the spatial coverage of the 3D seismic data. © China University of Geosciences (Wuhan) and Springer-Verlag GmbH 2009.

  12. Research on Telemetry Data Preprocessing of Missiles

    Institute of Scientific and Technical Information of China (English)

    张东; 吴晓琳

    2011-01-01

    Data preprocessing is an important step in missile telemetry data processing. In this paper, data preprocessing is first expounded and discussed. Practical engineering methods are then given, and the corresponding decision rules and formulas are set forth according to the requirements of the test tasks. Numerous trial applications show that these methods are simple, effective and highly reliable, laying the foundation for telemetry data processing.

  13. An open-access software platform for the pre-processing of Earth Observation data from the MSG SEVIRI radiometer

    Science.gov (United States)

    Petropoulos, George; Sandric, Ionut; Anagnostopoulos, Vasilios

    2015-04-01

    The Spinning Enhanced Visible and Infrared Imager (SEVIRI) is a multispectral sensor and one of the main instruments on board the MSG series of platforms. From a geostationary orbit, the radiometer obtains coverage of Europe every 15 minutes, and it can also acquire data every 5 minutes in the Rapid Scanning Service mode at the expense of coverage. SEVIRI has 12 spectral bands, five of which operate in the infrared wavelengths. For the purpose of the present document, it should be mentioned that the instrument has a geometric resolution of 1 km at nadir for the high-resolution visible channel and 3 km for the other spectral bands. Detailed information on the SEVIRI specification and operation can be found on the EUMETSAT website. A series of data from the SEVIRI instrument is currently provided by EUMETSAT in an operational mode, making a significant contribution to weather forecasting and global climate monitoring. Herein, a software tool developed in the Python programming language is presented which allows basic pre-processing of the raw SEVIRI data acquired from EUMETSAT. The tool implements key image processing steps on the SEVIRI data, including but not limited to data registration, country subsetting, masking and reprojection to any national or global coordinate system. SEVIRI data validation against reference data (e.g., from in-situ measurements if available) and generation of new datasets with ordinary linear regressions are other capabilities. The tool makes use of present-day multicore processors and is able to process very large datasets quickly. The practical usefulness of the software tool is also demonstrated using a variety of examples. Our work is significant and very timely to the users' community, given that to our knowledge there is no similar tool available at present to the SEVIRI users' community, particularly so in the light of the wide range of operationally distributed EO products from

  14. The Combined Effect of Filters in ECG Signals for Pre-Processing

    OpenAIRE

    Isha V. Upganlawar; Harshal Chowhan

    2014-01-01

    The ECG signal is abruptly changing and continuous in nature. The heart disease such as paroxysmal of heart, arrhythmia diagnosing, are related with the intelligent health care decision this ECG signal need to be pre-process accurately for further action on it such as extracting the features, wavelet decomposition, distribution of QRS complexes in ECG recordings and related information such as heart rate and RR interval, classification of the signal by using various classifiers etc. Filters p...

  15. Data preprocessing for a vehicle-based localization system used in road traffic applications

    Science.gov (United States)

    Patelczyk, Timo; Löffler, Andreas; Biebl, Erwin

    2016-09-01

    This paper presents a fixed-point implementation of the preprocessing using a field programmable gate array (FPGA), which is required for multipath joint angle and delay estimation (JADE) used in road traffic applications. This paper lays the foundation for many model-based parameter estimation methods. Here, a simulation of a vehicle-based localization system application for protecting vulnerable road users, who were equipped with appropriate transponders, is considered. For such safety-critical applications, the robustness and real-time capability of the localization is particularly important. An additional motivation for using a fixed-point implementation for the data preprocessing is the limited computing power of the head unit of a vehicle. This study aims to process the raw data provided by the localization system used in this paper. The data preprocessing applied includes a wideband calibration of the physical localization system, separation of relevant information from the received sampled signal, and preparation of the incoming data via further processing. Further, a channel matrix estimation was implemented to complete the data preprocessing, which contains information on channel parameters, e.g., the positions of the objects to be located. In the presented case of a vehicle-based localization system application we assume an urban environment, in which multipath propagation occurs. Since most methods for localization are based on uncorrelated signals, this fact must be addressed. Hence, a decorrelation of the incoming data stream is required for further localization. This decorrelation was accomplished by considering several snapshots in different time slots. As a final aspect of the use of fixed-point arithmetic, quantization errors are considered. In addition, the resources and runtime of the presented implementation are discussed; these factors are strongly linked to a practical implementation.

  16. A clinical evaluation of the RNCA study using Fourier filtering as a preprocessing method

    Energy Technology Data Exchange (ETDEWEB)

    Robeson, W.; Alcan, K.E.; Graham, M.C.; Palestro, C.; Oliver, F.H.; Benua, R.S.

    1984-06-01

    Forty-one patients (25 male, 16 female) were studied by radionuclide cardioangiography (RNCA) in our institution. There were 42 rest studies and 24 stress studies (66 studies total). Sixteen patients were normal, 15 had ASHD, seven had a cardiomyopathy, and three had left-sided valvular regurgitation. Each study was preprocessed using both the standard nine-point smoothing method and Fourier filtering. Amplitude and phase images were also generated. Both preprocessing methods were compared with respect to image quality, border definition, reliability and reproducibility of the LVEF, and cine wall motion interpretation. Image quality and border definition were judged superior by the consensus of two independent observers in 65 of 66 studies (98%) using Fourier-filtered data. The LVEF differed between the two processes by more than 0.05 in 17 of 66 studies (26%), including five studies in which the LVEF could not be determined using nine-point smoothed data. LV wall motion was normal by both techniques in all control patients by cine analysis. However, cine wall motion analysis using Fourier-filtered data demonstrated additional abnormalities in 17 of 25 studies (68%) in the ASHD group, including three uninterpretable studies using nine-point smoothed data. In the cardiomyopathy/valvular heart disease group, ten of 18 studies (56%) had additional wall motion abnormalities using Fourier-filtered data (including four uninterpretable studies using nine-point smoothed data). We conclude that Fourier filtering is superior to the nine-point smoothing preprocessing method now in general use in terms of image quality, border definition, generation of an LVEF, and cine wall motion analysis. The advent of the array processor makes routine preprocessing by Fourier filtering a feasible technologic advance in the development of the RNCA study.
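
    To illustrate the two preprocessing options being compared, the sketch below applies a nine-point moving-average smooth and a low-pass Fourier filter to a synthetic time-activity curve. The original study applies these methods to gated image data; the one-dimensional curve, the noise level and the harmonic cutoff here are assumptions made only for illustration.

        # Minimal sketch: nine-point smoothing versus low-pass Fourier filtering of a toy curve.
        import numpy as np

        def nine_point_smooth(curve):
            kernel = np.ones(9) / 9.0
            return np.convolve(curve, kernel, mode="same")

        def fourier_filter(curve, keep_harmonics=4):
            spectrum = np.fft.rfft(curve)
            spectrum[keep_harmonics + 1:] = 0            # zero everything above the kept harmonics
            return np.fft.irfft(spectrum, n=len(curve))

        frames = np.arange(24)
        curve = 1000 + 300 * np.sin(2 * np.pi * frames / 24) + np.random.normal(0, 30, 24)
        smoothed = nine_point_smooth(curve)
        filtered = fourier_filter(curve)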

  17. Pre-Processing and Re-Weighting Jet Images with Different Substructure Variables

    CERN Document Server

    Huynh, Lynn

    2016-01-01

    This work is an extension of Monte Carlo simulation based studies in tagging boosted, hadronically decaying W bosons at a center of mass energy of √s = 13 TeV. Two pre-processing techniques used with jet images, translation and rotation, are first examined. The generated jet images for W signal jets and QCD background jets are then rescaled and weighted with five different substructure variables for visual comparison.
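
    The two pre-processing steps examined above, translation and rotation of the jet image, might look roughly as follows on a toy image: shift the hottest pixel to the centre, then rotate so the energy-weighted centroid lies along one axis. The image size and this particular alignment rule are assumptions used only for illustration, not the analysis' actual procedure.

        # Minimal sketch of jet-image translation and rotation pre-processing.
        import numpy as np
        from scipy import ndimage

        def preprocess_jet_image(image):
            # Translation: shift the hottest pixel to the image centre
            peak = np.unravel_index(np.argmax(image), image.shape)
            centre = (np.array(image.shape) - 1) / 2.0
            shifted = ndimage.shift(image, centre - np.array(peak), order=1, mode="constant")
            # Rotation: align the energy-weighted centroid direction with the vertical axis
            ys, xs = np.indices(shifted.shape)
            w = shifted / (shifted.sum() + 1e-12)
            dy, dx = ((ys - centre[0]) * w).sum(), ((xs - centre[1]) * w).sum()
            angle = np.degrees(np.arctan2(dx, dy))
            return ndimage.rotate(shifted, angle, reshape=False, order=1, mode="constant")

        jet = np.random.rand(25, 25) ** 4        # toy sparse "jet image"
        aligned = preprocess_jet_image(jet)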

  18. Preprocessing techniques to reduce atmospheric and sensor variability in multispectral scanner data.

    Science.gov (United States)

    Crane, R. B.

    1971-01-01

    Multispectral scanner data are potentially useful in a variety of remote sensing applications. Large-area surveys of earth resources carried out by automated recognition processing of these data are particularly important. However, the practical realization of such surveys is limited by a variability in the scanner signals that results in improper recognition of the data. This paper discusses ways by which some of this variability can be removed from the data by preprocessing with resultant improvements in recognition results.

  19. Pre-Processing Noise Cross-Correlations with Equalizing the Network Covariance Matrix Eigen-Spectrum

    Science.gov (United States)

    Seydoux, L.; de Rosny, J.; Shapiro, N.

    2016-12-01

    Theoretically, the extraction of Green functions from noise cross-correlation requires the ambient seismic wavefield to be generated by uncorrelated sources evenly distributed in the medium. Yet, this condition is often not verified. Strong events such as earthquakes often produce highly coherent transient signals. Also, the microseismic noise is generated at specific places on the Earth's surface, with source regions often very localized in space. Different localized and persistent seismic sources may contaminate the cross-correlations of continuous records, resulting in spurious arrivals or asymmetry and, finally, in biased travel-time measurements. Pre-processing techniques therefore must be applied to the seismic data in order to reduce the effect of noise anisotropy and the influence of strong localized events. Here we describe a pre-processing approach that uses the covariance matrix computed from signals recorded by a network of seismographs. We extend the widely used time and spectral equalization pre-processing to the equalization of the covariance matrix spectrum (i.e., its ordered eigenvalues). This approach can be considered as a spatial equalization. This method allows us to correct for the wavefield anisotropy in two ways: (1) the influence of strong directive sources is substantially attenuated, and (2) the weakly excited modes are reinforced, allowing us to partially recover the conditions required for Green's function retrieval. We also present an eigenvector-based spatial filter used to distinguish between surface and body waves. This last filter is used together with the equalization of the eigenvalue spectrum. We simulate a two-dimensional wavefield in a heterogeneous medium with a strongly dominating source. We show that our method greatly improves the travel-time measurements obtained from the inter-station cross-correlation functions. Also, we apply the developed method to the USArray data and pre-process the continuous records strongly influenced
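
    A heavily simplified sketch of the idea, building a network covariance matrix, flattening its eigenvalue spectrum and whitening the records before cross-correlation, is given below. The actual method works on covariance matrices estimated per frequency over many time windows; the single-window treatment here is an assumption made only to keep the example short.

        # Minimal sketch: equalize the eigenvalue spectrum of the network covariance matrix.
        import numpy as np

        def equalize_covariance(records):
            """records: (n_stations, n_samples) array of synchronous seismic traces."""
            spectra = np.fft.rfft(records, axis=1)                     # per-station spectra
            cov = spectra @ spectra.conj().T / spectra.shape[1]        # network covariance matrix
            eigval, eigvec = np.linalg.eigh(cov)
            # Whitening operator: flattening the eigenvalue spectrum down-weights dominant sources
            w = eigvec @ np.diag(1.0 / np.sqrt(np.maximum(eigval.real, 1e-12))) @ eigvec.conj().T
            return np.fft.irfft(w @ spectra, n=records.shape[1], axis=1)

        traces = np.random.randn(10, 4096)       # 10 stations, toy continuous records
        preprocessed = equalize_covariance(traces)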

  20. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2015-11-01

    The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems occurring in Matlab when trying to analyse this type of image. Moreover, new methods are discussed which provide source code in Matlab that can be used in practice without any licensing restrictions. A proposed application and a sample result of hyperspectral image analysis are also presented.

  1. Review of Data Preprocessing Methods for Sign Language Recognition Systems based on Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Zorins Aleksejs

    2016-12-01

    Full Text Available The article presents an introductory analysis of a research topic relevant to the Latvian deaf community: the development of a Latvian Sign Language Recognition System. More specifically, data preprocessing methods are discussed in the paper and several approaches are shown, with a focus on systems based on artificial neural networks, which are among the most successful solutions for the sign language recognition task.

  2. Control Evaluation Information System Savings

    Directory of Open Access Journals (Sweden)

    Eddy Sutedjo

    2011-05-01

    Full Text Available The purpose of this research is to evaluate the control of a savings information system in banking and to identify the weaknesses and problems occurring in that savings system. The research methods used are literature studies, collecting the data and information needed, and field studies comprising interviews, observation, questionnaires, and checklists, using the COBIT method as the standard for assessing the company's information system control. The expected result is an evaluation report that presents the problems found and the recommendations given, and that provides a view of the controls implemented by the company. The conclusion drawn from this research is that this banking company has met the standards, although some weaknesses still exist in the system. Index Terms - Control Information System, Savings

  3. Saving Electricity and Demand Response

    Science.gov (United States)

    Yamaguchi, Nobuyuki

    A lot of people lost their lives in the tremendous earthquake in the Tohoku region on March 11. A large share of the electric power plant capacity in the TEPCO area was also damaged, and a large-scale power shortage is predicted for this summer. In this situation, electricity customers are making great efforts to save electricity to avoid planned outages. Customers take action not only individually but also through cooperative movements. All actions actually taken are responses to requests from the government or voluntary decisions. On the other hand, demand response based on a financial stimulus is not observed as actual behavior; saving electricity through this kind of demand response is only discussed in the newspapers. In this commentary, the events regarding electricity-saving measures after the disaster are described, and the discussions on demand response, especially a rise in power rates, are given shape in the context of this electricity supply-demand gap.

  4. Shared energy savings (SES) contracting

    Energy Technology Data Exchange (ETDEWEB)

    Aldridge, D.R. Jr. [Army Corps of Engineers, Huntsville, AL (United States)

    1995-11-01

    This paper discusses the use of a Shared Energy Savings (SES) contract as the procurement vehicle to provide, install, and maintain closed-loop ground-coupled heat pumps (CLGCHPs) for 4,003 family-housing units at Fort Polk, Louisiana. In addition to the requirement relative to heat pumps, the contract allows the energy service company (ESCO) to propose additional projects needed to take full advantage of energy cost-saving opportunities that may exist at Fort Polk. The paper traces the development of the SES contract from feasibility study through development of the request for proposal (RFP) to contract award and implementation. In tracing this development, technical aspects of the project are set forth and various benefits inherent in SES contracting are indicated. The paper concludes that, due to the positive motivation inherent in the shared-savings and partnering aspects of SES contracts, SES contracting is well suited for use as a procurement vehicle.

  5. Data Cleaning In Data Warehouse: A Survey of Data Pre-processing Techniques and Tools

    Directory of Open Access Journals (Sweden)

    Anosh Fatima

    2017-03-01

    Full Text Available A data warehouse is a computer system designed for storing and analyzing an organization's historical data from day-to-day operations in an Online Transaction Processing System (OLTP). Usually, an organization summarizes and copies information from its operational systems to the data warehouse on a regular schedule, and management performs complex queries and analysis on the information without slowing down the operational systems. Data need to be pre-processed to improve their quality before being stored in the data warehouse. This survey paper presents data cleaning problems and the approaches currently in use for preprocessing. The main goal of this paper is to determine which preprocessing technique is best in which scenario for improving the performance of a data warehouse. Many techniques have been analyzed for data cleansing, using certain evaluation attributes, and tested on different kinds of data sets. Data quality tools such as YALE, ALTERYX, and WEKA have been used to obtain conclusive results, to ready the data for the data warehouse and ensure that only cleaned data populate the warehouse, thus enhancing its usability. The results of the paper can be useful in many future activities such as cleansing, standardizing, correction, matching and transformation. This research can help in data auditing and pattern detection in the data.

  6. Super-resolution algorithm based on sparse representation and wavelet preprocessing for remote sensing imagery

    Science.gov (United States)

    Ren, Ruizhi; Gu, Lingjia; Fu, Haoyang; Sun, Chenglin

    2017-04-01

    An effective super-resolution (SR) algorithm is proposed for actual spectral remote sensing images based on sparse representation and wavelet preprocessing. The proposed SR algorithm mainly consists of dictionary training and image reconstruction. Wavelet preprocessing is used to establish four subbands, i.e., low frequency, horizontal, vertical, and diagonal high frequency, for an input image. As compared to the traditional approaches involving the direct training of image patches, the proposed approach focuses on the training of features derived from these four subbands. The proposed algorithm is verified using different spectral remote sensing images, e.g., moderate-resolution imaging spectroradiometer (MODIS) images with different bands, and the latest Chinese Jilin-1 satellite images with high spatial resolution. According to the visual experimental results obtained from the MODIS remote sensing data, the SR images using the proposed SR algorithm are superior to those using a conventional bicubic interpolation algorithm or traditional SR algorithms without preprocessing. Fusion algorithms, e.g., standard intensity-hue-saturation, principal component analysis, wavelet transform, and the proposed SR algorithms are utilized to merge the multispectral and panchromatic images acquired by the Jilin-1 satellite. The effectiveness of the proposed SR algorithm is assessed by parameters such as peak signal-to-noise ratio, structural similarity index, correlation coefficient, root-mean-square error, relative dimensionless global error in synthesis, relative average spectral error, spectral angle mapper, and the quality index Q4, and its performance is better than that of the standard image fusion algorithms.
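
    The wavelet preprocessing step described above, a single-level 2-D discrete wavelet transform splitting an image into low-frequency, horizontal, vertical and diagonal subbands, can be sketched with the PyWavelets package as below; the Haar wavelet and the synthetic input are illustrative assumptions, not the paper's choices.

        # Minimal sketch: split an image into the four wavelet subbands used for feature training.
        import numpy as np
        import pywt

        image = np.random.rand(256, 256)                    # stand-in for one remote sensing band
        approx, (horizontal, vertical, diagonal) = pywt.dwt2(image, "haar")

        subbands = {
            "low frequency": approx,
            "horizontal detail": horizontal,
            "vertical detail": vertical,
            "diagonal detail": diagonal,
        }
        for name, band in subbands.items():
            print(name, band.shape)                         # each subband is 128 x 128 here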

  7. Desktop Software for Patch-Clamp Raw Binary Data Conversion and Preprocessing

    Directory of Open Access Journals (Sweden)

    Ning Zhang

    2011-01-01

    Full Text Available Since raw data recorded by patch-clamp systems are always stored in binary format, electrophysiologists may experience difficulties with patch-clamp data preprocessing, especially when they want to analyze the data with custom-designed algorithms. In this study, we present desktop software, called PCDReader, which can be an effective and convenient solution for patch-clamp data preprocessing in daily laboratory use. We designed a novel class module, called clsPulseData, to directly read the raw data along with the parameters recorded from HEKA instruments without any other program support. Through a graphical user interface, raw binary data files can be converted into several kinds of ASCII text files for further analysis, with several preprocessing options. The parameters can also be viewed, modified and exported into ASCII files through a user-friendly Explorer-style window. The real-time data loading technique and optimized memory management programming make PCDReader a fast and efficient tool. The compiled software along with the source code of the clsPulseData class module is freely available to academic and nonprofit users.

  8. Learning-based image preprocessing for robust computer-aided detection

    Science.gov (United States)

    Raghupathi, Laks; Devarakota, Pandu R.; Wolf, Matthias

    2013-03-01

    Recent studies have shown that low-dose computed tomography (LDCT) can be an effective screening tool to reduce lung cancer mortality. Computer-aided detection (CAD) would be a beneficial second reader for radiologists in such cases. Studies demonstrate that while iterative reconstructions (IR) improve LDCT diagnostic quality, they degrade CAD performance significantly (increased false positives) when applied directly. For improving CAD performance, solutions such as retraining with newer data or applying a standard preprocessing technique may not suffice due to the high prevalence of CT scanners and non-uniform acquisition protocols. Here, we present a learning-based framework that can adaptively transform a wide variety of input data to boost an existing CAD performance. This not only enhances their robustness but also their applicability in clinical workflows. Our solution consists of automatically applying a suitable pre-processing filter on the given image based on its characteristics. This requires the preparation of ground truth (GT) for choosing an appropriate filter that results in improved CAD performance. Accordingly, we propose an efficient consolidation process with a novel metric. Using key anatomical landmarks, we then derive consistent feature descriptors for the classification scheme, which then uses a priority mechanism to automatically choose an optimal preprocessing filter. We demonstrate CAD prototype∗ performance improvement using hospital-scale datasets acquired from North America, Europe and Asia. Though we demonstrated our results for a lung nodule CAD, this scheme is straightforward to extend to other post-processing tools dedicated to other organs and modalities.

  9. Evaluating the validity of spectral calibration models for quantitative analysis following signal preprocessing.

    Science.gov (United States)

    Chen, Da; Grant, Edward

    2012-11-01

    When paired with high-powered chemometric analysis, spectrometric methods offer great promise for the high-throughput analysis of complex systems. Effective classification or quantification often relies on signal preprocessing to reduce spectral interference and optimize the apparent performance of a calibration model. However, less frequently addressed by systematic research is the effect of preprocessing on the statistical accuracy of a calibration result. The present work demonstrates the effectiveness of two criteria for validating the performance of signal preprocessing in multivariate models in the important dimensions of bias and precision. To assess the extent of bias, we explore the applicability of the elliptic joint confidence region (EJCR) test and devise a new means to evaluate precision by a bias-corrected root mean square error of prediction. We show how these criteria can effectively gauge the success of signal pretreatments in suppressing spectral interference while providing a straightforward means to determine the optimal level of model complexity. This methodology offers a graphical diagnostic by which to visualize the consequences of pretreatment on complex multivariate models, enabling optimization with greater confidence. To demonstrate the application of the EJCR criterion in this context, we evaluate the validity of representative calibration models using standard pretreatment strategies on three spectral data sets. The results indicate that the proposed methodology facilitates the reliable optimization of a well-validated calibration model, thus improving the capability of spectrophotometric analysis.
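
    The two validation criteria named above can be sketched as follows: regress predicted against reference values and test whether the ideal point (intercept 0, slope 1) lies inside the elliptic joint confidence region, and report a bias-corrected root mean square error of prediction. The statistics below follow the standard textbook forms and are assumptions about, not a reproduction of, the authors' exact computation.

        # Minimal sketch: EJCR test for bias and a bias-corrected RMSEP for precision.
        import numpy as np
        from scipy import stats

        def ejcr_contains_ideal(reference, predicted, alpha=0.05):
            n = len(reference)
            X = np.column_stack([np.ones(n), reference])
            beta, *_ = np.linalg.lstsq(X, predicted, rcond=None)     # [intercept, slope]
            resid = predicted - X @ beta
            s2 = resid @ resid / (n - 2)
            diff = beta - np.array([0.0, 1.0])
            # (b - b0)' X'X (b - b0) / (2 s^2) ~ F(2, n-2) under the ideal-fit hypothesis
            f_stat = diff @ (X.T @ X) @ diff / (2 * s2)
            return f_stat <= stats.f.ppf(1 - alpha, 2, n - 2)

        def bias_corrected_rmsep(reference, predicted):
            bias = np.mean(predicted - reference)
            return bias, np.sqrt(np.mean((predicted - reference - bias) ** 2))

        ref = np.linspace(1, 10, 30)
        pred = ref + np.random.normal(0, 0.2, ref.size)               # toy calibration output
        print(ejcr_contains_ideal(ref, pred), bias_corrected_rmsep(ref, pred))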

  10. Characterizing the continuously acquired cardiovascular time series during hemodialysis, using median hybrid filter preprocessing noise reduction.

    Science.gov (United States)

    Wilson, Scott; Bowyer, Andrea; Harrap, Stephen B

    2015-01-01

    The clinical characterization of cardiovascular dynamics during hemodialysis (HD) has important pathophysiological implications from diagnostic, cardiovascular risk assessment, and treatment efficacy perspectives. Currently the diagnosis of significant intradialytic systolic blood pressure (SBP) changes among HD patients is imprecise and opportunistic, reliant upon the presence of hypotensive symptoms in conjunction with coincident but isolated noninvasive brachial cuff blood pressure (NIBP) readings. Considering hemodynamic variables as a time series makes a continuous recording approach more desirable than intermittent measures; however, in the clinical environment, the data signal is susceptible to corruption due to both impulsive and Gaussian-type noise. Signal preprocessing is an attractive solution to this problem. Prospectively collected continuous noninvasive SBP data over the short-break intradialytic period in ten patients were preprocessed using a novel median hybrid filter (MHF) algorithm and compared with 50 time-coincident pairs of intradialytic NIBP measures from routine HD practice. The median hybrid preprocessing technique for continuously acquired cardiovascular data yielded a dynamic regression without significant noise and artifact, suitable for high-level profiling of time-dependent SBP behavior. Signal accuracy is highly comparable with standard NIBP measurement, with the added clinical benefit of dynamic real-time hemodynamic information.
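
    A median hybrid filter of the kind referred to above can be sketched as follows: each output sample is the median of the left-window mean, the current sample and the right-window mean, which rejects impulsive artefacts while tracking genuine trends. The window length and the synthetic blood-pressure trace are illustrative assumptions, not the authors' algorithm.

        # Minimal sketch of a median hybrid filter for impulsive-noise rejection.
        import numpy as np

        def median_hybrid_filter(x, half_window=5):
            x = np.asarray(x, dtype=float)
            out = x.copy()
            for i in range(half_window, len(x) - half_window):
                left = x[i - half_window:i].mean()              # mean of the left sub-window
                right = x[i + 1:i + 1 + half_window].mean()     # mean of the right sub-window
                out[i] = np.median([left, x[i], right])
            return out

        sbp = 120 + np.random.normal(0, 2, 300)   # toy continuous SBP trace
        sbp[100] = 220                            # impulsive artefact
        clean = median_hybrid_filter(sbp)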

  11. Foveal processing difficulty does not affect parafoveal preprocessing in young readers

    Science.gov (United States)

    Marx, Christina; Hawelka, Stefan; Schuster, Sarah; Hutzler, Florian

    2017-01-01

    Recent evidence suggested that parafoveal preprocessing develops early during reading acquisition, that is, young readers profit from valid parafoveal information and exhibit a resultant preview benefit. For young readers, however, it is unknown whether the processing demands of the currently fixated word modulate the extent to which the upcoming word is parafoveally preprocessed – as it has been postulated (for adult readers) by the foveal load hypothesis. The present study used the novel incremental boundary technique to assess whether 4th and 6th Graders exhibit an effect of foveal load. Furthermore, we attempted to distinguish the foveal load effect from the spillover effect. These effects are hard to differentiate with respect to the expected pattern of results, but are conceptually different. The foveal load effect is supposed to reflect modulations of the extent of parafoveal preprocessing, whereas the spillover effect reflects the ongoing processing of the previous word whilst the reader’s fixation is already on the next word. The findings revealed that the young readers did not exhibit an effect of foveal load, but a substantial spillover effect. The implications for previous studies with adult readers and for models of eye movement control in reading are discussed. PMID:28139718

  12. Energy supply and energy saving in Ukraine

    Directory of Open Access Journals (Sweden)

    V.M. Ilchenko

    2015-09-01

    Full Text Available The article examines the main problems and solutions of energy saving and energy supply in Ukraine. Low energy efficiency has become one of the main factors of the crisis in the Ukrainian economy. The most relevant scientific and methodical approaches to assessing the level of energy consumption and saving are indicated. A comparative analysis of annual energy use has been made. The potential to solve energy supply problems is strongly correlated with the ability to ensure the innovative development of the economy for efficient and economical use of existing and imported energy resources. Ways of reducing energy resource consumption are suggested. Creating the technological conditions for the use of alternative energy sources is also considered rational. The development of renewable sources of energy (alternative and renewable energy sources) will provide a significant effect in reducing the use of traditional energy sources, harmful emissions and greenhouse gas. Under these conditions, increasing the energy efficiency of the economy and its competitiveness becomes realistic. Improvement of the environmental and social conditions of the country's citizens will mark a positive step towards the EU and will also remove some problems for future generations.

  13. Are Women Empowered to Save?

    Directory of Open Access Journals (Sweden)

    Frances Woolley

    2013-12-01

    Full Text Available Female economic empowerment – rising earnings, increased opportunities, greater labour force participation – has given many women the means to save. The shifting of responsibility for retirement security from employers and governments onto individuals has given women a reason to save. But are women actually saving? In this paper, we explore the relationship between the gender dynamics within a family and the accumulation of wealth. We find little evidence in support of the conventional wisdom that families with a female financial manager save more and repay their debts more often. We find some evidence that male financial management leads to greater savings, and other evidence suggesting that savings patterns have a complex relationship with intra-family gender dynamics.

  14. Savings, remittances, and return migration.

    Science.gov (United States)

    Merkle, L; Zimmermann, K F

    1992-01-01

    "We use a data set of immigrants to West Germany to simultaneously study both savings and remittances which we relate to individual characteristics, economic variables, migration experiences and remigration plans. Section 2 discusses the basic hypotheses and explains the data. Section 3 presents the empirical study and Section 4 summarizes." The results suggest that "savings and remittances of migrants can be well explained by remigration plans and economic as well as demographic variables. However, the planned future duration of residence in Germany has a negative and significant effect only on remittances."

  15. Saving Face and Group Identity

    DEFF Research Database (Denmark)

    Eriksson, Tor; Mao, Lei; Villeval, Marie-Claire

    2015-01-01

    Are people willing to sacrifice resources to save one's and others' face? In a laboratory experiment, we study whether individuals forego resources to avoid the public exposure of the least performer in their group. We show that a majority of individuals are willing to pay to preserve not only their self- but also other group members' image. This behavior is frequent even in the absence of group identity. When group identity is more salient, individuals help regardless of whether the least performer is an in-group or an out-group. This suggests that saving others' face is a strong social norm.

  16. Saving-Based Asset Pricing

    DEFF Research Database (Denmark)

    Dreyer, Johannes Kabderian; Schneider, Johannes; T. Smith, William

    2013-01-01

    This paper explores the implications of a novel class of preferences for the behavior of asset prices. Following a suggestion by Marshall (1920), we entertain the possibility that people derive utility not only from consumption, but also from the very act of saving. These ‘‘saving-based’’ preferences are related to models of habit formation and the spirit of capitalism, but incorporate the feature that people have anticipatory habits because they care about the future accumulation of wealth. We derive the Euler equations for these preferences and estimate them with GMM. Our estimates suggest

  17. Energy Savings in a Market Economy

    DEFF Research Database (Denmark)

    Nørgaard, Jørgen

    1998-01-01

    The paper outlines the concept of energy savings as opposed to energy efficiency. It then briefly describes the rise and fall of energy savings in recent Danish energy policy. The failure of leaving electricity savings and Integrated Resource Planning to the electricity sector is described and discussed. This finally leads to the difficulties of combining a freer market for energy in the EU with strong efforts to save energy. Ideas for electricity and heat saving policies are suggested.

  18. Saving Face and Group Identity

    DEFF Research Database (Denmark)

    Eriksson, Tor; Mao, Lei; Villeval, Marie-Claire

    2015-01-01

    Are people willing to sacrifice resources to save one's and others' face? In a laboratory experiment, we study whether individuals forego resources to avoid the public exposure of the least performer in their group. We show that a majority of individuals are willing to pay to preserve not only th...

  19. Save the Boulders Beach Penguins

    Science.gov (United States)

    Sheerer, Katherine; Schnittka, Christine

    2012-01-01

    Maybe it's the peculiar way they walk or their cute little suits, but students of all ages are drawn to penguins. To meet younger students' curiosity, the authors adapted a middle-school level, penguin-themed curriculum unit called Save the Penguins (Schnittka, Bell, and Richards 2010) for third-grade students. The students loved learning about…

  20. Fong's: Saving Water in Dyeing

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In an effort to save the precious water resource and reduce the environmental impact, Fong's Industries Group, along with its member companies, namely "Fong's National", "THEN", "Goller" and "Fong's Water Technology", provides an ecological dyeing solution to reduce water consumption drastically through their innovative technologies, covering the processes from yarn dyeing to piece dyeing and the recycling of discharge after dyeing and finishing.

  1. Social Capital and Savings Behaviour

    DEFF Research Database (Denmark)

    Newman, Carol; Tarp, Finn; Van Den Broeck, Katleen

    We explore the extent to which social capital can play a role in imparting information about the returns to saving where potential knowledge gaps and mistrust exists. Using data from Vietnam we find strong evidence to support the hypothesis that information transmitted via reputable social...

  2. Save Our Streams and Waterways.

    Science.gov (United States)

    Indiana State Dept. of Education, Indianapolis. Center for School Improvement and Performance.

    Protection of existing water supplies is critical to ensuring good health for people and animals alike. This program is aligned with the Izaak Walton League of America's Save Our Streams program, which is based on the concept that students can greatly improve the quality of a nearby stream, pond, or river by regular visits and monitoring. The…

  3. Durable consumption, saving and retirement

    DEFF Research Database (Denmark)

    Andersen, Torben M.; Hermansen, Mikkel Nørlem

    2014-01-01

    welfare in stationary equilibrium, we find that a reduction in wealth locking-in in durables is not necessarily welfare improving due to the effects on bequests. From a social welfare perspective, individuals tend to choose too much financial savings, too little durable acquisition and too early retirement.

  4. Large Hospital 50% Energy Savings: Technical Support Document

    Energy Technology Data Exchange (ETDEWEB)

    Bonnema, E.; Studer, D.; Parker, A.; Pless, S.; Torcellini, P.

    2010-09-01

    This Technical Support Document documents the technical analysis and design guidance for large hospitals to achieve whole-building energy savings of at least 50% over ANSI/ASHRAE/IESNA Standard 90.1-2004 and represents a step toward determining how to provide design guidance for aggressive energy savings targets. This report documents the modeling methods used to demonstrate that the design recommendations meet or exceed the 50% goal. EnergyPlus was used to model the predicted energy performance of the baseline and low-energy buildings to verify that 50% energy savings are achievable. Percent energy savings are based on a nominal minimally code-compliant building and whole-building, net site energy use intensity. The report defines architectural-program characteristics for typical large hospitals, thereby defining a prototype model; creates baseline energy models for each climate zone that are elaborations of the prototype models and are minimally compliant with Standard 90.1-2004; creates a list of energy design measures that can be applied to the prototype model to create low-energy models; uses industry feedback to strengthen inputs for baseline energy models and energy design measures; and simulates low-energy models for each climate zone to show that when the energy design measures are applied to the prototype model, 50% energy savings (or more) are achieved.

  5. HEp-2 Cell Classification: The Role of Gaussian Scale Space Theory as A Pre-processing Approach

    OpenAIRE

    Qi, Xianbiao; Zhao, Guoying; Chen, Jie; Pietikäinen, Matti

    2015-01-01

    Indirect Immunofluorescence Imaging of Human Epithelial Type 2 (HEp-2) cells is an effective way to identify the presence of Anti-Nuclear Antibody (ANA). Most existing works on HEp-2 cell classification mainly focus on feature extraction, feature encoding and classifier design. Very few efforts have been devoted to studying the importance of the pre-processing techniques. In this paper, we analyze the importance of the pre-processing, and investigate the role of Gaussian Scale Space (GS...

  6. Complex and magnitude-only preprocessing of 2D and 3D BOLD fMRI data at 7 T.

    Science.gov (United States)

    Barry, Robert L; Strother, Stephen C; Gore, John C

    2012-03-01

    A challenge to ultra high field functional magnetic resonance imaging is the predominance of noise associated with physiological processes unrelated to tasks of interest. This degradation in data quality may be partially reversed using a series of preprocessing algorithms designed to retrospectively estimate and remove the effects of these noise sources. However, such algorithms are routinely validated only in isolation, and thus consideration of their efficacies within realistic preprocessing pipelines and on different data sets is often overlooked. We investigate the application of eight possible combinations of three pseudo-complementary preprocessing algorithms - phase regression, Stockwell transform filtering, and retrospective image correction - to suppress physiological noise in 2D and 3D functional data at 7 T. The performance of each preprocessing pipeline was evaluated using data-driven metrics of reproducibility and prediction. The optimal preprocessing pipeline for both 2D and 3D functional data included phase regression, Stockwell transform filtering, and retrospective image correction. This result supports the hypothesis that a complex preprocessing pipeline is preferable to a magnitude-only pipeline, and suggests that functional magnetic resonance imaging studies should retain complex images and externally monitor subjects' respiratory and cardiac cycles so that these supplementary data may be used to retrospectively reduce noise and enhance overall data quality.

  7. Energy Savings Measure Packages: Existing Homes

    Energy Technology Data Exchange (ETDEWEB)

    Casey, S.; Booten, C.

    2011-11-01

    This document presents the most cost-effective Energy Savings Measure Packages (ESMP) for existing mixed-fuel and all-electric homes to achieve 15% and 30% savings for each BetterBuildings grantee location across the US. These packages are optimized for minimum cost to homeowners at a given level of source energy savings, given the local climate and prevalent building characteristics (i.e., foundation types). Maximum cost savings are typically found between 30% and 50% energy savings over the reference home. The dollar value of the maximum annual savings varies significantly by location but typically amounts to $300 - $700/year.

  8. EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data

    Science.gov (United States)

    D'Amico, Giuseppe; Amodeo, Aldo; Mattis, Ina; Freudenthaler, Volker; Pappalardo, Gelsomina

    2016-02-01

    In this paper we describe an automatic tool for the pre-processing of aerosol lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of ELPP, particular attention has been paid to making the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of ELPP is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of ELPP. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. ELPP has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
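
    Two of the instrumental corrections listed above are easy to sketch for a photon-counting channel: dead-time correction and background subtraction. The sketch below assumes a non-paralyzable detector model and estimates the background from far-range bins; the actual ELPP implementation is more general and configurable.

        import numpy as np

        def deadtime_correct(rate_mhz, tau_ns):
            """Non-paralyzable dead-time correction of a count-rate profile (MHz = counts/microsecond)."""
            tau_us = tau_ns * 1e-3
            return rate_mhz / (1.0 - rate_mhz * tau_us)

        def subtract_background(profile, n_far_bins=500):
            """Estimate the background from the farthest range bins and subtract it."""
            return profile - profile[-n_far_bins:].mean()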

  9. AN ENHANCED PRE-PROCESSING RESEARCH FRAMEWORK FOR WEB LOG DATA USING A LEARNING ALGORITHM

    Directory of Open Access Journals (Sweden)

    V.V.R. Maheswara Rao

    2011-01-01

    Full Text Available With the continued growth and proliferation of Web services and Web-based information systems, the volumes of user data have reached astronomical proportions. Before such data can be analyzed with web mining techniques, the web log has to be pre-processed, integrated and transformed. As the World Wide Web continues to grow rapidly, web miners need intelligent tools to find, extract, filter and evaluate the desired information. The data pre-processing stage is the most important phase for investigating web usage behaviour. To do this, one must extract only the human user accesses from the web log data, which is a critical and complex task. The web log is incremental in nature, so conventional data pre-processing techniques have proved unsuitable; an extensive learning algorithm is required to obtain the desired information. This paper introduces an extensive research framework capable of pre-processing web log data completely and efficiently. The learning algorithm of the proposed framework separates human user and search engine accesses intelligently and in less time. In order to create suitable target data, the further essential pre-processing tasks of Data Cleansing, User Identification, Sessionization and Path Completion are designed collectively. The framework reduces the error rate and significantly improves the learning performance of the algorithm. The goodness of a split is ensured using popular measures such as Entropy and the Gini index. This framework helps to investigate web usage behaviour efficiently. Experimental results supporting this claim are given in the paper.
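
    The goodness-of-split measures mentioned in the abstract, entropy and the Gini index, are simple functions of the class proportions in a set of records. A minimal sketch with illustrative labels (human user vs. search-engine robot):

        from collections import Counter
        from math import log2

        def entropy(labels):
            """Shannon entropy of a list of class labels."""
            n = len(labels)
            return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

        def gini(labels):
            """Gini impurity of a list of class labels."""
            n = len(labels)
            return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

        accesses = ["human", "human", "robot", "human", "robot"]
        print(entropy(accesses), gini(accesses))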

  10. EARLINET Single Calculus Chain – technical – Part 1: Pre-processing of raw lidar data

    Directory of Open Access Journals (Sweden)

    G. D'Amico

    2015-10-01

    Full Text Available In this paper we describe an automatic tool for the pre-processing of lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. The ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, the ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. The ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of the ELPP module, particular attention has been paid to making the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of the ELPP module is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of the ELPP module. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. The ELPP module has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.

  11. Classification-based comparison of pre-processing methods for interpretation of mass spectrometry generated clinical datasets

    Directory of Open Access Journals (Sweden)

    Hoefsloot Huub CJ

    2009-05-01

    Full Text Available Abstract Background Mass spectrometry is increasingly being used to discover proteins or protein profiles associated with disease. Experimental design of mass-spectrometry studies has come under close scrutiny and the importance of strict protocols for sample collection is now understood. However, the question of how best to process the large quantities of data generated is still unanswered. Main challenges for the analysis are the choice of proper pre-processing and classification methods. While these two issues have been investigated in isolation, we propose to use the classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Results Two in-house generated clinical SELDI-TOF MS datasets are used in this study as an example of high throughput mass-spectrometry data. We perform a systematic comparison of two commonly used pre-processing methods as implemented in Ciphergen ProteinChip Software and in the Cromwell package. With respect to reproducibility, Ciphergen and Cromwell pre-processing are largely comparable. We find that the overlap between peaks detected by either Ciphergen ProteinChip Software or Cromwell is large. This is especially the case for the more stringent peak detection settings. Moreover, similarity of the estimated intensities between matched peaks is high. We evaluate the pre-processing methods using five different classification methods. Classification is done in a double cross-validation protocol using repeated random sampling to obtain an unbiased estimate of classification accuracy. No pre-processing method significantly outperforms the other for all peak detection settings evaluated. Conclusion We use classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Both pre-processing methods lead to similar classification results on an ovarian cancer and a Gaucher disease dataset. However, the settings for pre-processing
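
    The double cross-validation protocol mentioned above nests model selection (inner loop) inside an outer loop that estimates accuracy on left-out patients. A minimal sketch with scikit-learn on synthetic data; the classifier, parameter grid and fold counts are illustrative, not those of the study:

        import numpy as np
        from sklearn.model_selection import GridSearchCV, StratifiedKFold, cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(60, 100))    # 60 samples x 100 peak intensities (synthetic)
        y = rng.integers(0, 2, size=60)   # case/control labels (synthetic)

        inner = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)   # model selection
        outer = StratifiedKFold(n_splits=5, shuffle=True, random_state=1)   # accuracy estimation
        model = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=inner)
        print(cross_val_score(model, X, y, cv=outer).mean())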

  12. Energy savings in Polish buildings

    Energy Technology Data Exchange (ETDEWEB)

    Markel, L.C.; Gula, A.; Reeves, G.

    1995-12-31

    A demonstration of low-cost insulation and weatherization techniques was a part of phase 1 of the Krakow Clean Fossil Fuels and Energy Efficient Project. The objectives were to identify a cost-effective set of measures to reduce energy used for space heating, determine how much energy could be saved, and foster widespread implementation of those measures. The demonstration project focused on 4 11-story buildings in a Krakow housing cooperative. Energy savings of over 20% were obtained. Most important, the procedures and materials implemented in the demonstration project have been adapted to Polish conditions and applied to other housing cooperatives, schools, and hospitals. Additional projects are being planned, in Krakow and other cities, under the direction of FEWE-Krakow, the Polish Energie Cities Network, and Biuro Rozwoju Krakowa.

  13. Measure Guideline. Replacing Single-Speed Pool Pumps with Variable Speed Pumps for Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, A. [Building Media and the Building America Retrofit Alliance (BARA), Wilmington, DE (United States); Easley, S. [Building Media and the Building America Retrofit Alliance (BARA), Wilmington, DE (United States)

    2012-05-01

    This measure guideline evaluates potential energy savings by replacing traditional single-speed pool pumps with variable speed pool pumps, and provides a basic cost comparison between continued use of traditional pumps versus new pumps. A simple step-by-step process for inspecting the pool area and installing a new pool pump follows.

  14. Measure Guideline: Replacing Single-Speed Pool Pumps with Variable Speed Pumps for Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, A.; Easley, S.

    2012-05-01

    The report evaluates potential energy savings by replacing traditional single-speed pool pumps with variable speed pool pumps, and provides a basic cost comparison between continued use of traditional pumps versus new pumps. A simple step-by-step process for inspecting the pool area and installing a new pool pump follows.

  15. Energy saving in milk processing

    Directory of Open Access Journals (Sweden)

    M. Janzekovic

    2009-04-01

    Full Text Available Purpose: The purpose of this paper is to justify the replacement of the obsolete system for milk pasteurization and washing of the production line with a newer CIP (cleaning in place) system in the dairy. The latter ensures reliable washing and sterilization of lines and machines, which is one of the principal prerequisites for product quality. Design/methodology/approach: The measurements were performed with the installed CIP Module 5111 - 5116 equipment. The cleaning equipment is an 8-line satellite system. The SPS control and the visualization take place through the RAS (Remote Access) network. The visualization data are archived and the visualization is connected to the PC network. The worn Alfa Laval pasteurizer has been replaced by new Fischer equipment. Findings: The new CIP system provided 43% water savings compared with the old equipment. Savings of washing agents (caustic solution, acid) amounted to 11.5%. Due to the smaller need for energy (gas, electricity), energy costs were reduced by 19%. Research limitations/implications: The modern system for pasteurization and washing is closely connected with energy saving measures. It allows the production of safe milk products in accordance with HACCP (hazard analysis of critical control points) and reduces the hazard of injuries from chemicals. Practical implications: For any company, investments are a decisive factor for its growth and development. Modernization of the systems for washing production lines in dairies ensures cost reduction at all levels and the processing of milk into high-quality milk products. Originality/value: The new CIP energy saving system influences the costs of the dairy business and reduces the burden on the environment. Owing to the use of new equipment allowing a 20-second holding time, the pasteurization temperature has been reduced from 78°C to 76°C and, thus, the profitability of the pasteurization process has

  16. Can Investors Save The Planet?

    Institute of Scientific and Technical Information of China (English)

    MATTHEW PLOWRIGHT

    2008-01-01

    @@ The people packed into a smart function room in Beijing's Kerry Center Hotel did not, at first glance, seem likely candidates to save the planet. The men were decked out in tailored suits and expensive leather shoes; the women wore clicking high heels and twirled designer handbags. Most were venture capitalists, or entrepreneurs searching for seed capital for their new start-ups. The conversation was all about IPOs and profitable exits.

  17. Computation of Production Leadtime Savings

    Science.gov (United States)

    1992-11-01

    cost in order to conform to SAMMS calculations. SAMMS uses the Presutti-Trepp formula, without significant modification, to compute safety levels...This formula was developed in 1970 by Messrs. Victor Presutti and Richard Trepp of Air Force Logistics Command. The Presutti-Trepp safety level formula...and the Presutti-Trepp safety level formula (Appendix D). Stated another way, PLT savings is the difference in TVC due to changed PLT and price

  18. Comparative Evaluation of Preprocessing Freeware on Chromatography/Mass Spectrometry Data for Signature Discovery

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B.; Fraga, Carlos G.

    2014-07-07

    Preprocessing software is crucial for the discovery of chemical signatures in metabolomics, chemical forensics, and other signature-focused disciplines that involve analyzing large data sets from chemical instruments. Here, four freely available and published preprocessing tools known as metAlign, MZmine, SpectConnect, and XCMS were evaluated for impurity profiling using nominal mass GC/MS data and accurate mass LC/MS data. Both data sets were previously collected from the analysis of replicate samples from multiple stocks of a nerve-agent precursor. Each of the four tools had their parameters set for the untargeted detection of chromatographic peaks from impurities present in the stocks. The peak table generated by each preprocessing tool was analyzed to determine the number of impurity components detected in all replicate samples per stock. A cumulative set of impurity components was then generated using all available peak tables and used as a reference to calculate the percent of component detections for each tool, in which 100% indicated the detection of every component. For the nominal mass GC/MS data, metAlign performed the best followed by MZmine, SpectConnect, and XCMS with detection percentages of 83, 60, 47, and 42%, respectively. For the accurate mass LC/MS data, the order was metAlign, XCMS, and MZmine with detection percentages of 80, 45, and 35%, respectively. SpectConnect did not function for the accurate mass LC/MS data. Larger detection percentages were obtained by combining the top performer with at least one of the other tools such as 96% by combining metAlign with MZmine for the GC/MS data and 93% by combining metAlign with XCMS for the LC/MS data. In terms of quantitative performance, the reported peak intensities had average absolute biases of 41, 4.4, 1.3 and 1.3% for SpectConnect, metAlign, XCMS, and MZmine, respectively, for the GC/MS data. For the LC/MS data, the average absolute biases were 22, 4.5, and 3.1% for metAlign, MZmine, and XCMS

  19. A Multi-channel Pre-processing Circuit for Signals from Thermocouple/Thermister

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this paper, a newly developed multi-channel pre-processing circuit for signals from temperature sensors is briefly introduced. This circuit was developed to collect and amplify the signals from temperature sensors. It is a universal circuit: it can be used to process signals from thermocouples as well as from thermistors. The circuit was mounted in a standard box (440W×405D×125H mm) as an instrument. The

  20. Experimental examination of similarity measures and preprocessing methods used for image registration

    Science.gov (United States)

    Svedlow, M.; Mcgillem, C. D.; Anuta, P. E.

    1976-01-01

    The criterion used to measure the similarity between images and thus find the position where the images are registered is examined. The three similarity measures considered are the correlation coefficient, the sum of the absolute differences, and the correlation function. Three basic types of preprocessing are then discussed: taking the magnitude of the gradient of the images, thresholding the images at their medians, and thresholding the magnitude of the gradient of the images at an arbitrary level to be determined experimentally. These multitemporal registration techniques are applied to remote imagery of agricultural areas.
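
    Two of the similarity measures examined, the correlation coefficient and the sum of the absolute differences, can be evaluated over candidate offsets to find the registration position. A minimal sketch (illustrative only, not the original experimental code):

        import numpy as np

        def correlation_coefficient(a, b):
            """Normalized correlation between two equally sized image windows."""
            return float(np.corrcoef(a.ravel(), b.ravel())[0, 1])

        def sum_abs_diff(a, b):
            """Sum of absolute differences; smaller means more similar."""
            return float(np.abs(a.astype(float) - b.astype(float)).sum())

        def best_offset(reference, window):
            """Slide `window` over `reference`, maximizing the correlation coefficient."""
            h, w = window.shape
            H, W = reference.shape
            scores = {(i, j): correlation_coefficient(reference[i:i + h, j:j + w], window)
                      for i in range(H - h + 1) for j in range(W - w + 1)}
            return max(scores, key=scores.get)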

  1. Combined principal component preprocessing and n-tuple neural networks for improved classification

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Linneberg, Christian

    2000-01-01

    We present a combined principal component analysis/neural network scheme for classification. The data used to illustrate the method consist of spectral fluorescence recordings from seven different production facilities, and the task is to relate an unknown sample to one of these seven factories....... The data are first preprocessed by performing an individual principal component analysis on each of the seven groups of data. The components found are then used for classifying the data, but instead of making a single multiclass classifier, we follow the ideas of turning a multiclass problem into a number...

  2. Fast randomized point location without preprocessing in two- and three-dimensional Delaunay triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, E.P.; Saias, I.; Zhu, B.

    1996-05-01

    This paper studies the point location problem in Delaunay triangulations without preprocessing and additional storage. The proposed procedure finds the query point simply by walking through the triangulation, after selecting a good starting point by random sampling. The analysis generalizes and extends a recent result of d = 2 dimensions by proving this procedure to take expected time close to O(n{sup 1/(d+1)}) for point location in Delaunay triangulations of n random points in d = 3 dimensions. Empirical results in both two and three dimensions show that this procedure is efficient in practice.

  3. Interest rate prediction: a neuro-hybrid approach with data preprocessing

    Science.gov (United States)

    Mehdiyev, Nijat; Enke, David

    2014-07-01

    The following research implements a differential evolution-based fuzzy-type clustering method with a fuzzy inference neural network after input preprocessing with regression analysis in order to predict future interest rates, particularly 3-month T-bill rates. The empirical results of the proposed model are compared against those of nonparametric models, such as locally weighted regression and least squares support vector machines, along with two linear benchmark models, the autoregressive model and the random walk model. The root mean square error is reported for comparison.

  4. Reservoir computing with a slowly modulated mask signal for preprocessing using a mutually coupled optoelectronic system

    Science.gov (United States)

    Tezuka, Miwa; Kanno, Kazutaka; Bunsen, Masatoshi

    2016-08-01

    Reservoir computing is a machine-learning paradigm based on information processing in the human brain. We numerically demonstrate reservoir computing with a slowly modulated mask signal for preprocessing by using a mutually coupled optoelectronic system. The performance of our system is quantitatively evaluated by a chaotic time series prediction task. Our system can produce comparable performance with reservoir computing with a single feedback system and a fast modulated mask signal. We showed that it is possible to slow down the modulation speed of the mask signal by using the mutually coupled system in reservoir computing.

  5. Comparative evaluation of preprocessing freeware on chromatography/mass spectrometry data for signature discovery.

    Science.gov (United States)

    Coble, Jamie B; Fraga, Carlos G

    2014-09-01

    Preprocessing software, which converts large instrumental data sets into a manageable format for data analysis, is crucial for the discovery of chemical signatures in metabolomics, chemical forensics, and other signature-focused disciplines. Here, four freely available and published preprocessing tools known as MetAlign, MZmine, SpectConnect, and XCMS were evaluated for impurity profiling using nominal mass GC/MS data and accurate mass LC/MS data. Both data sets were previously collected from the analysis of replicate samples from multiple stocks of a nerve-agent precursor and method blanks. Parameters were optimized for each of the four tools for the untargeted detection, matching, and cataloging of chromatographic peaks from impurities present in the stock samples. The peak table generated by each preprocessing tool was analyzed to determine the number of impurity components detected in all replicate samples per stock and absent in the method blanks. A cumulative set of impurity components was then generated using all available peak tables and used as a reference to calculate the percent of component detections for each tool, in which 100% indicated the detection of every known component present in a stock. For the nominal mass GC/MS data, MetAlign had the most component detections followed by MZmine, SpectConnect, and XCMS with detection percentages of 83, 60, 47, and 41%, respectively. For the accurate mass LC/MS data, the order was MetAlign, XCMS, and MZmine with detection percentages of 80, 45, and 35%, respectively. SpectConnect did not function for the accurate mass LC/MS data. Larger detection percentages were obtained by combining the top performer with at least one of the other tools such as 96% by combining MetAlign with MZmine for the GC/MS data and 93% by combining MetAlign with XCMS for the LC/MS data. In terms of quantitative performance, the reported peak intensities from each tool had averaged absolute biases (relative to peak intensities obtained

  6. Computer-assisted bone age assessment: image preprocessing and epiphyseal/metaphyseal ROI extraction.

    Science.gov (United States)

    Pietka, E; Gertych, A; Pospiech, S; Cao, F; Huang, H K; Gilsanz, V

    2001-08-01

    Clinical assessment of skeletal maturity is based on a visual comparison of a left-hand wrist radiograph with atlas patterns. Using a new digital hand atlas, an image analysis methodology is being developed to assist radiologists in bone age estimation. The analysis starts with a preprocessing function yielding epiphyseal/metaphyseal regions of interest (EMROIs). Then, these regions are subjected to a feature extraction function. Accuracy has been measured independently at three stages of the image analysis: detection of the phalangeal tip, extraction of the EMROIs, and location of the diameters and lower edge of the EMROIs. The extracted features describe the stage of skeletal development more objectively than visual comparison.

  7. Mapping of electrical potentials from the chest surface - preprocessing and visualization

    Directory of Open Access Journals (Sweden)

    Vaclav Chudacek

    2005-01-01

    Full Text Available The aim of the paper is to present current research activity in the area of computer-supported ECG processing. Analysis of the heart's electric field based on the standard 12-lead system is at present the most frequently used method of heart disease diagnostics. However, body surface potential mapping (BSPM), which measures electric potentials from several tens to hundreds of electrodes placed on the thorax surface, has in certain cases higher diagnostic value, owing to data collection in areas that are inaccessible to the standard 12-lead ECG. For preprocessing, the wavelet transform is used; it allows significant features of the ECG signal to be detected. Several types of maps are produced, namely immediate potential, integral, isochronous, and differential maps.
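
    The wavelet preprocessing mentioned above can be illustrated as decomposition, thresholding of detail coefficients, and reconstruction. A minimal sketch with PyWavelets; the wavelet, level and universal-threshold rule are assumed choices, not necessarily those of the authors:

        import numpy as np
        import pywt

        def wavelet_denoise(ecg, wavelet="db4", level=4):
            """Soft-threshold the detail coefficients of a discrete wavelet decomposition."""
            coeffs = pywt.wavedec(ecg, wavelet, level=level)
            sigma = np.median(np.abs(coeffs[-1])) / 0.6745   # noise estimate from the finest scale
            thr = sigma * np.sqrt(2.0 * np.log(len(ecg)))    # universal threshold
            coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
            return pywt.waverec(coeffs, wavelet)[: len(ecg)]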

  8. How to save money on medicines

    Science.gov (United States)

    ... medlineplus.gov/ency/patientinstructions/000863.htm How to save money on medicines ... can look out for you, recommend ways to save money, and make sure all the drugs you take ...

  9. 75 FR 31673 - Truth in Savings

    Science.gov (United States)

    2010-06-04

    ... CFR Part 230 Truth in Savings AGENCY: Board of Governors of the Federal Reserve System. ACTION: Final... implements the Truth in Savings Act, and the official staff commentary to the regulation. The final rule... adopted a final rule amending Regulation DD, which implements the Truth in Savings Act, and the official...

  10. 75 FR 9126 - Truth in Savings

    Science.gov (United States)

    2010-03-01

    ... CFR Part 230 Truth in Savings AGENCY: Board of Governors of the Federal Reserve System. ACTION... amending Regulation DD, which implements the Truth in Savings Act, and the official staff commentary to the..., which implements the Truth in Savings Act, and the official staff commentary to the regulation....

  11. Consumer behaviours: Teaching children to save energy

    Science.gov (United States)

    Grønhøj, Alice

    2016-08-01

    Energy-saving programmes are increasingly targeted at children to encourage household energy conservation. A study involving the assignment of energy-saving interventions to Girl Scouts shows that a child-focused intervention can improve energy-saving behaviours among children and their parents.

  12. Energy Savings in a Market Economy

    DEFF Research Database (Denmark)

    Nørgaard, Jørgen

    1998-01-01

    The paper outlines the concept of energy savings as opposed to energy efficiency. It then briefly describes the fluctuating role of energy savings in recent Danish energy policy and discusses the failure of leaving electricity savings and Integrated Resource Planning to the electricity...

  13. 10 CFR 436.20 - Net savings.

    Science.gov (United States)

    2010-01-01

    ... Life Cycle Cost Analyses § 436.20 Net savings. For a retrofit project, net savings may be found by subtracting life cycle costs based on the proposed project from life cycle costs based on not having it. For a new building design, net savings is the difference between the life cycle costs of an...
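
    The net-savings rule quoted above is a straight difference of life cycle costs. A toy numeric illustration with hypothetical values:

        lcc_without_project = 1_200_000.0   # hypothetical life cycle cost without the retrofit ($)
        lcc_with_project = 1_050_000.0      # hypothetical life cycle cost with the proposed project ($)
        net_savings = lcc_without_project - lcc_with_project
        print(f"Net savings: ${net_savings:,.0f}")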

  14. Saving Material with Systematic Process Designs

    Science.gov (United States)

    Kerausch, M.

    2011-08-01

    Global competition is forcing the stamping industry to further increase quality, to shorten time-to-market and to reduce total cost. Continuous balancing between these classical time-cost-quality targets throughout the product development cycle is required to ensure future economic success. In today's industrial practice, die layout standards are typically assumed to implicitly ensure the balancing of company-specific time-cost-quality targets. Although die layout standards are a very successful approach, there are two methodical disadvantages. First, the capabilities for tool design have to be continuously adapted to technological innovations, e.g. to take advantage of the full forming capability of new materials. Secondly, the great variety of die design aspects has to be reduced to a generic rule or guideline, e.g. binder shape, draw-in conditions or the use of drawbeads. Therefore, it is important not to overlook cost or quality opportunities when applying die design standards. This paper describes a systematic workflow with a focus on minimizing material consumption. The starting point of the investigation is a full process plan for a typical structural part. All requirements are defined according to a predefined set of die design standards with industrial relevance. In a first step, binder and addendum geometry are systematically checked for material saving potential. In a second step, blank shape and draw-in are adjusted to meet thinning, wrinkling and springback targets for a minimum blank solution. Finally, the identified die layout is validated with respect to production robustness versus splits, wrinkles and springback. For all three steps the applied methodology is based on finite element simulation combined with a stochastic variation of input variables. With the proposed workflow, a well-balanced (time-cost-quality) production process assuring minimal material consumption can be achieved.

  15. Experimental evaluation of video preprocessing algorithms for automatic target hand-off

    Science.gov (United States)

    McIngvale, P. H.; Guyton, R. D.

    It is pointed out that the Automatic Target Hand-Off Correlator (ATHOC) hardware has been modified to permit operation in a nonreal-time mode as a programmable laboratory test unit using video recordings as inputs and allowing several preprocessing algorithms to be software programmable. In parallel with this hardware modification effort, an analysis and simulation effort has been underway to help determine which of the many available preprocessing algorithms should be implemented in the ATHOC software. It is noted that videotapes from a current technology airborne target acquisition system and an imaging infrared missile seeker were recorded and used in the laboratory experiments. These experiments are described and the results are presented. A set of standard parameters is found for each case. Consideration of the background in the target scene is found to be important. Analog filter cutoff frequencies of 2.5 MHz for low pass and 300 kHz for high pass are found to give best results. EPNC = 1 is found to be slightly better than EPNC = 0. It is also shown that trilevel gives better results than bilevel.

  16. Selections of data preprocessing methods and similarity metrics for gene cluster analysis

    Institute of Scientific and Technical Information of China (English)

    YANG Chunmei; WAN Baikun; GAO Xiaofeng

    2006-01-01

    Clustering is one of the major exploratory techniques for gene expression data analysis. Only with suitable similarity metrics and when datasets are properly preprocessed, can results of high quality be obtained in cluster analysis. In this study, gene expression datasets with external evaluation criteria were preprocessed as normalization by line, normalization by column or logarithm transformation by base-2, and were subsequently clustered by hierarchical clustering, k-means clustering and self-organizing maps (SOMs) with Pearson correlation coefficient or Euclidean distance as similarity metric. Finally, the quality of clusters was evaluated by adjusted Rand index. The results illustrate that k-means clustering and SOMs have distinct advantages over hierarchical clustering in gene clustering, and SOMs are a bit better than k-means when randomly initialized. It also shows that hierarchical clustering prefers Pearson correlation coefficient as similarity metric and dataset normalized by line. Meanwhile, k-means clustering and SOMs can produce better clusters with Euclidean distance and logarithm transformed datasets. These results will afford valuable reference to the implementation of gene expression cluster analysis.
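
    The preprocessing and evaluation steps described above (base-2 logarithm transformation, normalization by line, clustering, and adjusted Rand index against external labels) can be sketched as follows; the data and parameters are synthetic and illustrative only:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.metrics import adjusted_rand_score

        rng = np.random.default_rng(0)
        expression = rng.lognormal(mean=3.0, sigma=1.0, size=(200, 20))  # genes x conditions (synthetic)
        true_classes = rng.integers(0, 4, size=200)                      # external labels (synthetic)

        log_data = np.log2(expression)                                   # logarithm transformation by base-2
        by_line = (log_data - log_data.mean(axis=1, keepdims=True)) / log_data.std(axis=1, keepdims=True)

        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(by_line)  # Euclidean distance
        print("Adjusted Rand index:", adjusted_rand_score(true_classes, labels))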

  17. A Technical Review on Biomass Processing: Densification, Preprocessing, Modeling and Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shankar Tumuluru; Christopher T. Wright

    2010-06-01

    It is now a well-acclaimed fact that burning fossil fuels and deforestation are major contributors to climate change. Biomass from plants can serve as an alternative renewable and carbon-neutral raw material for the production of bioenergy. Low densities of 40–60 kg/m3 for lignocellulosic and 200–400 kg/m3 for woody biomass limit their application for energy purposes. Prior to use in energy applications, these materials need to be densified. The densified biomass can have bulk densities over 10 times that of the raw material, helping to significantly reduce technical limitations associated with storage, loading and transportation. Pelleting, briquetting, or extrusion processing are commonly used methods for densification. The aim of the present research is to develop a comprehensive review of biomass processing that includes densification, preprocessing, modeling and optimization. The specific objectives include carrying out a technical review of (a) mechanisms of particle bonding during densification; (b) methods of densification including extrusion, briquetting, pelleting, and agglomeration; (c) effects of process and feedstock variables and biomass biochemical composition on densification; (d) effects of preprocessing such as grinding, preheating, steam explosion, and torrefaction on biomass quality and binding characteristics; (e) models for understanding the compression characteristics; and (f) procedures for response surface modeling and optimization.

  18. [Research on preprocessing method of near-infrared spectroscopy detection of coal ash calorific value].

    Science.gov (United States)

    Zhang, Lin; Lu, Hui-Shan; Yan, Hong-Wei; Gao, Qiang; Wang, Fu-Jie

    2013-12-01

    The calorific value of coal ash is an important indicator for evaluating coal quality. In the experiment, the effect of spectral preprocessing methods such as smoothing, differential processing, multiplicative scatter correction (MSC) and standard normal variate (SNV) in improving the signal-to-noise ratio of the near-infrared diffuse reflection spectrum was analyzed first; then partial least squares (PLS) and principal component regression (PCR) were used to establish calorific value models of coal ash for the spectra processed with each preprocessing method. It was found that model performance can be clearly improved with 5-point smoothing, MSC and SNV, of which 5-point smoothing has the best effect: the correlation coefficient, calibration standard deviation and prediction standard deviation are 0.9899, 0.00049 and 0.00052, respectively. When 25-point smoothing is adopted, over-smoothing occurs, which worsens the model performance, while the model established with the differentially preprocessed spectrum shows no obvious change and the influence on the model is not large.
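
    Three of the pretreatments compared above are simple per-spectrum transformations: moving-average smoothing, standard normal variate (SNV), and multiplicative scatter correction (MSC) against a reference (e.g., mean) spectrum. A minimal sketch, illustrative only:

        import numpy as np

        def smooth_5point(spectrum):
            """5-point moving-average smoothing of one spectrum."""
            return np.convolve(spectrum, np.ones(5) / 5.0, mode="same")

        def snv(spectrum):
            """Standard normal variate: center and scale each spectrum individually."""
            return (spectrum - spectrum.mean()) / spectrum.std()

        def msc(spectrum, reference):
            """Multiplicative scatter correction against a reference spectrum."""
            slope, intercept = np.polyfit(reference, spectrum, deg=1)
            return (spectrum - intercept) / slope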

  19. Satellite Dwarf Galaxies in a Hierarchical Universe: Infall Histories, Group Preprocessing, and Reionization

    CERN Document Server

    Wetzel, Andrew R; Garrison-Kimmel, Shea

    2015-01-01

    In the Local Group, almost all satellite dwarf galaxies that are within the virial radius of the Milky Way (MW) and M31 exhibit strong environmental influence. The orbital histories of these satellites provide the key to understanding the role of the MW/M31 halo, lower-mass groups, and cosmic reionization in the evolution of dwarf galaxies. We examine the virial-infall histories of satellites with M_star = 10 ^ {3 - 9} M_sun using the ELVIS suite of cosmological zoom-in dissipationless simulations of 48 MW/M31-like halos. Satellites at z = 0 fell into the MW/M31 halos typically 5 - 8 Gyr ago at z = 0.5 - 1. However, they first fell into any host halo typically 7 - 10 Gyr ago at z = 0.7 - 1.5. This difference arises because many satellites experienced "group preprocessing" in another host halo, typically of M_vir ~ 10 ^ {10 - 12} M_sun, before falling into the MW/M31 halos. Lower-mass satellites and/or those closer to the MW/M31 fell in earlier and are more likely to have experienced group preprocessing; ...

  20. Tactile on-chip pre-processing with techniques from artificial retinas

    Science.gov (United States)

    Maldonado-Lopez, R.; Vidal-Verdu, F.; Linan, G.; Roca, E.; Rodriguez-Vazquez, A.

    2005-06-01

    Interest in tactile sensors is increasing as their use in complex unstructured environments is demanded, as in telepresence, minimally invasive surgery, robotics, etc. The matrix of pressure data these devices provide can be processed with many image processing algorithms to extract the required information. However, as in the case of vision chips or artificial retinas, problems arise when the array size and the computational complexity increase. Looking at the skin, the information collected by every mechanoreceptor is not carried to the brain for processing; instead, some complex pre-processing is performed to fit the limited throughput of the nervous system. This is especially important for tasks demanding high bandwidth. Experimental works report that the neural response of skin mechanoreceptors encodes the change in local shape from an offset level rather than the absolute force or pressure distributions. This is also the behavior of the retina, which implements spatio-temporal averaging. We propose the same strategy for tactile preprocessing, and we show preliminary results for slip detection, which requires fast real-time processing.

  1. Penggunaan Web Crawler Untuk Menghimpun Tweets dengan Metode Pre-Processing Text Mining

    Directory of Open Access Journals (Sweden)

    Bayu Rima Aditya

    2015-11-01

    Full Text Available The amount of data on social media is now very large, but little of it has been exploited or processed into something of practical value; one example is the tweets on the social medium Twitter. This paper describes the results of using a web crawler engine with a text-mining pre-processing method. The web crawler engine is used to collect tweets through the Twitter API as unstructured text data, which are then re-presented in web form. The pre-processing method is used to filter the tweets in three stages: cleansing, case folding, and parsing. The application designed in this study follows the waterfall software development model and is implemented in the PHP programming language. Black box testing is used to check whether the resulting design works as expected. The result of this research is an application that turns the collected tweets into data ready for further processing according to the user's needs, based on keywords and a search date. This work was undertaken because related studies show that data on social media, Twitter in particular, have become a target for companies and institutions seeking to understand public opinion
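
    The three filtering stages named above (cleansing, case folding, and parsing) are plain text transformations. The study implements them in PHP; the sketch below shows the same idea in Python with hypothetical cleansing patterns:

        import re

        def cleanse(tweet):
            """Strip URLs, mentions/hashtags and remaining non-alphanumeric characters."""
            tweet = re.sub(r"https?://\S+", " ", tweet)
            tweet = re.sub(r"[@#]\w+", " ", tweet)
            return re.sub(r"[^A-Za-z0-9\s]", " ", tweet)

        def case_fold(text):
            """Lower-case the whole text."""
            return text.lower()

        def parse(text):
            """Split the cleaned text into tokens."""
            return text.split()

        print(parse(case_fold(cleanse("Contoh tweet dari @user: http://t.co/xyz #opini"))))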

  2. Review of Intelligent Techniques Applied for Classification and Preprocessing of Medical Image Data

    Directory of Open Access Journals (Sweden)

    H S Hota

    2013-01-01

    Full Text Available Medical image data such as ECG, EEG, MRI and CT-scan images are the most important means of diagnosing human disease precisely and are widely used by physicians. Problems can be clearly identified with the help of these medical images, and a robust model can classify the medical image data well. In this paper, intelligent techniques such as neural networks and fuzzy logic are explored for MRI medical image data to identify tumors in the human brain. The need for preprocessing of medical image data is also explored. Classification techniques have been used extensively in the field of medical imaging. The conventional method in medical science for medical image data classification is human inspection, which may result in misclassification; such manual identification is impractical for large amounts of data and for noisy data. Noisy data may be produced by technical faults of the machine or by human errors and can lead to misclassification of medical image data. We have collected a number of papers based on neural networks and fuzzy logic, along with hybrid techniques, to explore the efficiency and robustness of models for brain MRI data. The analysis suggests that an intelligent model combined with data preprocessing using principal component analysis (PCA) and segmentation may be the most competitive model in this domain.

  3. Statistical Downscaling Output GCM Modeling with Continuum Regression and Pre-Processing PCA Approach

    Directory of Open Access Journals (Sweden)

    Sutikno Sutikno

    2010-08-01

    Full Text Available One of the climate models used to predict climatic conditions is the Global Circulation Model (GCM). A GCM is a computer-based model built from numerical, deterministic equations that follow the laws of physics. GCM output is a main tool for predicting climate and weather and a primary information source for assessing climate change effects. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM output with the small scale of the study area. GCM data are spatial and temporal, so spatial correlation between grid points within a single domain is likely, and such multicollinearity problems require pre-processing of the predictor variables X. Continuum Regression (CR) combined with Principal Component Analysis (PCA) pre-processing is an alternative for SD modelling. CR is a method developed by Stone and Brooks (1990). It is a generalization of the Ordinary Least Squares (OLS), Principal Component Regression (PCR) and Partial Least Squares (PLS) methods, used to overcome multicollinearity problems. Data processing for the stations in Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that, based on the RMSEP and predictive R2 values for the 8x8 and 12x12 domains, the CR method produces better results than PCR and PLS.

  4. Fast data preprocessing with Graphics Processing Units for inverse problem solving in light-scattering measurements

    Science.gov (United States)

    Derkachov, G.; Jakubczyk, T.; Jakubczyk, D.; Archer, J.; Woźniak, M.

    2017-07-01

    Utilising Compute Unified Device Architecture (CUDA) platform for Graphics Processing Units (GPUs) enables significant reduction of computation time at a moderate cost, by means of parallel computing. In the paper [Jakubczyk et al., Opto-Electron. Rev., 2016] we reported using GPU for Mie scattering inverse problem solving (up to 800-fold speed-up). Here we report the development of two subroutines utilising GPU at data preprocessing stages for the inversion procedure: (i) A subroutine, based on ray tracing, for finding spherical aberration correction function. (ii) A subroutine performing the conversion of an image to a 1D distribution of light intensity versus azimuth angle (i.e. scattering diagram), fed from a movie-reading CPU subroutine running in parallel. All subroutines are incorporated in PikeReader application, which we make available on GitHub repository. PikeReader returns a sequence of intensity distributions versus a common azimuth angle vector, corresponding to the recorded movie. We obtained an overall ∼ 400 -fold speed-up of calculations at data preprocessing stages using CUDA codes running on GPU in comparison to single thread MATLAB-only code running on CPU.

  5. Evaluation of preprocessing, mapping and postprocessing algorithms for analyzing whole genome bisulfite sequencing data.

    Science.gov (United States)

    Tsuji, Junko; Weng, Zhiping

    2016-11-01

    Cytosine methylation regulates many biological processes such as gene expression, chromatin structure and chromosome stability. The whole genome bisulfite sequencing (WGBS) technique measures the methylation level at each cytosine throughout the genome. There are an increasing number of publicly available pipelines for analyzing WGBS data, reflecting many choices of read mapping algorithms as well as preprocessing and postprocessing methods. We simulated single-end and paired-end reads based on three experimental data sets, and comprehensively evaluated 192 combinations of three preprocessing, five postprocessing and five widely used read mapping algorithms. We also compared paired-end data with single-end data at the same sequencing depth for performance of read mapping and methylation level estimation. Bismark and LAST were the most robust mapping algorithms. We found that Mott trimming and quality filtering individually improved the performance of both read mapping and methylation level estimation, but combining them did not lead to further improvement. Furthermore, we confirmed that paired-end sequencing reduced error rate and enhanced sensitivity for both read mapping and methylation level estimation, especially for short reads and in repetitive regions of the human genome.

  6. Data preprocessing method for liquid chromatography-mass spectrometry based metabolomics.

    Science.gov (United States)

    Wei, Xiaoli; Shi, Xue; Kim, Seongho; Zhang, Li; Patrick, Jeffrey S; Binkley, Joe; McClain, Craig; Zhang, Xiang

    2012-09-18

    A set of data preprocessing algorithms for peak detection and peak list alignment are reported for analysis of liquid chromatography-mass spectrometry (LC-MS)-based metabolomics data. For spectrum deconvolution, peak picking is achieved at the selected ion chromatogram (XIC) level. To estimate and remove the noise in XICs, each XIC is first segmented into several peak groups based on the continuity of scan number, and the noise level is estimated by all the XIC signals, except the regions potentially with presence of metabolite ion peaks. After removing noise, the peaks of molecular ions are detected using both the first and the second derivatives, followed by an efficient exponentially modified Gaussian-based peak deconvolution method for peak fitting. A two-stage alignment algorithm is also developed, where the retention times of all peaks are first transferred into the z-score domain and the peaks are aligned based on the measure of their mixture scores after retention time correction using a partial linear regression. Analysis of a set of spike-in LC-MS data from three groups of samples containing 16 metabolite standards mixed with metabolite extract from mouse livers demonstrates that the developed data preprocessing method performs better than two of the existing popular data analysis packages, MZmine2.6 and XCMS(2), for peak picking, peak list alignment, and quantification.
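
    The derivative-based peak detection described above can be sketched on a single extracted ion chromatogram: candidate peaks are points where the first derivative turns from positive to negative, the second derivative is negative, and the intensity clears a noise threshold. This is an illustrative simplification, not the published algorithm or its exponentially modified Gaussian fitting step:

        import numpy as np

        def detect_peaks(xic, noise_level):
            """Return indices of local maxima supported by first- and second-derivative tests."""
            d1 = np.gradient(xic)   # first derivative
            d2 = np.gradient(d1)    # second derivative
            peaks = []
            for i in range(1, len(xic) - 1):
                falling = d1[i - 1] > 0 >= d1[i]   # slope turns from rising to falling
                # the 3x noise threshold below is an arbitrary illustrative choice
                if falling and d2[i] < 0 and xic[i] > 3 * noise_level:
                    peaks.append(i)
            return np.array(peaks)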

  7. A Lightweight Data Preprocessing Strategy with Fast Contradiction Analysis for Incremental Classifier Learning

    Directory of Open Access Journals (Sweden)

    Simon Fong

    2015-01-01

    Full Text Available A prime objective in constructing data stream mining models is to achieve good accuracy, fast learning, and robustness to noise. Although many techniques have been proposed in the past, efforts to improve the accuracy of classification models have been somewhat disparate. These techniques include, but are not limited to, feature selection, dimensionality reduction, and the removal of noise from training data. One limitation common to all of these techniques is the assumption that the full training dataset must be applied. Although this has been effective for traditional batch training, it may not be practical for incremental classifier learning, also known as data stream mining, where only a single pass of the data stream is seen at a time. Because data streams can be effectively unbounded (the so-called big data phenomenon), data preprocessing time must be kept to a minimum. This paper introduces a new data preprocessing strategy suitable for the progressive purging of noisy data from the training dataset without the need to process the whole dataset at one time. This strategy is shown via a computer simulation to provide the significant benefit of allowing for the dynamic removal of bad records from the incremental classifier learning process.

  8. Robust symmetrical number system preprocessing for minimizing encoding errors in photonic analog-to-digital converters

    Science.gov (United States)

    Arvizo, Mylene R.; Calusdian, James; Hollinger, Kenneth B.; Pace, Phillip E.

    2011-08-01

    A photonic analog-to-digital converter (ADC) preprocessing architecture based on the robust symmetrical number system (RSNS) is presented. The RSNS preprocessing architecture is a modular scheme in which a modulus number of comparators are used at the output of each Mach-Zehnder modulator channel. The number of comparators with a logic 1 in each channel represents the integer values within each RSNS modulus sequence. When considered together, the integers within each sequence change one at a time at the next code position, resulting in an integer Gray code property. The RSNS ADC has the feature that the maximum nonlinearity is less than a least significant bit (LSB). Although the observed dynamic range (greatest length of combined sequences that contain no ambiguities) of the RSNS ADC is less than the optimum symmetrical number system ADC, the integer Gray code properties make it attractive for error control. A prototype is presented to demonstrate the feasibility of the concept and to show the important RSNS property that the largest nonlinearity is always less than a LSB. Also discussed are practical considerations related to multi-gigahertz implementations.

  9. Energy Savings Measure Packages. Existing Homes

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Sean [National Renewable Energy Lab. (NREL), Golden, CO (United States); Booten, Chuck [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2011-11-01

    This document presents the most cost effective Energy Savings Measure Packages (ESMP) for existing mixed-fuel and all electric homes to achieve 15% and 30% savings for each BetterBuildings grantee location across the United States. These packages are optimized for minimum cost to homeowners for source energy savings given the local climate and prevalent building characteristics (i.e. foundation types). Maximum cost savings are typically found between 30% and 50% energy savings over the reference home; this typically amounts to $300 - $700/year.

  10. A preprocessing tool for removing artifact from cardiac RR interval recordings using three-dimensional spatial distribution mapping.

    Science.gov (United States)

    Stapelberg, Nicolas J C; Neumann, David L; Shum, David H K; McConnell, Harry; Hamilton-Craig, Ian

    2016-04-01

    Artifact is common in cardiac RR interval data that is recorded for heart rate variability (HRV) analysis. A novel algorithm for artifact detection and interpolation in RR interval data is described. It is based on spatial distribution mapping of RR interval magnitude and relationships to adjacent values in three dimensions. The characteristics of normal physiological RR intervals and artifact intervals were established using 24-h recordings from 20 technician-assessed human cardiac recordings. The algorithm was incorporated into a preprocessing tool and validated using 30 artificial RR (ARR) interval data files, to which known quantities of artifact (0.5%, 1%, 2%, 3%, 5%, 7%, 10%) were added. The impact of preprocessing ARR files with 1% added artifact was also assessed using 10 time domain and frequency domain HRV metrics. The preprocessing tool was also used to preprocess 69 24-h human cardiac recordings. The tool was able to remove artifact from technician-assessed human cardiac recordings (sensitivity 0.84, SD = 0.09, specificity of 1.00, SD = 0.01) and artificial data files. The removal of artifact had a low impact on time domain and frequency domain HRV metrics (ranging from 0% to 2.5% change in values). This novel preprocessing tool can be used with human 24-h cardiac recordings to remove artifact while minimally affecting physiological data and therefore having a low impact on HRV measures of that data.
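
    The published tool relies on three-dimensional spatial distribution mapping; a far simpler, commonly used alternative is to flag RR intervals that deviate strongly from the recording median and interpolate over them. The sketch below shows only that simpler idea and is not the authors' algorithm:

        import numpy as np

        def clean_rr(rr_ms, max_rel_deviation=0.3):
            """Replace RR intervals deviating more than 30% from the recording median by linear interpolation."""
            rr = rr_ms.astype(float).copy()
            median = np.median(rr)
            artifact = np.abs(rr - median) / median > max_rel_deviation
            good = np.flatnonzero(~artifact)
            rr[artifact] = np.interp(np.flatnonzero(artifact), good, rr[good])
            return rr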

  11. Increasing conclusiveness of metabonomic studies by chem-informatic preprocessing of capillary electrophoretic data on urinary nucleoside profiles.

    Science.gov (United States)

    Szymańska, E; Markuszewski, M J; Capron, X; van Nederkassel, A-M; Heyden, Y Vander; Markuszewski, M; Krajka, K; Kaliszan, R

    2007-01-17

    Nowadays, bioinformatics offers advanced data mining tools and procedures aimed at finding consistent patterns or systematic relationships between variables. Numerous metabolite concentrations can readily be determined in a given biological system by high-throughput analytical methods. However, such raw analytical data comprise noninformative components due to the many disturbances normally occurring in the analysis of biological samples. To eliminate those unwanted components of the original analytical data, advanced chemometric data preprocessing methods might be of help. Here, such methods are applied to electrophoretic nucleoside profiles in urine samples of cancer patients and healthy volunteers. The electrophoretic nucleoside profiles were obtained under the following conditions: 100 mM borate, 72.5 mM phosphate, 160 mM SDS, pH 6.7; 25 kV voltage, 30 degrees C temperature; untreated fused silica capillary of 70 cm effective length, 50 microm I.D. Several of the most advanced preprocessing tools were applied for baseline correction, denoising and alignment of the electrophoretic data. That approach was compared to the standard procedure of electrophoretic peak integration. The best preprocessing results were obtained after application of the so-called correlation optimized warping (COW) to align the data. Principal component analysis (PCA) of the preprocessed data provides clearly better consistency of the nucleoside electrophoretic profiles with the health status of subjects than PCA of peak areas of the original data (without preprocessing).

  12. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    Science.gov (United States)

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the study of Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries

  13. Saving in cycles: how to get people to save more money.

    Science.gov (United States)

    Tam, Leona; Dholakia, Utpal

    2014-02-01

    Low personal savings rates are an important social issue in the United States. We propose and test one particular method to get people to save more money that is based on the cyclical time orientation. In contrast to conventional, popular methods that encourage individuals to ignore past mistakes, focus on the future, and set goals to save money, our proposed method frames the savings task in cyclical terms, emphasizing the present. Across the studies, individuals who used our proposed cyclical savings method, compared with individuals who used a linear savings method, provided an average of 74% higher savings estimates and saved an average of 78% more money. We also found that the cyclical savings method was more efficacious because it increased implementation planning and lowered future optimism regarding saving money.

  14. Value of travel time savings

    DEFF Research Database (Denmark)

    Le Masurier, P.; Polak, J.; Pawlak, Janet

    2015-01-01

    A team of specialist market researchers and Value of Time experts comprising members from SYSTRA, Imperial College London and the Technical University of Denmark has conducted a formal audit and peer review of research undertaken by Arup/ITS Leeds/Accent to derive Value of Travel Time Savings...... Preference (RP) models that were used to derive final Values of Travel Time (VTT). This report contains the findings of our audit and peer review of the procedures adopted by the research team during data collection of the three surveys (SP, RP and Employers Surveys); a peer review of the reported approach...

  15. Energy saving synergies in national energy systems

    DEFF Research Database (Denmark)

    Thellufsen, Jakob Zinck; Lund, Henrik

    2015-01-01

    In the transition towards a 100% renewable energy system, energy savings are essential. The possibility of energy savings through conservation or efficiency increases can be identified in, for instance, the heating and electricity sectors, in industry, and in transport. Several studies point...... to various optimal levels of savings in the different sectors of the energy system. However, these studies do not investigate the idea of energy savings being system dependent. This paper argues that such system dependency is critical to understand, as it does not make sense to analyse an energy saving...... without taking into account the actual benefit of the saving in relation to the energy system. The study therefore identifies a need to understand how saving methods may interact with each other and the system in which they are conducted. By using energy system analysis to do hourly simulation...

  16. The Effects of Tax-Based Saving Incentives On Saving and Wealth

    OpenAIRE

    Engen, Eric M.; William G. Gale; John Karl Scholz

    1996-01-01

    This paper evaluates research examining the effects of tax-based saving incentives on private and national saving. Several factors make this an unusually difficult problem. First, households that participate in, or are eligible for, saving incentive plans have systematically stronger tastes for saving than other households. Second, the data indicate that households with saving incentives have taken on more debt than other households. Third, significant changes in the 1980s in financial marke...

  17. Data Pre-Processing Method to Remove Interference of Gas Bubbles and Cell Clusters During Anaerobic and Aerobic Yeast Fermentations in a Stirred Tank Bioreactor

    Science.gov (United States)

    Princz, S.; Wenzel, U.; Miller, R.; Hessling, M.

    2014-11-01

    One aerobic and four anaerobic batch fermentations of the yeast Saccharomyces cerevisiae were conducted in a stirred bioreactor and monitored inline by NIR spectroscopy and a transflectance dip probe. From the acquired NIR spectra, chemometric partial least squares regression (PLSR) models for predicting biomass, glucose and ethanol were constructed. The spectra were directly measured in the fermentation broth and successfully inspected for adulteration using our novel data pre-processing method. These adulterations manifested as strong fluctuations in the shape and offset of the absorption spectra. They resulted from cells, cell clusters, or gas bubbles intercepting the optical path of the dip probe. In the proposed data pre-processing method, adulterated signals are removed by passing the time-scanned non-averaged spectra through two filter algorithms with a 5% quantile cutoff. The filtered spectra containing meaningful data are then averaged. A second step checks whether the whole time scan is analyzable; if so, the average is calculated and used to prepare the PLSR models. This new method distinctly improved the prediction results. To dissociate possible correlations between analyte concentrations, such as glucose and ethanol, the feeding analytes were alternately supplied at different concentrations (spiking) at the end of the four anaerobic fermentations. This procedure yielded low-error (anaerobic) PLSR models, with prediction errors of 0.31 g/l for biomass, 3.41 g/l for glucose, and 2.17 g/l for ethanol. The maximum concentrations were 14 g/l biomass, 167 g/l glucose, and 80 g/l ethanol. Data from the aerobic fermentation, carried out under high agitation and high aeration, were incorporated to realize combined PLSR models, which have not been previously reported to our knowledge.
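
    A minimal sketch of the filter-then-average idea, assuming the spectral offset is the rejection criterion and using a synthetic data set; the 5% cutoff follows the abstract, but the specific filter criteria and the PLS settings are illustrative only.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    def filter_and_average(scan, q=0.05):
        """scan: (n_replicates, n_wavelengths) non-averaged spectra of one time point."""
        offsets = scan.mean(axis=1)                        # spectral offset per replicate
        lo, hi = np.quantile(offsets, [q, 1.0 - q])
        keep = (offsets >= lo) & (offsets <= hi)           # reject bubble/cell-cluster artefacts
        return scan[keep].mean(axis=0) if keep.any() else None

    rng = np.random.default_rng(1)
    n_scans, n_reps, n_wl = 40, 30, 200
    spectra, y = [], []
    for i in range(n_scans):
        base = rng.normal(1.0, 0.02, n_wl) + 0.005 * i     # slowly drifting "fermentation" spectrum
        scan = base + 0.01 * rng.standard_normal((n_reps, n_wl))
        scan[:2] += 0.5                                    # two adulterated replicates per time scan
        avg = filter_and_average(scan)
        if avg is not None:
            spectra.append(avg)
            y.append(0.3 * i)                              # synthetic analyte concentration
    X, y = np.vstack(spectra), np.array(y)
    print(PLSRegression(n_components=3).fit(X, y).score(X, y))   # R^2 of the toy PLSR model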

  18. Models of Energy Saving Systems

    DEFF Research Database (Denmark)

    Nørgård, Jørgen Stig

    1999-01-01

    The paper first describes the concepts and methods around energy saving, such as energy chain, energy services, end-use technologies, secondary energy, etc. Next are discussed the problems of defining and adding energy services and hence end-use energy efficiency or intensity. A section is devote...... service level and technology are demonstrated as the main determinants of future energy consumption. In the concluding remarks, the main flaws of present energy policy and some visions of the future are discussed.......The paper first describes the concepts and methods around energy saving, such as energy chain, energy services, end-use technologies, secondary energy, etc. Next are discussed the problems of defining and adding energy services and hence end-use energy efficiency or intensity. A section is devoted...... to what is termed lifestyle efficiency, including the cultural values and the ability of the economy to provide the services wanted. As explained, integrated resource planning with its optimizing the whole energy chain cannot be combined with sub-optimizing part of it, for instance the supply technology...

  20. Green PC Saves Human Life

    Directory of Open Access Journals (Sweden)

    Abdulla Shaik

    2012-03-01

    Full Text Available Green computing is the study and practice of using computing resources efficiently. The idea is to reduce the use of hazardous materials, maximize energy efficiency during the product's lifetime, and promote recyclability. Green computing can be broadly defined as the problem of reducing the overall carbon footprint of computing and communication infrastructure, such as data centers, by using energy-efficient design and operations. As environmentalists and energy conservationists ponder the issue of conserving the environment, technologists have come up with a simple solution that lets you contribute to the “Go Green” campaign: Green PCs. By using green computing practices, you can improve energy management, increase energy efficiency, reduce e-waste, and save money. Given the pervasive use of information technology, the industry has to lead a revolution of sorts by turning green in a manner no industry has ever done before. It is worth emphasizing that this “green technology” should not be just about sound bites to impress activists but about concrete action and organizational policy. The plan towards the Green PC should include new electronic products and services with optimum efficiency and all possible options for energy savings. Technical issues in high-performance green computing span the spectrum from green infrastructure (energy-efficient buildings, intelligent cooling systems, green power sources) to green hardware (multi-core computing systems, energy-efficient server design, energy-efficient solid-state storage) and green software and applications.

  1. SAVE ENERGY IN TEXTILE SMES

    Directory of Open Access Journals (Sweden)

    SCALIA Mauro

    2016-05-01

    Full Text Available Efficiency and competitiveness in the textile and clothing manufacturing sector must take into account current and future energy challenges. Energy efficiency is a subject of critical importance for the Textile & Clothing industry, for other sectors and for society in general. EURATEX has initiated Energy Made-to-Measure, an information campaign running until 2016 to empower over 300 textile & clothing companies, notably SMEs, to become more energy efficient. SET (Save Energy in Textile SMEs), a collaborative project co-funded within the European Programme Intelligent Energy Europe II, helps companies to understand their energy consumption and allows them to compare sector benchmarks across different production processes. SET has developed the Energy Saving and Efficiency Tool (SET tool), a free-of-charge tool customized for textile manufacturers. The SET tool is made up of four elements: stand-alone software (SET Tool) for self-assessment, based on an Excel application; an on-line part (SET Tool Web) for advanced benchmarking and comparison of performance across years; a guiding document for the companies; and an overview of financial incentives and legal obligations regarding energy efficiency. Designed specifically for small and medium enterprises (SMEs), the SET tool enables the evaluation of energy consumption and recommends measures to reduce it. Prior to modifying the company's production processes and making investments to increase energy efficiency, textile SMEs need to obtain different types of information, including the legal context and economic and technical peculiarities.

  2. A simpler method of preprocessing MALDI-TOF MS data for differential biomarker analysis: stem cell and melanoma cancer studies

    Directory of Open Access Journals (Sweden)

    Tong Dong L

    2011-09-01

    Full Text Available Abstract Introduction Raw spectral data from matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) MS profiling techniques usually contain complex information not readily providing biological insight into disease. The association of identified features within raw data to a known peptide is extremely difficult. Data preprocessing to remove uncertainty characteristics in the data is normally required before performing any further analysis. This study proposes an alternative yet simple solution to preprocess raw MALDI-TOF-MS data for identification of candidate marker ions. Two in-house MALDI-TOF-MS data sets from two different sample sources (melanoma serum and cord blood plasma) are used in our study. Method Raw MS spectral profiles were preprocessed using the proposed approach to identify peak regions in the spectra. The preprocessed data were then analysed using bespoke machine learning algorithms for data reduction and ion selection. Using the selected ions, an ANN-based predictive model was constructed to examine the predictive power of these ions for classification. Results Our model identified 10 candidate marker ions for both data sets. These ion panels achieved over 90% classification accuracy on blind validation data. Receiver operating characteristic analysis was performed and the area under the curve for the melanoma and cord blood classifiers was 0.991 and 0.986, respectively. Conclusion The results suggest that our data preprocessing technique removes unwanted characteristics of the raw data, while preserving the predictive components of the data. Ion identification analysis can be carried out using MALDI-TOF-MS data with the proposed data preprocessing technique coupled with bespoke algorithms for data reduction and ion selection.
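
    The paper's exact peak-region procedure is not reproduced here; the sketch below is only a generic stand-in that smooths a synthetic spectrum and flags bins rising above a noise-based threshold as candidate peak regions for downstream ion selection.

    import numpy as np
    from scipy.ndimage import uniform_filter1d

    def peak_regions(spectrum, window=25, k=3.0):
        smooth = uniform_filter1d(spectrum, size=window)        # running-mean smoothing
        noise = np.median(np.abs(spectrum - smooth)) + 1e-12    # robust noise estimate
        return smooth > smooth.mean() + k * noise               # boolean mask of peak regions

    rng = np.random.default_rng(2)
    mz = np.linspace(1000, 12000, 5000)
    spectrum = rng.exponential(1.0, mz.size)                    # synthetic noisy baseline
    for centre in (2500, 4600, 9100):                           # synthetic marker ions
        spectrum += 40 * np.exp(-((mz - centre) ** 2) / 2e3)
    mask = peak_regions(spectrum)
    print(int(mask.sum()), "of", mask.size, "bins flagged as peak regions")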

  3. Comparison of Pre-Processing and Classification Techniques for Single-Trial and Multi-Trial P300-Based Brain Computer Interfaces

    Directory of Open Access Journals (Sweden)

    Chanan S. Syan

    2010-01-01

    guide for practitioners developing single-trial and multi-trial P300-based BCI systems, particularly for selecting appropriate pre-processing agents and classification methodologies for inclusion. The possibilities for future study include the investigation of double-trial and triple-trial P300 systems based on the LDA classifier. The time savings of such approaches will still be significant. It is very likely that such systems would benefit from accuracies higher than the one obtained in this study for single-trial LDA (74.19%).

  4. From Voids to Coma: the prevalence of pre-processing in the local Universe

    CERN Document Server

    Cybulski, Ryan; Fazio, Giovanni G; Gutermuth, Robert A

    2014-01-01

    We examine the effects of pre-processing across the Coma Supercluster, including 3505 galaxies over 500 sq deg, by quantifying the degree to which star-forming (SF) activity is quenched as a function of environment. We characterise environment using the complementary techniques of Voronoi Tessellation, to measure the density field, and the Minimal Spanning Tree, to define continuous structures, and so we measure SF activity as a function of local density and the type of environment (cluster, group, filament, and void), and quantify the degree to which environment contributes to quenching of SF activity. Our sample covers over two orders of magnitude in stellar mass (10^8.5 to 10^11 Msun), and consequently we trace the effects of environment on SF activity for dwarf and massive galaxies, distinguishing so-called "mass quenching" from "environment quenching". Environmentally-driven quenching of SF activity, measured relative to the void galaxies, occurs to progressively greater degrees in filaments, groups, and...

  5. Joint preprocesser-based detector for cooperative networks with limited hardware processing capability

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2015-02-01

    In this letter, a joint detector for cooperative communication networks is proposed for the case where the destination has limited hardware processing capability. The transmitter sends its symbols with the help of L relays. As the destination has limited hardware, only U out of L signals are processed and the energy of the remaining relays is lost. To solve this problem, a joint preprocessing based detector is proposed. This joint preprocessor based detector operates on the principle of minimizing the symbol error rate (SER). For a realistic assessment, pilot symbol aided channel estimation is incorporated for the proposed detector. From our simulations, it can be observed that our proposed detector achieves the same SER performance as that of the maximum likelihood (ML) detector with all participating relays. Additionally, our detector outperforms selection combining (SC), the channel shortening (CS) scheme and reduced-rank techniques when using the same U. Our proposed scheme has low computational complexity.

  6. DUAL CHANNEL SPEECH ENHANCEMENT USING HADAMARD-LMS ALGORITHM WITH DCT PREPROCESSING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    D.DEEPA,

    2010-09-01

    Full Text Available Speech enhancement and noise reduction have wide applications in speech processing. They are often employed as a pre-processing stage in various applications. Two points are often required to be considered in signal de-noising applications: eliminating the undesired noise from the signal to improve the Signal to Noise Ratio (SNR), and preserving the shape and characteristics of the original signal. Background noise in the speech signal will reduce speech intelligibility for people with hearing loss, especially for patients with sensorineural loss. The proposed algorithm combines the Hadamard-Least Mean Square (LMS) algorithm with a DCT pre-processing technique to improve the SNR and to reduce the mean square error (MSE). The DCT has separability and energy compaction properties. Although the DCT does not separate frequencies, it is a powerful signal decorrelator. It is a real-valued function and thus can be used effectively in real-time operation.
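
    As a rough, hypothetical illustration of transform-domain adaptive noise cancellation (not the authors' exact Hadamard-LMS formulation), the sketch below runs a dual-channel normalised LMS update on DCT coefficients of short frames: the primary channel carries speech plus noise, the reference channel carries correlated noise only.

    import numpy as np
    from scipy.fft import dct, idct

    def dct_lms(primary, reference, frame=64, mu=0.05):
        out = np.zeros_like(primary)
        w = np.zeros(frame)                                    # one adaptive weight per DCT bin
        for start in range(0, len(primary) - frame + 1, frame):
            P = dct(primary[start:start + frame], norm="ortho")
            R = dct(reference[start:start + frame], norm="ortho")
            e = P - w * R                                      # error = enhanced DCT frame
            w += 2 * mu * e * R / (np.sum(R ** 2) + 1e-12)     # normalised LMS update
            out[start:start + frame] = idct(e, norm="ortho")
        return out

    rng = np.random.default_rng(3)
    n = 4096
    speech = np.sin(2 * np.pi * 0.01 * np.arange(n))           # toy "speech" tone
    noise = rng.standard_normal(n)
    enhanced = dct_lms(speech + 0.5 * noise, noise)
    print(np.mean((enhanced - speech) ** 2))                   # residual MSE after enhancement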

  7. Synthetic aperture radar image correlation by use of preprocessing for enhancement of scattering centers.

    Science.gov (United States)

    Khoury, J; Gianino, P D; Woods, C L

    2000-10-15

    We demonstrate that a significant improvement can be obtained in the recognition of complicated synthetic aperture radar images taken from the Moving and Stationary Target Acquisitions and Recognition database. These images typically have a low number of scattering centers and high noise. We first preprocess the images and the templates formed from them so that their scattering centers are enhanced. Our technique can produce high-quality performance in several correlation criteria. For realistic automatic target recognition systems, our approach should make it easy to implement optical recognition systems with binarized data for many different types of correlation filter and should have a great effect on feeding data-compressed (binarized) information into either digital or optical processors.

  8. [Sample preprocessing method for residual quinolones in honey using immunoaffinity resin].

    Science.gov (United States)

    Ihara, Yoshiharu; Kato, Mihoko; Kodaira, Tsukasa; Itoh, Shinji; Terakawa, Mika; Horie, Masakazu; Saito, Koichi; Nakazawa, Hiroyuki

    2009-06-01

    A sample preparation method was developed for determination of quinolones in honey using immunoaffinity resin. For this purpose, an immunoaffinity resin for quinolones was prepared by coupling a quinolone-specific monoclonal antibody to agarose resin. Honey samples diluted with phosphate buffer were reacted with immunoaffinity resin. After the resin was washed, quinolones were eluted with glycine-HCl. Quinolones in the eluate were determined by HPLC with fluorescence detection. No interfering peak was found on the chromatograms of honey samples. The recoveries of quinolones from samples were over 70% at fortification levels of 20 ng/g (for norfloxacin, ciprofloxacin and enrofloxacin) and 10 ng/g (for danofloxacin). The quantification limits of quinolones were 2 ng/g. This sample preprocessing method using immunoaffinity resin was found to be effective and suitable for determining residual quinolones in honey.

  9. Base resolution methylome profiling: considerations in platform selection, data preprocessing and analysis.

    Science.gov (United States)

    Sun, Zhifu; Cunningham, Julie; Slager, Susan; Kocher, Jean-Pierre

    2015-08-01

    Bisulfite treatment-based methylation microarrays (mainly the Illumina 450K Infinium array) and next-generation sequencing (reduced representation bisulfite sequencing, Agilent SureSelect Human Methyl-Seq, NimbleGen SeqCap Epi CpGiant or whole-genome bisulfite sequencing) are commonly used for base resolution DNA methylome research. Although multiple tools and methods have been developed and used for data preprocessing and analysis, confusion remains regarding these platforms, including whether and how the 450K array should be normalized, which platform best fits researchers' needs, and which statistical models are more appropriate for differential methylation analysis. This review presents the commonly used platforms and compares the pros and cons of each in methylome profiling. We then discuss approaches to study design, data normalization, bias correction and model selection for differentially methylated individual CpGs and regions.

  10. Feasibility investigation of integrated optics Fourier transform devices. [holographic subtraction for multichannel data preprocessing

    Science.gov (United States)

    Verber, C. M.; Vahey, D. W.; Wood, V. E.; Kenan, R. P.; Hartman, N. F.

    1977-01-01

    The possibility of producing an integrated optics data processing device based upon Fourier transformations or other parallel processing techniques, and the ways in which such techniques may be used to upgrade the performance of present and projected NASA systems, were investigated. Activities toward this goal include: (1) production of near-diffraction-limited geodesic lenses in glass waveguides; (2) development of grinding and polishing techniques for the production of geodesic lenses in LiNbO3 waveguides; (3) development of a characterization technique for waveguide lenses; and (4) development of a theory for corrected aspheric geodesic lenses. A holographic subtraction system was devised which should be capable of rapid on-board preprocessing of a large number of parallel data channels. The principle involved is validated in three demonstrations.

  11. Data preprocessing method for fluorescence molecular tomography using a priori information provided by CT.

    Science.gov (United States)

    Fu, Jianwei; Yang, Xiaoquan; Meng, Yuanzheng; Luo, Qingming; Gong, Hui

    2012-01-01

    The combined system of micro-CT and fluorescence molecular tomography (FMT) offers a new tool to provide anatomical and functional information of small animals in a single study. To take advantage of the combined system, a data preprocessing method is proposed to extract the valid data for FMT reconstruction algorithms using a priori information provided by CT. The boundary information of the animal and animal holder is extracted from the reconstructed CT volume data. A ray tracing method is used to trace the path of the excitation beam, calculate the locations and directions of the optional sources and determine whether the optional sources are valid. To accurately calculate the projections of the detectors on optical images and judge their validity, a combination of perspective projection and inverse ray tracing methods is adopted to offer optimal performance. The imaging performance of the combined system with the presented method is validated through experimental rat imaging.

  12. Pre-processing ambient noise cross-correlations with equalizing the covariance matrix eigenspectrum

    Science.gov (United States)

    Seydoux, Léonard; de Rosny, Julien; Shapiro, Nikolai M.

    2017-09-01

    Passive imaging techniques from ambient seismic noise require a nearly isotropic distribution of the noise sources in order to ensure reliable traveltime measurements between seismic stations. However, real ambient seismic noise often only partially fulfils this condition. It is generated in preferential areas (in the deep ocean or near continental shores), and some highly coherent pulse-like signals may be present in the data, such as those generated by earthquakes. Several pre-processing techniques have been developed in order to attenuate the directional and deterministic behaviour of this real ambient noise. Most of them are applied to individual seismograms before cross-correlation computation. The most widely used techniques are spectral whitening and temporal smoothing of the individual seismic traces. We here propose an additional pre-processing step to be used together with the classical ones, which is based on the spatial analysis of the seismic wavefield. We compute the cross-spectra between all available station pairs in the spectral domain, leading to the data covariance matrix. We apply a one-bit normalization to the covariance matrix eigenspectrum before extracting the cross-correlations in the time domain. The efficiency of the method is shown with several numerical tests. We apply the method to data collected by the USArray when the M8.8 Maule earthquake occurred on 2010 February 27. The method shows a clear improvement over the classical equalization in attenuating the highly energetic and coherent waves incoming from the earthquake, and allows reliable traveltime measurements to be performed even in the presence of the earthquake.
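
    A schematic sketch of the eigenspectrum-equalization idea on synthetic data: form the cross-spectral covariance matrix of an array at one frequency, set its nonzero eigenvalues to unity while keeping the eigenvectors, and observe that the strongly peaked eigenspectrum caused by an energetic coherent source is flattened. The array size, window count and threshold are illustrative, not those of the study.

    import numpy as np

    def equalize_eigenspectrum(C, tol=1e-10):
        """Flatten the eigenvalue spectrum of a Hermitian covariance matrix."""
        vals, vecs = np.linalg.eigh(C)
        vals_eq = (vals > tol * vals.max()).astype(float)   # nonzero eigenvalues -> 1
        return (vecs * vals_eq) @ vecs.conj().T

    rng = np.random.default_rng(4)
    n_stations, n_windows = 32, 8                  # fewer windows than stations (rank-deficient case)
    spectra = (rng.standard_normal((n_stations, n_windows))
               + 1j * rng.standard_normal((n_stations, n_windows)))
    spectra[:, 0] *= 50                            # one window dominated by an energetic event
    C = spectra @ spectra.conj().T / n_windows     # cross-spectral covariance at one frequency
    C_eq = equalize_eigenspectrum(C)
    print(np.round(np.linalg.eigvalsh(C)[-3:], 1))     # strongly peaked eigenspectrum
    print(np.round(np.linalg.eigvalsh(C_eq)[-3:], 1))  # flattened after equalization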

  13. Image pre-processing method for near-wall PIV measurements over moving curved interfaces

    Science.gov (United States)

    Jia, L. C.; Zhu, Y. D.; Jia, Y. X.; Yuan, H. J.; Lee, C. B.

    2017-03-01

    PIV measurements near a moving interface are always difficult. This paper presents a PIV image pre-processing method that returns high spatial resolution velocity profiles near the interface. Instead of re-shaping or re-orientating the interrogation windows, interface tracking and an image transformation are used to stretch the particle image strips near a curved interface into rectangles. Then the adaptive structured interrogation windows can be arranged at specified distances from the interface. Synthetic particles are also added into the solid region to minimize interfacial effects and to restrict particles on both sides of the interface. Since a high spatial resolution is only required in the high-velocity-gradient region, adaptive meshing and stretching of the image strips in the normal direction are used to improve the cross-correlation signal-to-noise ratio (SN) by reducing the velocity difference and the particle image distortion within the interrogation window. A two-dimensional Gaussian fit is used to compensate for the effects of stretching the particle images. The working hypothesis is that fluid motion near the interface is ‘quasi-tangential flow’, which is reasonable in most fluid-structure interaction scenarios. The method was validated against the window deformation iterative multi-grid scheme (WIDIM) using synthetic image pairs with different velocity profiles. The method was tested for boundary layer measurements of a supersonic turbulent boundary layer on a flat plate, near a rotating blade and near a flexible flapping flag. This image pre-processing method provides higher spatial resolution than conventional WIDIM and good robustness for measuring velocity profiles near moving interfaces.

  14. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, the Chang'E-3 data pre-processing system (CEDPS), based on a scientific workflow, is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure with the following possibilities: describing a data processing task, including (1) defining input and output data, (2) defining the data relationships, (3) defining the sequence of tasks, (4) defining the communication between tasks, (5) defining mathematical formulae, and (6) defining the relationship between tasks and data; and automatic processing of tasks. Accordingly, describing a task is the key point determining whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: (1) the data relationships are established through a product tree; (2) the process model is constructed as a directed acyclic graph (DAG), with a set of process workflow constructs, including Sequence, Loop, Merge and Fork, that are compositional with one another; (3) to reduce the modelling complexity of the mathematical formulae in the DAG, semantic modelling based on MathML is adopted. On top of that, we present how the CE3 data were processed with CEDPS.
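
    A toy sketch of the DAG idea behind such a workflow engine: tasks and their dependencies are stored as a directed acyclic graph and executed in topological order. The task names below are illustrative placeholders, not actual CE3 data products.

    from graphlib import TopologicalSorter   # Python 3.9+

    # Each task maps to the set of tasks it depends on.
    workflow = {
        "raw_frames":             set(),
        "radiometric_correction": {"raw_frames"},
        "geometric_correction":   {"radiometric_correction"},
        "mosaic":                 {"geometric_correction"},
    }

    for task in TopologicalSorter(workflow).static_order():
        print("running", task)               # stand-in for invoking the real processing step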

  15. Water Saving Strategies & Ecological Modernisation

    DEFF Research Database (Denmark)

    Hoffmann, Birgitte; Jensen, Jesper Ole; Elle, Morten

    2005-01-01

    as a frame for understanding resource manage-ment. The water management in Copenhagen has in recent years undergone a rather radi-cal transition. Along with strong drivers for resource management in the region the mu-nicipal water supplier has tested and implemented a number of initiatives to promote sus......-tainable water management. The paper focuses on the experiences from different water saving initiatives carried out since the mid 80s relating them to some central aspects of Ecological Modernisation theories: · Demands for tools and targets · New tasks and roles for suppliers, consumers and stakeholders...... and the emergence of a new group of intermediary actors · The changing logics of sustainability and the development of storylines The ecological modernist discourse implies a participatory approach, by which citizens are made co-responsible and included in efforts towards a sustainable development; however...

  16. Water Saving Strategies & Ecological Modernisation

    DEFF Research Database (Denmark)

    Hoffmann, Birgitte; Jensen, Jesper Ole; Elle, Morten

    2005-01-01

    -tainable water management. The paper focuses on the experiences from different water saving initiatives carried out since the mid 80s relating them to some central aspects of Ecological Modernisation theories: · Demands for tools and targets · New tasks and roles for suppliers, consumers and stakeholders...... as a frame for understanding resource manage-ment. The water management in Copenhagen has in recent years undergone a rather radi-cal transition. Along with strong drivers for resource management in the region the mu-nicipal water supplier has tested and implemented a number of initiatives to promote sus...... to 125 l/capita/day in 2002. A series of different strategies, targets and tools have been implemented: Emphasizing demand side instead of supply side, using and communicating indicators, formulating goals for reducing water consumption and developing learning processes in water management. A main...

  17. Saving in Asia: Issues for Rebalancing Growth

    OpenAIRE

    Jha, Shikha; Prasad, Eswar; Terada-Hagiwara, Akiko

    2009-01-01

    This paper assesses the role of consumption and saving in Asia’s growth. It examines the composition of national saving, analyzes what forces drive saving rates, and draws policy conclusions from the analysis that are relevant for the economies in the region and which might play an important part in rebalancing global growth. The paper identifies a number of issues. A rapid rise in the profitability of state-owned and private enterprises together with distorted dividend policies and underdeve...

  18. Shared savings gets realtor new water heaters

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, R.

    1983-08-08

    The Grenadier Realty Co. of New York is financing four energy-efficient water heaters for apartment buildings with a shared savings arrangement. The arrangement allows Grenadier to avoid front-end costs, which were paid by Independent Water Heaters Inc. in exchange for a decreasing share of the savings. Grenadier will own the heaters when the five-year contract expires. By allowing a shutdown of boilers during the summer months, the heaters will further increase energy savings. (DCK)

  19. Why Italy's saving rate became (so) low?

    OpenAIRE

    Campiglio, Luigi

    2013-01-01

    The aim of this paper is to explain why a low and declining saving rate should be a problem in a world of free capital flows and increasing wealth. In Italy consumer households’ saving have been the main driver of economic stability and growth, funding investments and public debt, and despite international turbulences Italy was acknowledged as a high saving country until the early 1990’s. Ever since, however, households saving rate plunged, in spite of an increasing financial wealth, and our ...

  20. CudaPre3D: An Alternative Preprocessing Algorithm for Accelerating 3D Convex Hull Computation on the GPU

    Directory of Open Access Journals (Sweden)

    MEI, G.

    2015-05-01

    Full Text Available In the computation of convex hulls for point sets, a preprocessing procedure that filters the input points by discarding non-extreme points is commonly used to improve the computational efficiency. We previously proposed a quite straightforward preprocessing approach for accelerating 2D convex hull computation on the GPU. In this paper, we extend that algorithm to 3D cases. The basic ideas behind the two preprocessing algorithms are similar: first, several groups of extreme points are found according to the original set of input points and several rotated versions of the input set; then, a convex polyhedron is created using the found extreme points; and finally those interior points located inside the formed convex polyhedron are discarded. Experimental results show that the proposed preprocessing algorithm achieves speedups of about 4x on average, and 5x to 6x in the best cases, over the cases where the proposed approach is not used. In addition, more than 95 percent of the input points can be discarded in most experimental tests.
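
    A CPU-side sketch of the filtering idea (the paper's version runs on the GPU and uses rotated copies of the point set): pick the extreme points along a handful of directions, form a convex polyhedron from them, and discard input points lying inside that polyhedron before the final 3D convex-hull computation. The direction count and data are arbitrary choices for illustration.

    import numpy as np
    from scipy.spatial import ConvexHull, Delaunay

    def prefilter(points, n_directions=20, seed=0):
        rng = np.random.default_rng(seed)
        dirs = rng.standard_normal((n_directions, 3))
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
        extreme_idx = np.unique(np.argmax(points @ dirs.T, axis=0))  # extreme point per direction
        inner = Delaunay(points[extreme_idx])          # convex polyhedron of the extreme points
        keep = inner.find_simplex(points) < 0          # points outside the polyhedron survive
        keep[extreme_idx] = True                       # never discard the extreme points themselves
        return points[keep]

    rng = np.random.default_rng(1)
    pts = rng.standard_normal((100000, 3))
    survivors = prefilter(pts)
    same_hull = np.isclose(ConvexHull(pts).volume, ConvexHull(survivors).volume)
    print(len(pts), "->", len(survivors), "points; hull unchanged:", bool(same_hull))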

  1. Convergence Properties of an Iterative Procedure of Ipsatizing and Standardizing a Data Matrix, with Applications to Parafac/Candecomp Preprocessing.

    Science.gov (United States)

    ten Berge, Jos M. F.; Kiers, Henk A. L.

    1989-01-01

    Centering a matrix row-wise and rescaling it column-wise to a unit sum of squares requires an iterative procedure. It is shown that this procedure converges to a stable solution that need not be centered row-wise. The results bear directly on several types of preprocessing methods in Parafac/Candecomp. (Author/TJH)
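
    A small numerical illustration of the iteration discussed above: alternately centre each row (ipsatize) and rescale each column to unit sum of squares until the matrix stops changing. Consistent with the result, the fixed point keeps unit column sums of squares but is generally no longer exactly row-centred. Matrix size and tolerances are arbitrary.

    import numpy as np

    def ipsatize_standardize(X, n_iter=500, tol=1e-12):
        X = X.astype(float).copy()
        for _ in range(n_iter):
            prev = X.copy()
            X -= X.mean(axis=1, keepdims=True)                   # centre rows (ipsatize)
            X /= np.sqrt((X ** 2).sum(axis=0, keepdims=True))    # unit sum of squares per column
            if np.max(np.abs(X - prev)) < tol:
                break
        return X

    X = ipsatize_standardize(np.random.default_rng(0).standard_normal((6, 4)))
    print(np.round((X ** 2).sum(axis=0), 6))   # columns: unit sum of squares
    print(np.round(X.mean(axis=1), 6))         # rows: not necessarily centred at convergence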

  2. The way to collisions, step by step

    CERN Multimedia

    2009-01-01

    While the LHC sectors cool down and reach the cryogenic operating temperature, spirits are warming up as we all eagerly await the first collisions. No reason to hurry, though. Making particles collide involves the complex manoeuvring of thousands of delicate components. The experts will make it happen using a step-by-step approach.

  3. Internship guide : Work placements step by step

    NARCIS (Netherlands)

    Haag, Esther

    2013-01-01

    Internship Guide: Work Placements Step by Step has been written from the practical perspective of a placement coordinator. This book addresses the following questions : what problems do students encounter when they start thinking about the jobs their degree programme prepares them for? How do you

  5. On Computational Small Steps and Big Steps

    DEFF Research Database (Denmark)

    Johannsen, Jacob

    rules in the small-step semantics cause the refocusing step of the syntactic correspondence to be inapplicable. Second, we propose two solutions to overcome this in-applicability: backtracking and rule generalization. Third, we show how these solutions affect the other transformations of the two...

  6. Introductory guide to saving energy in the home

    CSIR Research Space (South Africa)

    Billingham, P.A

    1977-01-01

    Full Text Available for each task. C. ENERGY-SAVING STEPS THAT ARE COMPARATIVELY EXPENSIVE. Where HEAT is the problem: 1. Solar water heaters. 2. Insulate ceilings or shade roofs. 3. External shades on windows. 4. Reflective glass. 5. Evaporative cooling (not for humid... leak; the window may be perfectly sealed. In summer, when the outdoor air is hotter than the indoor air, just the reverse can happen. The heat is being lost or gained by conduction through the glass. Rule 2, therefore, is to insulate regions where...

  7. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    OpenAIRE

    Jin Xiao; Bing Zhu; Geer Teng; Changzheng He; Dunhu Liu

    2014-01-01

    Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management, and customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, the customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (ODCEM). On the one hand, ODCEM integrates the preprocessing of missing values and the classif...

  8. FY 1995 cost savings report

    Energy Technology Data Exchange (ETDEWEB)

    Andrews-Smith, K.L., Westinghouse Hanford

    1996-06-21

    Fiscal Year (FY) 1995 challenged us to dramatically reduce costs at Hanford. We began the year with an 8 percent reduction in our Environmental Management budget but at the same time were tasked with accomplishing additional workscope. This resulted in a Productivity Challenge whereby we took on more work at the beginning of the year than we had funding to complete. During the year, the Productivity Challenge actually grew to 23 percent because of rescissions, Congressional budget reductions, and DOE Headquarters actions. We successfully met our FY 1995 Productivity Challenge through an aggressive cost reduction program that identified and eliminated unnecessary workscope and found ways to be more efficient. We reduced the size of the workforce, cut overhead expenses, eliminated paperwork, cancelled construction of new facilities, and reengineered our processes. We are proving we can get the job done better and for less money at Hanford. DOE's drive to do it "better, faster, cheaper" has led us to look for more and larger partnerships with the private sector. The biggest will be privatization of Hanford's Tank Waste Remediation System, which will turn liquid tank waste into glass logs for eventual disposal. We will also save millions of dollars and avoid the cost of replacing aging steam plants by contracting Hanford's energy needs to a private company. Other privatization successes include the Hanford Mail Service, a spinoff of advanced technical training, low level mixed waste thermal treatment, and transfer of the Hanford Museums of Science and History to a private non-profit organization. Despite the rough roads and uncertainty we faced in FY 1995, less than 3 percent of our work fell behind schedule, while the work that was performed was completed with an 8.6 percent cost under-run. We not only met the FY 1995 productivity challenge, we also met our FY 1995-1998 savings commitments and accelerated some critical cleanup milestones. The challenges continue

  9. A comparative analysis of preprocessing techniques of cardiac event series for the study of heart rhythm variability using simulated signals

    Directory of Open Access Journals (Sweden)

    Guimarães H.N.

    1998-01-01

    Full Text Available In the present study, using noise-free simulated signals, we performed a comparative examination of several preprocessing techniques that are used to transform the cardiac event series into a regularly sampled time series appropriate for spectral analysis of heart rhythm variability (HRV). First, a group of noise-free simulated point event series, which represents a time series of heartbeats, was generated by an integral pulse frequency modulation model. In order to evaluate the performance of the preprocessing methods, the differences between the spectra of the preprocessed simulated signals and the true spectrum (the spectrum of the model input modulating signals) were surveyed by visual analysis and by contrasting merit indices. It is desired that estimated spectra match the true spectrum as closely as possible, showing a minimum of harmonic components and other artifacts. The merit indices proposed to quantify these mismatches were the leakage rate, defined as a measure of leakage components (located outside some narrow windows centered at the frequencies of the model input modulating signals) with respect to the whole spectral components, and the numbers of leakage components with amplitudes greater than 1%, 5% and 10% of the total spectral components. Our data, obtained from a noise-free simulation, indicate that the utilization of heart rate values instead of heart period values in the derivation of signals representative of heart rhythm results in more accurate spectra. Furthermore, our data support the efficiency of the widely used preprocessing technique based on the convolution of inverse interval function values with a rectangular window, and suggest the preprocessing technique based on a cubic polynomial interpolation of inverse interval function values and succeeding spectral analysis as another efficient and fast method for the analysis of HRV signals.
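
    As a hedged illustration of one of the routes discussed above (cubic interpolation of inverse interval values followed by spectral analysis), the sketch below builds a synthetic beat series whose RR intervals are modulated at 0.1 Hz, resamples the inverse interval function on a regular grid, and recovers the modulation frequency from the periodogram. The sampling rate and modulation parameters are arbitrary.

    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.signal import periodogram

    fs = 4.0                                             # resampling frequency (Hz)
    t, beats = 0.0, []
    while t < 300.0:                                     # five minutes of synthetic beats
        rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t)    # RR interval modulated at 0.1 Hz
        t += rr
        beats.append(t)
    beats = np.array(beats)

    inv_ibi = 1.0 / np.diff(beats)                       # inverse interval function (beats/s)
    mid_times = beats[:-1] + np.diff(beats) / 2          # assign each value to the interval midpoint
    grid = np.arange(mid_times[0], mid_times[-1], 1.0 / fs)
    hr = CubicSpline(mid_times, inv_ibi)(grid)           # regularly sampled heart-rate signal

    f, pxx = periodogram(hr - hr.mean(), fs=fs)
    print(round(f[np.argmax(pxx)], 3))                   # peak close to the 0.1 Hz modulation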

  10. Research on Data Preprocessing Technology in Web Log Mining

    Institute of Scientific and Technical Information of China (English)

    杨玉梅

    2014-01-01

    Preprocessing is a key step in Web log mining; its results have a great influence on the rules and patterns produced by the mining algorithm and are critical to ensuring the quality of Web mining. This paper presents DUI technology to enhance the preprocessing. Experiments show that advanced data preprocessing technology can improve the quality of the preprocessing results.

  11. Must losing taxes on saving be harmful?

    DEFF Research Database (Denmark)

    Huizinga, Harry; Nielsen, Søren Bo

    2004-01-01

    Internationalization offers enhanced opportunities for individuals to place savingsabroad and evade domestic saving taxation. This paper asks whether the concomi-tant loss of saving taxation necessarily is harmful. To this end we construct a modelof many symmetric countries in which public goods ...

  12. 16 CFR 460.19 - Savings claims.

    Science.gov (United States)

    2010-01-01

    ... 16 Commercial Practices 1 2010-01-01 2010-01-01 false Savings claims. 460.19 Section 460.19 Commercial Practices FEDERAL TRADE COMMISSION TRADE REGULATION RULES LABELING AND ADVERTISING OF HOME INSULATION § 460.19 Savings claims. (a) If you say or imply in your ads, labels, or other...

  13. Banking Postal Savings Bank in Sight

    Institute of Scientific and Technical Information of China (English)

    WANG PEI

    2006-01-01

    Nine years of controversy regarding a national postal savings bank is expected to finally conclude this year. In April, the China Banking Regulatory Commission (CBRC) announced the establishment of a new department, one of the main functions of which will be to supervise postal savings.

  14. 76 FR 3487 - Truth in Savings

    Science.gov (United States)

    2011-01-20

    ... ADMINISTRATION 12 CFR Part 707 RIN 3133-AD72 Truth in Savings AGENCY: National Credit Union Administration (NCUA). ACTION: Final rule. SUMMARY: On July 22, 2009, NCUA published a final rule amending NCUA's Truth in... through automated systems. This final rule amends NCUA's Truth in Savings rule and official staff...

  15. Can survey participation alter household saving behaviour?

    NARCIS (Netherlands)

    Crossley, Thomas; de Bresser, Jochem; Delaney, L.; Winter, Joachim

    2016-01-01

    We document an effect of survey participation on household saving. Identification comes from random assignment to modules within a population-representative internet panel. The saving measure is based on linked administrative wealth data. Households that responded to a detailed questionnaire on nee

  16. 78 FR 20097 - Energy Savings Performance Contracts

    Science.gov (United States)

    2013-04-03

    ... savings projects where the up-front capital cost is financed by an Energy Services Company (ESCO), who is... capital costs. In an ESPC, a Federal agency contracts with an ESCO, following a comprehensive energy audit conducted by the ESCO of a Federal facility to identify improvements to save energy. In consultation...

  17. Saving Behavior and Portfolio Choice After Retirement

    NARCIS (Netherlands)

    van Ooijen, Raun; Alessie, Rob; Kalwij, Adriaan

    2015-01-01

    This paper reviews the literature on saving behavior and portfolio choice after retirement and provides a descriptive analysis of this behavior by Dutch elderly households. Studying saving behavior in the Netherlands is informative because of the very different institutional background compared to t

  18. The High Cost of Saving Energy Dollars.

    Science.gov (United States)

    Rose, Patricia

    1985-01-01

    In alternative financing a private company provides the capital and expertise for improving school energy efficiency. Savings are split between the school system and the company. Options for municipal leasing, cost sharing, and shared savings are explained along with financial, procedural, and legal considerations. (MLF)

  19. Energy-saving motor; Energiesparmotor

    Energy Technology Data Exchange (ETDEWEB)

    Lindegger, M.

    2002-07-01

    This report for the Swiss Federal Office of Energy (SFOE) describes the development and testing of an advanced electrical motor using a permanent-magnet rotor. The aims of the project - to study the technical feasibility and market potential of the Eco-Motor - are discussed and the three phases of the project described. These include the calculation and realisation of a 250-watt prototype operating at 230 V, the measurement of the motor's characteristics as well as those of a comparable asynchronous motor on the test bed at the University of Applied Science in Lucerne, Switzerland, and a market study to establish if the Eco-Motor and its controller can compete against normal asynchronous motors. Also, the results of an analysis of the energy-savings potential is made, should such Eco-Motors be used. Detailed results of the three phases of the project are presented and the prospects of producing such motors in Switzerland for home use as well as for export are examined.

  1. Saving oil in a hurry

    Energy Technology Data Exchange (ETDEWEB)

    none

    2005-07-01

    During 2004, oil prices reached levels unprecedented in recent years. Though world oil markets remain adequately supplied, high oil prices do reflect increasingly uncertain conditions. Many IEA member countries and non-member countries alike are looking for ways to improve their capability to handle market volatility and possible supply disruptions in the future. This book aims to provide assistance. It provides a new, quantitative assessment of the potential oil savings and costs of rapid oil demand restraint measures for transport. Some measures may make sense under any circumstances; others are primarily useful in emergency situations. All can be implemented on short notice, if governments are prepared. The book examines potential approaches for rapid uptake of telecommuting, 'ecodriving', and car-pooling, among other measures. It also provides methodologies and data that policymakers can use to decide which measures would be best adapted to their national circumstances. This 'tool box' may help countries to complement other measures for coping with supply disruptions, such as use of strategic oil stocks.

  2. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in the catchments, especially in the modeling of peri-urban catchments. The Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and flow networks, allow the representation of these elements in an explicit way, preserving natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topologically oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters, such as the distance between the centroid of an HRU and the river network, and the length of the perimeter, can impact the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing tool implemented in the open source GRASS-GIS software, for which several Python scripts were developed and some algorithms already available, such as the Triangle software, were used. First, some scripts were developed to improve the topology of the various elements, such as snapping of the river network to the closest contours. Where data are derived from remote sensing, such as vegetation areas, their perimeters have many right angles, which were smoothed. Second, the algorithms more particularly address badly shaped elements of the model mesh, such as polygons with narrow shapes, markedly irregular contours and/or a centroid outside the polygon. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
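
    An illustrative computation of a convexity index of the kind mentioned above (polygon area divided by the area of its convex hull), used to flag badly shaped HRU polygons; the 0.8 threshold and the shapely-based implementation are assumptions for the example, not values from the study.

    from shapely.geometry import Polygon

    def convexity_index(polygon):
        return polygon.area / polygon.convex_hull.area

    square = Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])      # compact, convex unit
    dart = Polygon([(0, 0), (10, 0.5), (20, 0), (10, 1.5)])     # elongated, concave unit

    for name, poly in [("square", square), ("dart", dart)]:
        ci = convexity_index(poly)
        print(name, round(ci, 3), "needs reshaping" if ci < 0.8 else "ok")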

  3. Housing-related lifestyle and energy saving

    DEFF Research Database (Denmark)

    Thøgersen, John

    2017-01-01

    A new instrument for measuring housing-related lifestyle (HRL) is introduced and employed for identifying national and cross-national HRL segments in 10 European countries (N = 3190). The identified HRL segments are profiled and the practical importance of HRL for everyday energy-saving efforts...... in the home and for the energy-consumer’s openness to new energy saving opportunities (i.e., energy saving innovativeness) is investigated. The HRL instrument’s 71 items load on 16 dimensions within five lifestyle elements. Multi-group confirmatory factor analysis reveals that the instrument possesses metric...... of relevant background characteristics. A multivariate GLM analysis reveals that when differences in housing-related lifestyles are controlled, neither country of residence nor the interaction between lifestyle and country of residence influence energy saving innovativeness or everyday energy-saving efforts...

  4. Chapter 17: Estimating Net Savings: Common Practices

    Energy Technology Data Exchange (ETDEWEB)

    Violette, D. M.; Rathbun, P.

    2014-09-01

    This chapter focuses on the methods used to estimate net energy savings in evaluation, measurement, and verification (EM&V) studies for energy efficiency (EE) programs. The chapter provides a definition of net savings, which remains an unsettled topic both within the EE evaluation community and across the broader public policy evaluation community, particularly in the context of attribution of savings to particular program. The chapter differs from the measure-specific Uniform Methods Project (UMP) chapters in both its approach and work product. Unlike other UMP resources that provide recommended protocols for determining gross energy savings, this chapter describes and compares the current industry practices for determining net energy savings, but does not prescribe particular methods.

  5. Risk transfer via energy savings insurance

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Evan

    2001-10-01

    Among the key barriers to investment in energy efficiency improvements are uncertainties about attaining projected energy savings and apprehension about potential disputes over these savings. The fields of energy management and risk management are thus intertwined. While many technical methods have emerged to manage performance risks (e.g. building commissioning), financial risk transfer techniques are less developed in the energy management arena than in other more mature segments of the economy. Energy Savings Insurance (ESI) - formal insurance of predicted energy savings - is one method of transferring financial risks away from the facility owner or energy services contractor. ESI offers a number of significant advantages over other forms of financial risk transfer, e.g. savings guarantees or performance bonds. ESI providers manage risk via pre-construction design review as well as post-construction commissioning and measurement and verification of savings. We found that the two most common criticisms of ESI - excessive pricing and onerous exclusions - are not borne out in practice. In fact, if properly applied, ESI can potentially reduce the net cost of energy savings projects by reducing the interest rates charged by lenders, and by increasing the level of savings through quality control. Debt service can also be ensured by matching loan payments to projected energy savings while designing the insurance mechanism so that payments are made by the insurer in the event of a savings shortfall. We estimate a U.S. ESI market potential of $875 million/year in premium income. From an energy-policy perspective, ESI offers a number of potential benefits: ESI transfers performance risk from the balance sheet of the entity implementing the energy savings project, thereby freeing up capital otherwise needed to "self-insure" the savings. ESI reduces barriers to market entry of smaller energy services firms who do not have sufficiently strong balance

  6. Saving electricity in a hurry - update 2011

    Energy Technology Data Exchange (ETDEWEB)

    Pasquier, Sara Bryan

    2011-06-15

    As demonstrated by the March 2011 earthquake and tsunami-triggered blackouts in Japan, electricity shortfalls can happen anytime and anywhere. Countries can minimise the negative economic, social and environmental impacts of such electricity shortfalls by developing emergency energy-saving strategies before a crisis occurs. This new IEA report highlights preliminary findings and conclusions from electricity shortfalls in Japan, the United States, New Zealand, South Africa and Chile. It draws on recent analysis to: reinforce well-established guidelines on diagnosing electricity shortfalls, identifying energy-saving opportunities and selecting a package of energy-saving measures; and highlight proven practice for implementing emergency energy-saving programmes. This paper will be valuable to government, academic, private-sector and civil-society stakeholders who inform, develop and implement electricity policy in general, and emergency energy-saving programmes in particular.

  7. ONU Power Saving Scheme for EPON System

    Science.gov (United States)

    Mukai, Hiroaki; Tano, Fumihiko; Tanaka, Masaki; Kozaki, Seiji; Yamanaka, Hideaki

    PON (Passive Optical Network) achieves FTTH (Fiber To The Home) economically by sharing an optical fiber among multiple subscribers. Recently, global climate change has been recognized as a serious near-term problem, and power saving techniques for electronic devices are therefore important. In PON systems, the ONU (Optical Network Unit) power saving scheme has been studied and defined in XG-PON. In this paper, we propose an ONU power saving scheme for EPON. Then, we present an analysis of the power reduction effect and the data transmission delay caused by the ONU power saving scheme. Based on the analysis, we propose an efficient provisioning method for the ONU power saving scheme which is applicable to both XG-PON and EPON.

  8. Understanding High Saving Rate in China

    Institute of Scientific and Technical Information of China (English)

    Xinhua He; Yongfu Cao

    2007-01-01

    This paper presents a detailed analysis of the Chinese saving rate based on the flow of funds data. It finds that the most widely adopted view of precautionary saving, which is regarded as the top reason for maintaining a high saving rate in China, is misleading because this conclusion is drawn from the household survey data. In fact, the household saving rate has declined dramatically since the mid-1990s, as is observed from the flow of funds framework. The high national saving rate is attributed to the increasing shares of both government and corporation disposable incomes. Insufficient consumption demand is caused by the persistent decrease in the percentage share of household to national disposable income. Government-directed income redistribution urgently needs to be improved to accelerate consumption, which in turn would make the Chinese economy less investment-led and help to reduce the current account surplus.

  9. An adjusted energy-saving quantity calculation method for building energy-efficient retrofit

    Institute of Scientific and Technical Information of China (English)

    王清勤; 孟冲

    2009-01-01

    Aiming at a comprehensive assessment of the energy-saving retrofitting effect on existing buildings, a calculation method is developed to adjust the energy-saving quantity to a standard condition so that results can be compared under the same conditions. A mathematical model, the method theory and the calculation steps are given. Error analysis results show that this method can be applied accurately to practical engineering projects. In a case study of energy-saving quantity assessment before and after retrofitting of a hospital in Shanghai, using the energy simulation software TRNSYS, the detailed application of this method is introduced and analyzed.

  10. Microsoft Office professional 2010 step by step

    CERN Document Server

    Cox, Joyce; Frye, Curtis

    2011-01-01

    Teach yourself exactly what you need to know about using Office Professional 2010-one step at a time! With STEP BY STEP, you build and practice new skills hands-on, at your own pace. Covering Microsoft Word, PowerPoint, Outlook, Excel, Access, Publisher, and OneNote, this book will help you learn the core features and capabilities needed to: create attractive documents, publications, and spreadsheets; manage your e-mail, calendar, meetings, and communications; put your business data to work; develop and deliver great presentations; organize your ideas and notes in one place; connect, share, and accom

  11. Analysis of Landsat8 satellite remote sensing data preprocessing

    Institute of Scientific and Technical Information of China (English)

    祝佳

    2016-01-01

    The Landsat series satellites are resource remote sensing satellites jointly managed by the National Aeronautics and Space Administration and the United States Geological Survey. Over more than 40 years they have provided large quantities of clear and stable image data for earth remote sensing exploration activities. Satellite remote sensing data preprocessing is the first step in obtaining high-quality basic remote sensing images, and it has an important impact on the quality of all subsequent satellite remote sensing products. Focusing on Landsat8 raw data, the space data transmission protocol and data transmission format used for the satellite downlink are analyzed in detail, and the preprocessing steps from frame synchronization, transfer frame parsing, mission data packet parsing and image data extraction through to the generation of Level 0 image products are described. In particular, for Operational Land Imager (OLI) data recorded with lossless data compression, the method and procedure of lossless decompression based on the relevant standards of the Consultative Committee for Space Data Systems (CCSDS) are discussed. The Landsat8 Level 0 image products obtained through this preprocessing provide high-quality basic images for Landsat8 data applications.

  12. Developing Instructional Videotapes Step by Step.

    Science.gov (United States)

    Sweet, Thomas E.

    1990-01-01

    Discusses the eight steps in developing an instructional videotape: planning, brainstorming content, sequencing the storyline, defining the treatment, developing the introduction and conclusion, scripting the video and audio, controlling the production, and specifying the postproduction. (DMM)

  13. Breakthrough Energy Savings with Waterjet Technology

    Energy Technology Data Exchange (ETDEWEB)

    Lee W. Saperstein; R. Larry Grayson; David A. Summers; Jorge Garcia-Joo; Greg Sutton; Mike Woodward; T.P. McNulty

    2007-05-15

    Experiments performed at the University of Missouri-Rolla's Waterjet Laboratory have demonstrated clearly the ability of waterjets to disaggregate, in a single step, four different mineral ores, including ores containing iron, lead and copper products. The study focused mainly on galena-bearing dolomite, a lead ore, and compared the new technology with that of traditional mining and milling to liberate the valuable constituent from the more voluminous host rock. The technical term for the disintegration of the ore to achieve this liberation is comminution. The potential for energy savings, if this process can be improved, is immense. Further, if this separation can be made at the mining face, then the potential energy savings include avoidance of transportation (haulage and hoisting) costs to move, process and store this waste at the surface. The waste can, instead, be disposed into the available cavities within the mine. The savings also include the elimination of the comminution, crushing and grinding, stages in the processing plant. Future prototype developments are intended to determine if high-pressure waterjet mining and processing can be optimized to become cheaper than traditional fragmentation by drilling and blasting and to optimize the separation process. The basic new mining process was illustrated in tests on two local rock types, a low-strength sandstone with hematite inclusions, and a medium to high-strength dolomite commonly used for construction materials. Illustrative testing of liberation of minerals utilized a lead-bearing dolomite, and included a parametric study of the optimal conditions needed to create a size distribution considered best for separation. The target goal was to have 50 percent of the mined material finer than 100 mesh (149 microns). Of the 21 tests that were run, five clearly achieved the target. The samples were obtained as run-of-mine lumps of ore, which exhibited a great deal of heterogeneity within the samples. This

  14. Tools and Databases of the KOMICS Web Portal for Preprocessing, Mining, and Dissemination of Metabolomics Data

    Directory of Open Access Journals (Sweden)

    Nozomu Sakurai

    2014-01-01

    Full Text Available A metabolome—the collection of comprehensive quantitative data on metabolites in an organism—has been increasingly utilized for applications such as data-intensive systems biology, disease diagnostics, biomarker discovery, and assessment of food quality. A considerable number of tools and databases have been developed to date for the analysis of data generated by various combinations of chromatography and mass spectrometry. We report here a web portal named KOMICS (The Kazusa Metabolomics Portal), where the tools and databases that we developed are available for free to academic users. KOMICS includes the tools and databases for preprocessing, mining, visualization, and publication of metabolomics data. Improvements in the annotation of unknown metabolites and dissemination of comprehensive metabolomic data are the primary aims behind the development of this portal. For this purpose, PowerGet and FragmentAlign include a manual curation function for the results of metabolite feature alignments. A metadata-specific wiki-based database, Metabolonote, functions as a hub of web resources related to the submitters' work. This feature is expected to increase citation of the submitters' work, thereby promoting data publication. As an example of the practical use of KOMICS, a workflow for a study on Jatropha curcas is presented. The tools and databases available at KOMICS should contribute to enhanced production, interpretation, and utilization of metabolomic Big Data.

  15. Parametric Study to Improve Subpixel Accuracy of Nitric Oxide Tagging Velocimetry with Image Preprocessing

    Directory of Open Access Journals (Sweden)

    Ravi Teja Vedula

    2017-01-01

    Full Text Available Biacetyl phosphorescence has been the commonly used molecular tagging velocimetry (MTV technique to investigate in-cylinder flow evolution and cycle-to-cycle variations in an optical engine. As the phosphorescence of biacetyl tracer deteriorates in the presence of oxygen, nitrogen was adopted as the working medium in the past. Recently, nitrous oxide MTV technique was employed to measure the velocity profile of an air jet. The authors here plan to investigate the potential application of this technique for engine flow studies. A possible experimental setup for this task indicated different permutations of image signal-to-noise ratio (SNR and laser line width. In the current work, a numerical analysis is performed to study the effect of these two factors on displacement error in MTV image processing. Also, several image filtering techniques were evaluated and the performance of selected filters was analyzed in terms of enhancing the image quality and minimizing displacement errors. The flow displacement error without image preprocessing was observed to be inversely proportional to SNR and directly proportional to laser line width. The mean filter resulted in the smallest errors for line widths smaller than 9 pixels. The effect of filter size on subpixel accuracy showed that error levels increased as the filter size increased.
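
    As an illustration of the kind of image preprocessing evaluated above, the sketch below applies mean (uniform) filters of several kernel sizes to a noisy MTV image before line-centre detection. This is a minimal sketch rather than the authors' code; the image array and filter sizes are placeholders.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def denoise_mtv_image(image, filter_sizes=(3, 5, 7)):
    """Apply mean (uniform) filters of increasing kernel size to a noisy
    MTV image. Small kernels preserve thin tagged lines; larger kernels
    suppress more noise but blur the line profile, which degrades
    subpixel displacement accuracy for wide laser lines."""
    image = np.asarray(image, dtype=float)
    return {size: uniform_filter(image, size=size) for size in filter_sizes}
```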

  16. Classifying human voices by using hybrid SFX time-series preprocessing and ensemble feature selection.

    Science.gov (United States)

    Fong, Simon; Lan, Kun; Wong, Raymond

    2013-01-01

    Voice biometrics is a physiological characteristic that differs for each individual person. Due to this uniqueness, voice classification has found useful applications in classifying speakers' gender, mother tongue or ethnicity (accent), emotion states, identity verification, verbal command control, and so forth. In this paper, we adopt a new preprocessing method named Statistical Feature Extraction (SFX) for extracting important features in training a classification model, based on piecewise transformation treating an audio waveform as a time series. Using SFX we can faithfully remodel the statistical characteristics of the time series; together with spectral analysis, a substantial number of features are extracted in combination. An ensemble is utilized in selecting only the influential features to be used in classification model induction. We focus on the comparison of effects of various popular data mining algorithms on multiple datasets. Our experiment consists of classification tests over four typical categories of human voice data, namely, Female and Male, Emotional Speech, Speaker Identification, and Language Recognition. The experiments yield encouraging results supporting the fact that heuristically choosing significant features from both time and frequency domains indeed produces better performance in voice classification than traditional signal processing techniques alone, like wavelets and LPC-to-CC.
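
    The abstract does not spell out the exact SFX features, so the following is only a hedged sketch of piecewise statistical summarisation of a waveform treated as a time series; the segment count and the specific statistics chosen are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def piecewise_statistical_features(waveform, n_segments=10):
    """Split the waveform into equal segments and describe each one with a
    few simple statistics, concatenated into a single feature vector."""
    waveform = np.asarray(waveform, dtype=float)
    features = []
    for segment in np.array_split(waveform, n_segments):
        features.extend([
            segment.mean(),               # central tendency
            segment.std(),                # spread
            stats.skew(segment),          # asymmetry
            stats.kurtosis(segment),      # peakedness
            np.abs(segment).max(),        # peak amplitude
        ])
    return np.asarray(features)
```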

  17. Pre-Processing of Point-Data from Contact and Optical 3D Digitization Sensors

    Directory of Open Access Journals (Sweden)

    Mirko Soković

    2012-01-01

    Full Text Available Contemporary 3D digitization systems employed by reverse engineering (RE) feature ever-growing scanning speeds with the ability to generate a large quantity of points in a unit of time. Although advantageous for the quality and efficiency of RE modelling, the huge volume of point data can turn into a serious practical problem, later on, when the CAD model is generated. In addition, 3D digitization processes are very often plagued by measuring errors, which can be attributed to the very nature of measuring systems, various characteristics of the digitized objects and subjective errors by the operator, which also contribute to problems in the CAD model generation process. This paper presents an integral system for the pre-processing of point data, i.e., filtering, smoothing and reduction, based on a cross-sectional RE approach. In the course of the proposed system development, major emphasis was placed on the module for point data reduction, which was designed according to a novel approach with integrated deviation analysis and fuzzy logic reasoning. The developed system was verified through its application on three case studies, on point data from objects of versatile geometries obtained by contact and laser 3D digitization systems. The obtained results demonstrate the effectiveness of the system.
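
    As a rough illustration of deviation-based point reduction on one cross-section (the paper's module additionally uses fuzzy-logic reasoning, which is not reproduced here), the following sketch drops points that lie within a tolerance of the chord joining their neighbours; the tolerance value is an assumption.

```python
import numpy as np

def reduce_cross_section(points, tol=0.05):
    """points: ordered N x 2 array of a digitized cross-section.
    Keep a point only if its perpendicular deviation from the chord
    between the last kept point and the next point exceeds `tol`."""
    pts = np.asarray(points, dtype=float)
    keep = [0]
    for i in range(1, len(pts) - 1):
        a, b, p = pts[keep[-1]], pts[i + 1], pts[i]
        chord = b - a
        # Perpendicular distance of p from the line through a and b.
        deviation = abs(chord[0] * (p - a)[1] - chord[1] * (p - a)[0]) \
            / (np.linalg.norm(chord) + 1e-12)
        if deviation > tol:
            keep.append(i)
    keep.append(len(pts) - 1)
    return pts[keep]
```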

  18. Pre-processing Algorithm for Rectification of Geometric Distortions in Satellite Images

    Directory of Open Access Journals (Sweden)

    Narayan Panigrahi

    2011-02-01

    Full Text Available A number of algorithms have been reported to process and remove geometric distortions in satellite images. Ortho-correction, geometric error correction, radiometric error removal, etc. are a few important examples. These algorithms require supplementary meta-information about the satellite images, such as ground control points and correspondence, sensor orientation details, elevation profile of the terrain, etc., to establish the corresponding transformations. In this paper, a pre-processing algorithm has been proposed which removes systematic distortions of a satellite image and thereby removes the blank portion of the image. It is an input-to-output mapping of image pixels, where the transformation computes the coordinate of each output pixel corresponding to the input pixel of an image. The transformation is established by the exact amount of scaling, rotation and translation needed for each pixel in the input image so that the distortion induced during the recording stage is corrected. Defence Science Journal, 2011, 61(2), pp. 174-179. DOI: http://dx.doi.org/10.14429/dsj.61.421
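
    For illustration only: the sketch below shows the inverse-mapping idea described above with a single global affine transform (scaling, rotation, translation). The paper derives the required correction per pixel from the recording geometry; the global parameters and nearest-neighbour resampling used here are simplifying assumptions.

```python
import numpy as np

def rectify_affine(image, scale=1.0, angle_deg=0.0, shift=(0.0, 0.0)):
    """Map every output pixel back into the input image through the
    inverse of a scale-rotate-translate transform and resample with
    nearest-neighbour interpolation. Pixels that map outside the input
    (the 'blank portion') are left at zero."""
    h, w = image.shape
    theta = np.deg2rad(angle_deg)
    A = scale * np.array([[np.cos(theta), -np.sin(theta)],
                          [np.sin(theta),  np.cos(theta)]])
    t = np.asarray(shift, dtype=float)
    A_inv = np.linalg.inv(A)

    # Output pixel grid (x = column, y = row) mapped to input coordinates.
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    out_xy = np.stack([cols.ravel(), rows.ravel()], axis=1).astype(float)
    in_xy = (out_xy - t) @ A_inv.T

    x = np.rint(in_xy[:, 0]).astype(int)
    y = np.rint(in_xy[:, 1]).astype(int)
    valid = (x >= 0) & (x < w) & (y >= 0) & (y < h)

    out = np.zeros_like(image)
    out_flat = out.reshape(-1)
    src_flat = image.reshape(-1)
    out_flat[valid] = src_flat[y[valid] * w + x[valid]]
    return out
```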

  19. A comparative analysis of pre-processing techniques in colour retinal images

    Energy Technology Data Exchange (ETDEWEB)

    Salvatelli, A [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Bizai, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Barbosa, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Drozdowicz, B [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Delrieux, C [Electric and Computing Engineering Department, Universidad Nacional del Sur, Alem 1253, BahIa Blanca, (Partially funded by SECyT-UNS) (Argentina)], E-mail: claudio@acm.org

    2007-11-15

    Diabetic retinopathy (DR) is a chronic disease of the ocular retina, which most of the time is only discovered when the disease is at an advanced stage and most of the damage is irreversible. For that reason, early diagnosis is paramount for avoiding the most severe consequences of DR, of which complete blindness is not uncommon. Unsupervised or supervised image processing of retinal images emerges as a feasible tool for this diagnosis. The preprocessing stages are the key for any further assessment, since these images exhibit several defects, including non-uniform illumination, sampling noise, uneven contrast due to pigmentation loss during sampling, and many others. Any feasible diagnosis system should work with images where these defects were compensated. In this work we analyze and test several correction techniques. Non-uniform illumination is compensated using morphology and homomorphic filtering; uneven contrast is compensated using morphology and local enhancement. We tested our processing stages using Fuzzy C-Means, and the local Hurst (self-correlation) coefficient for unsupervised segmentation of the abnormal blood vessels. The results over a standard set of DR images are more than promising.

  20. Tools and databases of the KOMICS web portal for preprocessing, mining, and dissemination of metabolomics data.

    Science.gov (United States)

    Sakurai, Nozomu; Ara, Takeshi; Enomoto, Mitsuo; Motegi, Takeshi; Morishita, Yoshihiko; Kurabayashi, Atsushi; Iijima, Yoko; Ogata, Yoshiyuki; Nakajima, Daisuke; Suzuki, Hideyuki; Shibata, Daisuke

    2014-01-01

    A metabolome--the collection of comprehensive quantitative data on metabolites in an organism--has been increasingly utilized for applications such as data-intensive systems biology, disease diagnostics, biomarker discovery, and assessment of food quality. A considerable number of tools and databases have been developed to date for the analysis of data generated by various combinations of chromatography and mass spectrometry. We report here a web portal named KOMICS (The Kazusa Metabolomics Portal), where the tools and databases that we developed are available for free to academic users. KOMICS includes the tools and databases for preprocessing, mining, visualization, and publication of metabolomics data. Improvements in the annotation of unknown metabolites and dissemination of comprehensive metabolomic data are the primary aims behind the development of this portal. For this purpose, PowerGet and FragmentAlign include a manual curation function for the results of metabolite feature alignments. A metadata-specific wiki-based database, Metabolonote, functions as a hub of web resources related to the submitters' work. This feature is expected to increase citation of the submitters' work, thereby promoting data publication. As an example of the practical use of KOMICS, a workflow for a study on Jatropha curcas is presented. The tools and databases available at KOMICS should contribute to enhanced production, interpretation, and utilization of metabolomic Big Data.

  1. Preprocessing: Geocoding of AVIRIS data using navigation, engineering, DEM, and radar tracking system data

    Science.gov (United States)

    Meyer, Peter; Larson, Steven A.; Hansen, Earl G.; Itten, Klaus I.

    1993-01-01

    Remotely sensed data have geometric characteristics and representation which depend on the type of the acquisition system used. To correlate such data over large regions with other real world representation tools like conventional maps or Geographic Information System (GIS) for verification purposes, or for further treatment within different data sets, a coregistration has to be performed. In addition to the geometric characteristics of the sensor there are two other dominating factors which affect the geometry: the stability of the platform and the topography. There are two basic approaches for a geometric correction on a pixel-by-pixel basis: (1) A parametric approach using the location of the airplane and inertial navigation system data to simulate the observation geometry; and (2) a non-parametric approach using tie points or ground control points. It is well known that the non-parametric approach is not reliable enough for the unstable flight conditions of airborne systems, and is not satisfying in areas with significant topography, e.g. mountains and hills. The present work describes a parametric preprocessing procedure which corrects effects of flight line and attitude variation as well as topographic influences and is described in more detail by Meyer.

  2. A comparative study on preprocessing techniques in diabetic retinopathy retinal images: illumination correction and contrast enhancement.

    Science.gov (United States)

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques including contrast enhancement and illumination correction on retinal image quality, a comparative study was carried out. We studied and implemented a few illumination correction and contrast enhancement techniques on color retinal images to find out the best technique for optimum image enhancement. To compare and choose the best illumination correction technique we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating the sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating the coefficients of variation. The dividing method using the median filter to estimate background illumination showed the lowest coefficients of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy. The contrast limited adaptive histogram equalization technique has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, including the dividing method using the median filter to estimate background, the quotient-based method and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique, such as CLAHE, to fundus images showed good potential for enhancing vasculature segmentation.
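
    A minimal sketch of the two operations highlighted above, assuming SciPy and scikit-image are available: background estimation with a large median filter followed by division (the "dividing method"), and CLAHE for local contrast enhancement. The kernel size and clip limit are illustrative, not the values used in the study.

```python
import numpy as np
from scipy.ndimage import median_filter
from skimage import exposure

def dividing_illumination_correction(channel, background_size=51):
    """Estimate the slowly varying background of one colour channel with a
    large median filter and divide it out, then rescale to [0, 1]."""
    img = channel.astype(float)
    background = median_filter(img, size=background_size)
    corrected = img / (background + 1e-6)       # avoid division by zero
    corrected -= corrected.min()
    return corrected / (corrected.max() + 1e-12)

def clahe_enhancement(img, clip_limit=0.01):
    """Contrast limited adaptive histogram equalisation (expects [0, 1])."""
    return exposure.equalize_adapthist(img, clip_limit=clip_limit)
```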

  3. Effect of preprocessing olive storage conditions on virgin olive oil quality and composition.

    Science.gov (United States)

    Inarejos-García, Antonio M; Gómez-Rico, Aurora; Desamparados Salvador, M; Fregapane, Giuseppe

    2010-04-28

    The quality of virgin olive oil (VOO) is intimately related to the characteristics and composition of the olive fruit at the moment of its milling. In this study, the determination of suitable olive storage conditions and feasibility of using this preprocessing operation to modulate the sensory taste of VOO are reported. Several olive batches were stored in different conditions (from monolayer up to 60 cm thickness, at 20 and 10 degrees C) for a period of up to three weeks, and the quality and composition of minor constituents, mainly phenols and volatiles, in the corresponding VOO were monitored. Cornicabra cultivar VOO obtained from drupes stored for 5 or 8 days at 20 or 10 degrees C, respectively, retained the "extra virgin" category, according to chemical quality indices, since only small increases in free acidity and peroxide values were observed, and the bitter index of this monovarietal oil was reduced by 30-40%. Storage under monolayer conditions at 10 degrees C for up to two weeks is also feasible because "off-odor" development was delayed, a 50% reduction in bitterness was obtained, and the overall good quality of the final product was preserved.

  4. An Application for Data Preprocessing and Models Extractions in Web Usage Mining

    Directory of Open Access Journals (Sweden)

    Claudia Elena DINUCA

    2011-11-01

    Full Text Available Web servers worldwide generate a vast amount of information on web users' browsing activities. Several researchers have studied these so-called clickstream or web access log data to better understand and characterize web users. The goal of this application is to analyze user behaviour by mining enriched web access log data. With the continued growth and proliferation of e-commerce, web services, and web-based information systems, the volumes of clickstream and user data collected by web-based organizations in their daily operations have reached astronomical proportions. This information can be exploited in various ways, such as enhancing the effectiveness of websites or developing directed web marketing campaigns. The discovered patterns are usually represented as collections of pages, objects, or resources that are frequently accessed by groups of users with common needs or interests. In this paper we focus on how the application for data preprocessing and extraction of different data models from web log data was implemented, using association rules as a data mining technique to extract potentially useful knowledge from web usage data. We find different navigation patterns by analysing the log files of the website. I implemented the application in Java using the NetBeans IDE. For exemplification, I used the log file data from a commercial website, www.nice-layouts.com.

  5. The PREP pipeline: standardized preprocessing for large-scale EEG analysis.

    Science.gov (United States)

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.
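
    The PREP library itself is distributed as MATLAB code at the URL above; the snippet below is only a schematic Python illustration of one idea in the pipeline, flagging unusually noisy channels with a robust z-score before computing the average reference. It is not the PREP implementation, and the threshold is an assumption.

```python
import numpy as np

def robust_average_reference(eeg, z_thresh=5.0):
    """eeg: channels x samples array.
    Flag channels whose amplitude deviates strongly from the group
    (robust z-score of the per-channel standard deviation), then
    re-reference all channels to the mean of the remaining 'good' ones."""
    amplitude = eeg.std(axis=1)
    med = np.median(amplitude)
    mad = np.median(np.abs(amplitude - med)) + 1e-12
    z = 0.6745 * (amplitude - med) / mad
    noisy = np.abs(z) > z_thresh
    reference = eeg[~noisy].mean(axis=0)
    return eeg - reference, np.where(noisy)[0]
```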

  6. Robust preprocessing for stimulus-based functional MRI of the moving fetus.

    Science.gov (United States)

    You, Wonsang; Evangelou, Iordanis E; Zun, Zungho; Andescavage, Nickie; Limperopoulos, Catherine

    2016-04-01

    Fetal motion manifests as signal degradation and image artifact in the acquired time series of blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI) studies. We present a robust preprocessing pipeline to specifically address fetal and placental motion-induced artifacts in stimulus-based fMRI with slowly cycled block design in the living fetus. In the proposed pipeline, motion correction is optimized to the experimental paradigm, and it is performed separately in each phase as well as in each region of interest (ROI), recognizing that each phase and organ experiences different types of motion. To obtain the averaged BOLD signals for each ROI, both misaligned volumes and noisy voxels are automatically detected and excluded, and the missing data are then imputed by statistical estimation based on local polynomial smoothing. Our experimental results demonstrate that the proposed pipeline was effective in mitigating the motion-induced artifacts in stimulus-based fMRI data of the fetal brain and placenta.
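
    A schematic sketch of the exclude-and-impute step for one ROI-averaged time series, assuming a robust z-score for outlier detection and a Savitzky-Golay (local polynomial) fit for imputation; the threshold and window length are illustrative assumptions, not the paper's settings.

```python
import numpy as np
from scipy.signal import savgol_filter

def clean_roi_timeseries(signal, z_thresh=3.0, window=11, polyorder=3):
    """signal: 1-D ROI-averaged BOLD time series.
    Mark outlier time points by robust z-score and replace them with a
    local-polynomial (Savitzky-Golay) estimate; window must be odd."""
    x = np.asarray(signal, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) + 1e-12
    outliers = np.abs(0.6745 * (x - med) / mad) > z_thresh

    # Bridge outliers by linear interpolation, then smooth with a local
    # polynomial fit and copy the smoothed values into the outlier slots.
    idx = np.arange(x.size)
    x_interp = x.copy()
    x_interp[outliers] = np.interp(idx[outliers], idx[~outliers], x[~outliers])
    smoothed = savgol_filter(x_interp, window_length=window, polyorder=polyorder)

    cleaned = x.copy()
    cleaned[outliers] = smoothed[outliers]
    return cleaned, outliers
```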

  7. The Impact of the Preprocessing Methods in Downstream Analysis of Agilent Microarray Data

    Directory of Open Access Journals (Sweden)

    Loredana BĂLĂCESCU

    2015-12-01

    Full Text Available Over the past decades, gene expression microarrays have been used extensively in biomedical research. However, these high-throughput experiments are affected by technical variation and biases introduced at different levels, such as mRNA processing, labeling, hybridization, scanning and/or imaging. Therefore, data preprocessing is important to minimize these systematic errors in order to identify actual biological changes. The aim of this study was to compare all possible combinations of two normalization, four summarization, and two background correction options, using two different foreground estimates. The results show that the background correction of the raw median signal and the summarization methods used here have no impact on downstream analysis. In contrast, the choice of the normalization method influences the results, with quantile normalization leading to a better biological sensitivity of the data. When the Agilent processed signal was considered, regardless of the summarization and normalization options, more differentially expressed genes (DEGs) were consistently identified than when the raw median signal was used. Nevertheless, the greater number of DEGs did not result in an improvement of the biological relevance.
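
    Since quantile normalization comes out as the preferable option, here is a compact sketch of the standard procedure on a probes x arrays matrix; ties are handled naively, and dedicated packages (for example limma in R) do this more carefully.

```python
import numpy as np

def quantile_normalize(expr):
    """expr: probes x arrays matrix of intensities.
    Force every array (column) to share the same empirical distribution:
    sort each column, average across columns rank by rank, and write the
    averaged values back in each column's original order."""
    expr = np.asarray(expr, dtype=float)
    order = np.argsort(expr, axis=0)               # per-array ranking
    reference = np.sort(expr, axis=0).mean(axis=1) # shared distribution
    normalized = np.empty_like(expr)
    for j in range(expr.shape[1]):
        normalized[order[:, j], j] = reference
    return normalized
```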

  8. Saving Tons at the Register

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Karl; Seigel, Jeff; Sherman, Max; Walker, Iain

    1998-05-01

    Duct losses have a significant effect on the efficiency of delivering space cooling to U.S. homes. This effect is especially dramatic during peak demand periods where half of the cooling equipment's output can be wasted. Improving the efficiency of a duct system can save energy, but can also allow for downsizing of cooling equipment without sacrificing comfort conditions. Comfort, and hence occupant acceptability, is determined not only by steady-state temperatures, but by how long it takes to pull down the temperature during cooling start-up, such as when the occupants come home on a hot summer afternoon. Thus the delivered tons of cooling at the register during start-up conditions are critical to customer acceptance of equipment downsizing strategies. We have developed a simulation technique which takes into account such things as weather, heat-transfer (including hot attic conditions), airflow, duct tightness, duct location and insulation, and cooling equipment performance to determine the net tons of cooling delivered to occupied space. Capacity at the register has been developed as an improvement over equipment tonnage as a system sizing measure. We use this concept to demonstrate that improved ducts and better system installation is as important as equipment size, with analysis of pull-down capability as a proxy for comfort. The simulations indicate that an improved system installation including tight ducts can eliminate the need for almost a ton of rated equipment capacity in a typical new 2,000 square foot house in Sacramento, California. Our results have also shown that a good duct system can reduce capacity requirements and still provide equivalent cooling at start-up and at peak conditions.

  9. Can this merger be saved?

    Science.gov (United States)

    Cliffe, S

    1999-01-01

    In this fictional case study, a merger that looked like a marriage made in heaven to those at corporate headquarters is feeling like an infernal union to those on the ground. The merger is between Synergon Capital, a U.S. financial-services behemoth, and Beauchamp, Becker & Company, a venerable British financial-services company with strong profits and an extraordinarily loyal client base of wealthy individuals. Beauchamp also boasts a strong group of senior managers led by Julian Mansfield, a highly cultured and beloved patriarch who personifies all that's good about the company. Synergon isn't accustomed to acquiring such companies. It usually encircles a poorly managed turnaround candidate and then, once the deal is done, drops a neutron bomb on it, leaving file cabinets and contracts but no people. Before acquiring Beauchamp, Synergon's macho men offered loud assurances that they would leave the tradition-bound company alone-provided, of course, that Beauchamp met the ambitious target numbers and showed sufficient enthusiasm for cross-selling Synergon's products to its wealthy clients. In charge of making the acquisition work is Nick Cunningham, one of Synergon's more thoughtful executives. Nick, who was against the deal from the start, is the face and voice of Synergon for Julian Mansfield. And Mansfield, in his restrained way, is angry at the constant flow of bureaucratic forms, at the rude demands for instant information, at the peremptory changes. He's even dropping broad hints at retirement. Nick has already been warned: if Mansfield goes, you go. Six commentators advise Nick on how to save his job by bringing peace and prosperity to the feuding couple.

  10. Saving can save from death anxiety: mortality salience and financial decision-making.

    Directory of Open Access Journals (Sweden)

    Tomasz Zaleskiewicz

    Full Text Available Four studies tested the idea that saving money can buffer death anxiety and constitute a more effective buffer than spending money. Saving can relieve future-related anxiety and provide people with a sense of control over their fate, thereby rendering death thoughts less threatening. Study 1 found that participants primed with both saving and spending reported lower death fear than controls. Saving primes, however, were associated with significantly lower death fear than spending primes. Study 2 demonstrated that mortality primes increase the attractiveness of more frugal behaviors in save-or-spend dilemmas. Studies 3 and 4 found, in two different cultures (Polish and American), that the activation of death thoughts prompts people to allocate money to saving as opposed to spending. Overall, these studies provided evidence that saving protects from existential anxiety, and probably more so than spending.

  11. Saving can save from death anxiety: mortality salience and financial decision-making.

    Science.gov (United States)

    Zaleskiewicz, Tomasz; Gasiorowska, Agata; Kesebir, Pelin

    2013-01-01

    Four studies tested the idea that saving money can buffer death anxiety and constitute a more effective buffer than spending money. Saving can relieve future-related anxiety and provide people with a sense of control over their fate, thereby rendering death thoughts less threatening. Study 1 found that participants primed with both saving and spending reported lower death fear than controls. Saving primes, however, were associated with significantly lower death fear than spending primes. Study 2 demonstrated that mortality primes increase the attractiveness of more frugal behaviors in save-or-spend dilemmas. Studies 3 and 4 found, in two different cultures (Polish and American), that the activation of death thoughts prompts people to allocate money to saving as opposed to spending. Overall, these studies provided evidence that saving protects from existential anxiety, and probably more so than spending.

  12. Step by step: Revisiting step tolling in the bottleneck model

    NARCIS (Netherlands)

    Lindsey, C.R.; Berg, van den V.A.C.; Verhoef, E.T.

    2010-01-01

    In most dynamic traffic congestion models, congestion tolls must vary continuously over time to achieve the full optimum. This is also the case in Vickrey's (1969) 'bottleneck model'. To date, the closest approximations of this ideal in practice have so-called 'step tolls', in which the toll takes o

  13. Electric energy savings from new technologies

    Energy Technology Data Exchange (ETDEWEB)

    Moe, R.J.; Harrer, B.J.; Kellogg, M.A.; Lyke, A.J.; Imhoff, K.L.; Fisher, Z.J.

    1986-01-01

    The purpose of the report is to provide OCEP with information about the electricity-saving potential of new technologies, which it can use in developing alternative long-term projections of US electricity consumption. Low-, base-, and high-case scenarios of the electricity savings for ten technologies were prepared. The total projected annual savings for the year 2000 for all ten technologies were 137 billion kilowatt hours (BkWh), 279 BkWh, and 470 BkWh, respectively, for the three cases. The magnitude of these savings projections can be gauged by comparing them to the Department's reference case projection for the 1985 National Energy Policy Plan. In the Department's reference case, total consumption in 2000 is projected to be 3319 BkWh. Thus, the savings projected here represent between 4% and 14% of total consumption projected for 2000. Because approximately 75% of the base-case estimate of savings is already incorporated into the reference forecast, reducing projected electricity consumption from what it otherwise would have been, the savings estimated here should not be directly subtracted from the reference forecast.

  14. Estimating customer electricity savings from projects installed by the U.S. ESCO industry

    Energy Technology Data Exchange (ETDEWEB)

    Carvallo, Juan Pablo [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Larsen, Peter H. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Goldman, Charles A. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)

    2014-11-25

    The U.S. energy service company (ESCO) industry has a well-established track record of delivering substantial energy and dollar savings in the public and institutional facilities sector, typically through the use of energy savings performance contracts (ESPC) (Larsen et al. 2012; Goldman et al. 2005; Hopper et al. 2005, Stuart et al. 2013). This ~$6.4 billion industry, which is expected to grow significantly over the next five years, may play an important role in achieving demand-side energy efficiency under local/state/federal environmental policy goals. To date, there has been little or no research in the public domain to estimate electricity savings for the entire U.S. ESCO industry. Estimating these savings levels is a foundational step in order to determine total avoided greenhouse gas (GHG) emissions from demand-side energy efficiency measures installed by U.S. ESCOs. We introduce a method to estimate the total amount of electricity saved by projects implemented by the U.S. ESCO industry using the Lawrence Berkeley National Laboratory (LBNL) /National Association of Energy Service Companies (NAESCO) database of projects and LBNL’s biennial industry survey. We report two metrics: incremental electricity savings and savings from ESCO projects that are active in a given year (e.g., 2012). Overall, we estimate that in 2012 active U.S. ESCO industry projects generated about 34 TWh of electricity savings—15 TWh of these electricity savings were for MUSH market customers who did not rely on utility customer-funded energy efficiency programs (see Figure 1). This analysis shows that almost two-thirds of 2012 electricity savings in municipal, local and state government facilities, universities/colleges, K-12 schools, and healthcare facilities (i.e., the so-called “MUSH” market) were not supported by a utility customer-funded energy efficiency program.

  15. Potentials for Heat Savings in Greater Copenhagen

    DEFF Research Database (Denmark)

    Nørgaard, Jørgen; Karlsson, Kenneth

    1998-01-01

    This report describes methodologies for analysing heat saving potentials. The background for the lack of activities in that field is suggested. Various elements of heat savings are described, including changes in daily behaviour and life styles. Definitions of various levels of potentials are suggested. Two scenarios for future heat savings are established, deviating in the rates of renovation, demolition, and construction of buildings, as well as in the thermal insulation standards, ventilation systems, and in the daily behaviour. The results are that compared to the base year 1995, heat...

  16. Household water saving: Evidence from Spain

    Science.gov (United States)

    Aisa, Rosa; Larramona, Gemma

    2012-12-01

    This article focuses on household water use in Spain by analyzing the influence of a detailed set of factors. We find that, although the presence of both water-saving equipment and water-conservation habits leads to water savings, the factors that influence each are not the same. In particular, our results show that those individuals most committed to the adoption of water-saving equipment and, at the same time, less committed to water-conservation habits tend to have higher incomes.

  17. Energy Savings Performance Contracts (ESPC): FEMP Assistance

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-11-01

    An ESPC is a working relationship between a Federal agency and an energy service company (ESCO). The ESCO conducts a comprehensive energy audit for the Federal facility and identifies improvements to save energy. In consultation with the Federal agency, the ESCO designs and constructs a project that meets the agency’s needs and arranges the necessary funding. The ESCO guarantees the improvements will generate energy cost savings sufficient to pay for the project over the term of the contract. After the contract ends, all additional cost savings accrue to the agency.

  18. Electricity Saving Relies on Mechanism Innovation

    Institute of Scientific and Technical Information of China (English)

    Ye Chun; Ye Qing

    2007-01-01

    Energy saving is a vitally important issue concerning the sustainable development of the social economy, including national economic security, international competitiveness, resources conservation and environmental protection. Energy saving has now become one of the most important national strategies of China in the 21st century. The author believes that electricity saving requires mechanism innovation to realize a new type of management system, that is, to stipulate quotas and efficiency standards for power utilization and to set up incentive and restrictive mechanisms that use the electricity price as a lever, so as to strengthen the guiding role of the electricity price on power demand.

  19. A Novel Semantically-Time-Referrer based Approach of Web Usage Mining for Improved Sessionization in Pre-Processing of Web Log

    Directory of Open Access Journals (Sweden)

    Navjot Kaur

    2017-01-01

    Full Text Available Web usage mining (WUM), also known as web log mining, is the application of data mining techniques to large volumes of data in order to extract useful and interesting user behaviour patterns from web logs, so as to improve web-based applications. This paper aims to improve data discovery by mining the usage data from log files. The work is done in three phases. The first and second phases, which are data cleaning and user identification respectively, are completed using traditional methods. The third phase, session identification, is done using three different methods. The main focus of this paper is on the sessionization of the log file, which is a critical step for extracting usage patterns. The proposed referrer-time and semantically-time-referrer methods overcome the limitations of traditional methods. The main advantage of the pre-processing model presented in this paper over other methods is that it can process a text or Excel log file of any format. The experiments are performed on three different log files and indicate that the proposed semantically-time-referrer based heuristic approach achieves better results than the traditional time and referrer-time based methods. The proposed methods are not complex to use. The web log files are collected from different servers and contain the public information of visitors. In addition, this paper also discusses different types of web log formats.
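
    To make the sessionization step concrete, the sketch below implements a simple time-plus-referrer heuristic over cleaned, user-identified log entries. It illustrates the general idea rather than the paper's semantically-time-referrer algorithm, and the 30-minute timeout and field names are assumptions.

```python
from datetime import timedelta

def sessionize(entries, timeout_minutes=30):
    """entries: list of dicts with 'ip', 'time' (datetime) and 'referrer',
    already cleaned and sorted by (ip, time). Start a new session when the
    gap since the user's previous request exceeds the timeout, or when the
    request arrives with an empty/direct referrer."""
    sessions, current = [], {}
    timeout = timedelta(minutes=timeout_minutes)
    for entry in entries:
        user = entry["ip"]
        prev = current.get(user)
        new_session = (
            prev is None
            or entry["time"] - prev[-1]["time"] > timeout
            or entry.get("referrer", "-") in ("-", "")
        )
        if new_session:
            current[user] = [entry]
            sessions.append(current[user])
        else:
            prev.append(entry)
    return sessions
```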

  20. Optimized data preprocessing for multivariate analysis applied to 99mTc-ECD SPECT data sets of Alzheimer's patients and asymptomatic controls.

    Science.gov (United States)

    Merhof, Dorit; Markiewicz, Pawel J; Platsch, Günther; Declerck, Jerome; Weih, Markus; Kornhuber, Johannes; Kuwert, Torsten; Matthews, Julian C; Herholz, Karl

    2011-01-01

    Multivariate image analysis has shown potential for classification between Alzheimer's disease (AD) patients and healthy controls with high diagnostic performance. As image analysis of positron emission tomography (PET) and single photon emission computed tomography (SPECT) data critically depends on appropriate data preprocessing, the focus of this work is to investigate the impact of data preprocessing on the outcome of the analysis, and to identify an optimal data preprocessing method. In this work, technetium-99m ethyl cysteinate dimer ((99m)Tc-ECD) SPECT data sets of 28 AD patients and 28 asymptomatic controls were used for the analysis. For a series of different data preprocessing methods, which includes methods for spatial normalization, smoothing, and intensity normalization, multivariate image analysis based on principal component analysis (PCA) and Fisher discriminant analysis (FDA) was applied. Bootstrap resampling was used to investigate the robustness of the analysis and the classification accuracy, depending on the data preprocessing method. Depending on the combination of preprocessing methods, significant differences regarding the classification accuracy were observed. For (99m)Tc-ECD SPECT data, the optimal data preprocessing method in terms of robustness and classification accuracy is based on affine registration, smoothing with a Gaussian of 12 mm full width at half maximum, and intensity normalization based on the 25% brightest voxels within the whole-brain region.
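
    A hedged sketch of the two preprocessing steps singled out as optimal, Gaussian smoothing at 12 mm FWHM and intensity normalization to the 25% brightest voxels; for simplicity the brightest voxels are taken over the whole volume here rather than a proper whole-brain mask, and the voxel size is a placeholder.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_and_normalize(volume, voxel_size_mm=2.0, fwhm_mm=12.0):
    """Smooth a SPECT volume with a Gaussian of the given FWHM and scale
    intensities so that the mean of the 25% brightest voxels equals 1."""
    # Convert FWHM in mm to a Gaussian sigma in voxel units.
    sigma_vox = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0))) / voxel_size_mm
    smoothed = gaussian_filter(np.asarray(volume, dtype=float), sigma=sigma_vox)
    brightest = np.sort(smoothed.ravel())[int(0.75 * smoothed.size):]
    return smoothed / brightest.mean()
```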

  1. Quality assessment of baby food made of different pre-processed organic raw materials under industrial processing conditions.

    Science.gov (United States)

    Seidel, Kathrin; Kahl, Johannes; Paoletti, Flavio; Birlouez, Ines; Busscher, Nicolaas; Kretzschmar, Ursula; Särkkä-Tirkkonen, Marjo; Seljåsen, Randi; Sinesio, Fiorella; Torp, Torfinn; Baiamonte, Irene

    2015-02-01

    The market for processed food is rapidly growing. The industry needs methods for "processing with care" leading to high quality products in order to meet consumers' expectations. Processing influences the quality of the finished product through various factors. In carrot baby food, these are the raw material, the pre-processing and storage treatments as well as the processing conditions. In this study, a quality assessment was performed on baby food made from different pre-processed raw materials. The experiments were carried out under industrial conditions using fresh, frozen and stored organic carrots as raw material. Statistically significant differences were found for sensory attributes among the three autoclaved puree samples (e.g. overall odour F = 90.72, p food.

  2. lop-DWI: A Novel Scheme for Pre-Processing of Diffusion Weighted Images in the Gradient Direction Domain

    Directory of Open Access Journals (Sweden)

    Farshid eSepehrband

    2015-01-01

    Full Text Available We describe and evaluate a pre-processing method based on a periodic spiral sampling of diffusion gradient directions for high angular resolution diffusion magnetic resonance imaging. Our pre-processing method incorporates prior knowledge about the acquired diffusion-weighted signal, facilitating noise reduction. Periodic spiral sampling of gradient direction encodings results in an acquired signal in each voxel that is pseudo periodic with characteristics that allow separation of low-frequency signal from high frequency noise. Consequently, it enhances local reconstruction of the orientation distribution function used to define fibre tracks in the brain. Denoising with periodic spiral sampling was tested using synthetic data and in vivo human brain images. The level of improvement in signal-to-noise ratio and in the accuracy of local reconstruction of fibre tracks was significantly improved using our method.

  3. lop-DWI: A Novel Scheme for Pre-Processing of Diffusion-Weighted Images in the Gradient Direction Domain.

    Science.gov (United States)

    Sepehrband, Farshid; Choupan, Jeiran; Caruyer, Emmanuel; Kurniawan, Nyoman D; Gal, Yaniv; Tieng, Quang M; McMahon, Katie L; Vegh, Viktor; Reutens, David C; Yang, Zhengyi

    2014-01-01

    We describe and evaluate a pre-processing method based on a periodic spiral sampling of diffusion-gradient directions for high angular resolution diffusion magnetic resonance imaging. Our pre-processing method incorporates prior knowledge about the acquired diffusion-weighted signal, facilitating noise reduction. Periodic spiral sampling of gradient direction encodings results in an acquired signal in each voxel that is pseudo-periodic with characteristics that allow separation of low-frequency signal from high frequency noise. Consequently, it enhances local reconstruction of the orientation distribution function used to define fiber tracks in the brain. Denoising with periodic spiral sampling was tested using synthetic data and in vivo human brain images. The level of improvement in signal-to-noise ratio and in the accuracy of local reconstruction of fiber tracks was significantly improved using our method.

  4. PRACTICAL RECOMMENDATIONS OF DATA PREPROCESSING AND GEOSPATIAL MEASURES FOR OPTIMIZING THE NEUROLOGICAL AND OTHER PEDIATRIC EMERGENCIES MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ionela MANIU

    2017-08-01

    Full Text Available Time management, optimal and timely determination of emergency severity, as well as optimizing the use of available human and material resources are crucial areas of emergency services. A starting point for achieving these optimizations is the analysis and preprocessing of real data from the emergency services. The benefits of this approach consist in exposing more useful structures to data modelling algorithms, which consequently reduces overfitting and improves accuracy. This paper aims to offer practical recommendations for data preprocessing measures, including feature selection and discretization of numeric attributes regarding age, duration of the case, season, period, week period (workday, weekend) and geospatial location of neurological and other pediatric emergencies. An analytical, retrospective study was conducted on a sample consisting of 933 pediatric cases from UPU-SMURD Sibiu, over the 01.01.2014 – 27.02.2017 period.
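
    As a concrete, purely illustrative example of the recommended discretization and temporal feature derivation, the snippet below bins age into pediatric groups and derives weekend and season flags with pandas; the column names, bin edges and toy values are hypothetical and are not taken from the study.

```python
import pandas as pd

# Hypothetical cases; real attribute names are not given in the abstract.
cases = pd.DataFrame({
    "age_years": [0.5, 2, 6, 11, 15],
    "duration_min": [12, 45, 30, 90, 20],
    "arrival": pd.to_datetime(["2014-01-03 08:10", "2014-07-21 23:40",
                               "2015-03-02 14:05", "2016-12-24 03:30",
                               "2017-02-10 19:55"]),
})

# Discretize age into pediatric groups and derive temporal features.
cases["age_group"] = pd.cut(cases["age_years"],
                            bins=[0, 1, 3, 6, 12, 18],
                            labels=["infant", "toddler", "preschool",
                                    "school", "adolescent"])
cases["weekend"] = cases["arrival"].dt.dayofweek >= 5
cases["season"] = cases["arrival"].dt.month % 12 // 3   # 0=winter ... 3=autumn
```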

  5. Effects of different correlation metrics and preprocessing factors on small-world brain functional networks: a resting-state functional MRI study.

    Directory of Open Access Journals (Sweden)

    Xia Liang

    Full Text Available Graph theoretical analysis of brain networks based on resting-state functional MRI (R-fMRI) has attracted a great deal of attention in recent years. These analyses often involve the selection of correlation metrics and specific preprocessing steps. However, the influence of these factors on the topological properties of functional brain networks has not been systematically examined. Here, we investigated the influences of correlation metric choice (Pearson's correlation versus partial correlation), global signal presence (regressed or not) and frequency band selection [slow-5 (0.01-0.027 Hz) versus slow-4 (0.027-0.073 Hz)] on the topological properties of both binary and weighted brain networks derived from them, and we employed test-retest (TRT) analyses for further guidance on how to choose the "best" network modeling strategy from the reliability perspective. Our results show significant differences in global network metrics associated with both correlation metrics and global signals. Analysis of nodal degree revealed differing hub distributions for brain networks derived from Pearson's correlation versus partial correlation. TRT analysis revealed that the reliability of both global and local topological properties is modulated by correlation metrics and the global signal, with the highest reliability observed for Pearson's-correlation-based brain networks without global signal removal (WOGR-PEAR). The nodal reliability exhibited a spatially heterogeneous distribution wherein regions in association and limbic/paralimbic cortices showed moderate TRT reliability in Pearson's-correlation-based brain networks. Moreover, we found that there were significant frequency-related differences in topological properties of WOGR-PEAR networks, and brain networks derived in the 0.027-0.073 Hz band exhibited greater reliability than those in the 0.01-0.027 Hz band. Taken together, our results provide direct evidence regarding the influences of correlation metrics
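
    For readers unfamiliar with the two correlation metrics compared here, the sketch below computes both connectivity matrices from regional time series; the partial correlation is derived from the inverse covariance (precision) matrix, which is one common estimator among several and is used here only as an illustration.

```python
import numpy as np

def connectivity_matrices(ts):
    """ts: time points x regions array of regional BOLD time series.
    Returns the Pearson correlation matrix and a partial-correlation
    matrix obtained from the precision (inverse covariance) matrix."""
    pearson = np.corrcoef(ts, rowvar=False)

    precision = np.linalg.pinv(np.cov(ts, rowvar=False))
    scale = np.sqrt(np.outer(np.diag(precision), np.diag(precision)))
    partial = -precision / scale          # partial_ij = -p_ij / sqrt(p_ii p_jj)
    np.fill_diagonal(partial, 1.0)
    return pearson, partial
```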

  6. A signal pre-processing algorithm designed for the needs of hardware implementation of neural classifiers used in condition monitoring

    DEFF Research Database (Denmark)

    Dabrowski, Dariusz; Hashemiyan, Zahra; Adamczyk, Jan

    2015-01-01

    and bucket wheel excavators. In this paper, a signal pre-processing algorithm designed for condition monitoring of planetary gears working in non-stationary operation is presented. The algorithm is dedicated for hardware implementation on Field Programmable Gate Arrays (FPGAs). The purpose of the algorithm......%, it can be performed in real-time conditions and its implementation does not require many resources of FPGAs....

  7. The Effect of LC-MS Data Preprocessing Methods on the Selection of Plasma Biomarkers in Fed vs. Fasted Rats.

    Science.gov (United States)

    Gürdeniz, Gözde; Kristensen, Mette; Skov, Thomas; Dragsted, Lars O

    2012-01-18

    The metabolic composition of plasma is affected by the time passed since the last meal and by individual variation in metabolite clearance rates. Rat plasma in fed and fasted states was analyzed with liquid chromatography quadrupole-time-of-flight mass spectrometry (LC-QTOF) for an untargeted investigation of these metabolite patterns. The dataset was used to investigate the effect of data preprocessing on biomarker selection using three different software packages, MarkerLynxTM, MZmine, and XCMS, along with a customized preprocessing method that performs binning of m/z channels followed by summation through retention time. Direct comparison of selected features representing the fed or fasted state showed large differences between the software packages. Many false positive markers were obtained from custom data preprocessing compared with the dedicated software packages, while MarkerLynxTM provided better coverage of markers. However, marker selection was more reliable with the gap filling (or peak finding) algorithms present in MZmine and XCMS. Further identification of the putative markers revealed that many of the differences between the markers selected were due to variations in features representing adducts or daughter ions of the same metabolites or of compounds from the same chemical subclasses, e.g., lyso-phosphatidylcholines (LPCs) and lyso-phosphatidylethanolamines (LPEs). We conclude that despite considerable differences in the performance of the preprocessing tools we could extract the same biological information with any of them. Carnitine, branched-chain amino acids, LPCs and LPEs were identified by all methods as markers of the fed state, whereas acetylcarnitine was abundant during fasting in rats.
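
    The customized preprocessing mentioned above (binning of m/z channels followed by summation through retention time) can be sketched as below for centroided peak lists; the m/z range and bin width are placeholders, not the values used in the study.

```python
import numpy as np

def bin_and_sum(mz, intensity, mz_min=100.0, mz_max=1000.0, bin_width=0.01):
    """mz, intensity: 1-D arrays of detected ion m/z values and intensities
    for one sample. Bin every ion by m/z and sum its intensity across the
    whole retention-time axis, yielding one feature vector per sample."""
    edges = np.arange(mz_min, mz_max + bin_width, bin_width)
    features, _ = np.histogram(mz, bins=edges, weights=intensity)
    return features
```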

  8. HapMap filter 1.0: A tool to preprocess the HapMap genotypic data for association studies

    OpenAIRE

    2008-01-01

    The International HapMap Project provides a resource of genotypic data on single nucleotide polymorphisms (SNPs), which can be used in various association studies to identify the genetic determinants for phenotypic variations. Prior to the association studies, the HapMap dataset should be preprocessed in order to reduce the computation time and control the multiple testing problem. The less informative SNPs including those with very low genotyping rate and SNPs with rare minor allele frequenc...

  9. BGP Ltd Adopts Energy-saving Technology

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    An exploration subsidiary of China National Petroleum Corporation (CNPC), the country's largest oil company, has agreed to use energy-saving technology developed by a Beijing firm in an attempt to slash costs.

  10. USAID IT Reform Cost Savings/Avoidance

    Data.gov (United States)

    US Agency for International Development — The Office of the Chief Information Officer in the Management Bureau of USAID launched initiatives designed for IT cost savings and avoidance. This dataset includes...

  11. The School Advanced Ventilation Engineering Software (SAVES)

    Science.gov (United States)

    The School Advanced Ventilation Engineering Software (SAVES) package is a tool to help school designers assess the potential financial payback and indoor humidity control benefits of Energy Recovery Ventilation (ERV) systems for school applications.

  12. Statistical Uncertainty in the Medicare Shared Savings...

    Data.gov (United States)

    U.S. Department of Health & Human Services — According to analysis reported in Statistical Uncertainty in the Medicare Shared Savings Program published in Volume 2, Issue 4 of the Medicare and Medicaid Research...

  13. EFFECTIVE SAVINGS IN PRODUCTION TIMES AND COST

    African Journals Online (AJOL)

    ES OBE

    THROUGH MONITORING OF THE EFFECT OF ADDITIVES AND ... planning, using the in-depth knowledge of gel times, can there be a saving in production times and prevention of material ..... production lines staff handlay-up laminators and ...

  14. GSA IT Reform Cost Savings/Avoidance

    Data.gov (United States)

    General Services Administration — GSA IT provides data related to Agency IT initiatives that save or avoid expenditures. This data is provided as a requirement of OMB's Integrated Data Collection...

  15. Must losing taxes on saving be harmful?

    DEFF Research Database (Denmark)

    Huizinga, Harry; Nielsen, Søren Bo

    2004-01-01

    are financed by taxes on saving and investment. There is international cross-ownership of firms, and countries are assumed to be unable to tax away pure profits. Countries then face an incentive to impose a rather high investment tax also borne by foreigners. In this setting, the loss of the saving tax instrument...... on account of international tax evasion may prevent the overall saving-investment tax wedge from becoming too high, and hence may be beneficial for moderate preferences for public goods. A world with 'high-spending' governments, in contrast, is made worse off by the loss of saving taxes, and hence stands...... to gain from international cooperation to restore saving taxation. JEL-Classification: H87, H21. Keywords: Capital income taxation, cross-ownership, coordination...

  16. Energy Savings Thanks to French Textile Machinery

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The French Textile Machinery Manufacturers’ Association (UCMTF) has presented, during a seminar it organized for textile professionals and students, the spectacular energy savings achieved thanks to state of the art machinery.

  17. Energy Savings Thanks to French Textile Machinery

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    The French Textile Machinery Manufacturers' Association (UCMTF) has presented, during a seminar it organized for textile professionals and students, the spectacular energy savings achieved thanks to state of the art machinery.

  18. Glaucoma: Screening Can Save Your Sight!

    Science.gov (United States)

    People with glaucoma see the world through a tunnel. Glaucoma is ...

  19. Step by Step Microsoft Office Visio 2003

    CERN Document Server

    Lemke, Judy

    2004-01-01

    Experience learning made easy-and quickly teach yourself how to use Visio 2003, the Microsoft Office business and technical diagramming program. With STEP BY STEP, you can take just the lessons you need, or work from cover to cover. Either way, you drive the instruction-building and practicing the skills you need, just when you need them! Produce computer network diagrams, organization charts, floor plans, and more; use templates to create new diagrams and drawings quickly; add text, color, and 1-D and 2-D shapes; insert graphics and pictures, such as company logos; connect shapes to create a basic f

  20. Performance Comparison of Several Pre-Processing Methods in a Hand Gesture Recognition System based on Nearest Neighbor for Different Background Conditions

    Directory of Open Access Journals (Sweden)

    Iwan Setyawan

    2012-12-01

    Full Text Available This paper presents a performance analysis and comparison of several pre-processing methods used in a hand gesture recognition system. The pre-processing methods are based on the combinations of several image processing operations, namely edge detection, low pass filtering, histogram equalization, thresholding and desaturation. The hand gesture recognition system is designed to classify an input image into one of six possible classes. The input images are taken with various background conditions. Our experiments showed that the best result is achieved when the pre-processing method consists of only a desaturation operation, achieving a classification accuracy of up to 83.15%.
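
    A minimal sketch of the desaturation operation that performed best above, assuming standard ITU-R BT.601 luminance weights (the paper's exact operation may differ):

```python
import numpy as np

def desaturate(rgb_image):
    """Convert an H x W x 3 RGB image (uint8 or float) to a single-channel
    grayscale image using the common ITU-R BT.601 luminance weights."""
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb_image[..., :3].astype(float) @ weights).astype(rgb_image.dtype)

# the desaturated image would then be passed to the nearest-neighbor classifier
```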

  1. Novel low-power ultrasound digital preprocessing architecture for wireless display.

    Science.gov (United States)

    Levesque, Philippe; Sawan, Mohamad

    2010-03-01

    A complete hardware-based ultrasound preprocessing unit (PPU) is presented as an alternative to available power-hungry devices. Intended to expand the ultrasonic applications, the proposed unit allows replacement of the cable of the ultrasonic probe by a wireless link to transfer data from the probe to a remote monitor. The digital back-end architecture of this PPU is fully pipelined, which permits sampling of ultrasonic signals at a frequency equal to the field-programmable gate array-based system clock, up to 100 MHz. Experimental results show that the proposed processing unit has excellent performance, equivalent to 53.15 Dhrystone 2.1 MIPS/MHz (DMIPS/MHz), compared with other software-based architectures that allow a maximum of 1.6 DMIPS/MHz. In addition, an adaptive subsampling method is proposed to operate the pixel compressor, which allows real-time image zooming and, by removing high-frequency noise, the lateral and axial resolutions are enhanced by 25% and 33%, respectively. Real-time images, acquired from a reference phantom, validated the feasibility of the proposed architecture. For a display rate of 15 frames per second, and a 5-MHz single-element piezoelectric transducer, the proposed digital PPU requires a dynamic power of only 242 mW, which represents around 20% of the best-available software-based system. Furthermore, composed of the ultrasound processor and the image interpolation unit, the digital processing core of the PPU presents good power-performance ratios of 26 DMIPS/mW and 43.9 DMIPS/mW at a 20-MHz and 100-MHz sample frequency, respectively.

  2. Functional MRI preprocessing in lesioned brains: manual versus automated region of interest analysis

    Directory of Open Access Journals (Sweden)

    Kathleen A Garrison

    2015-09-01

    Full Text Available Functional magnetic resonance imaging has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant’s structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant’s non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise but may provide a more accurate estimate of brain response. In this study, we directly compare commonly used automated and manual approaches to ROI analysis by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. We found a significant difference in task-related effect size and percent activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design.

  3. MeteoIO 2.4.2: a preprocessing library for meteorological data

    Directory of Open Access Journals (Sweden)

    M. Bavay

    2014-12-01

    Full Text Available Using numerical models which require large meteorological data sets is sometimes difficult and problems can often be traced back to the Input/Output functionality. Complex models are usually developed by the environmental sciences community with a focus on the core modelling issues. As a consequence, the I/O routines that are costly to properly implement are often error-prone, lacking flexibility and robustness. With the increasing use of such models in operational applications, this situation ceases to be simply uncomfortable and becomes a major issue. The MeteoIO library has been designed for the specific needs of numerical models that require meteorological data. The whole task of data preprocessing has been delegated to this library, namely retrieving, filtering and resampling the data if necessary as well as providing spatial interpolations and parameterizations. The focus has been to design an Application Programming Interface (API) that (i) provides a uniform interface to meteorological data in the models, (ii) hides the complexity of the processing taking place, and (iii) guarantees a robust behaviour in the case of format errors, erroneous or missing data. Moreover, in an operational context, this error handling should avoid unnecessary interruptions in the simulation process. A strong emphasis has been put on simplicity and modularity in order to make it extremely easy to support new data formats or protocols and to allow contributors with diverse backgrounds to participate. This library is also regularly evaluated for computing performance and further optimized where necessary. Finally, it is released under an Open Source license and is available at http://models.slf.ch/p/meteoio. This paper gives an overview of the MeteoIO library from the point of view of conceptual design, architecture, features and computational performance. A scientific evaluation of the produced results is not given here since the scientific algorithms that are used

  4. Hardware Design and Implementation of a Wavelet De-Noising Procedure for Medical Signal Preprocessing

    Directory of Open Access Journals (Sweden)

    Szi-Wen Chen

    2015-10-01

    Full Text Available In this paper, a discrete wavelet transform (DWT) based de-noising method and its application to noise reduction for medical signal preprocessing are introduced. This work focuses on the hardware realization of a real-time wavelet de-noising procedure. The proposed de-noising circuit mainly consists of three modules: a DWT, a thresholding, and an inverse DWT (IDWT) modular circuit. We also proposed a novel adaptive thresholding scheme and incorporated it into our wavelet de-noising procedure. Performance was then evaluated on the architectural designs of both the software and the hardware. In addition, the de-noising circuit was also implemented by downloading the Verilog codes to a field programmable gate array (FPGA) based platform so that its ability in noise reduction may be further validated in actual practice. Simulation experiment results produced by applying a set of simulated noise-contaminated electrocardiogram (ECG) signals to the de-noising circuit showed that the circuit could not only desirably meet the requirement of real-time processing, but also achieve satisfactory performance for noise reduction, while the sharp features of the ECG signals can be well preserved. The proposed de-noising circuit was further synthesized using the Synopsys Design Compiler with an Artisan Taiwan Semiconductor Manufacturing Company (TSMC, Hsinchu, Taiwan) 40 nm standard cell library. The integrated circuit (IC) synthesis simulation results showed that the proposed design can achieve a clock frequency of 200 MHz, and the power consumption was only 17.4 mW when operated at 200 MHz.
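
    For readers who want to prototype the software side of the DWT–thresholding–IDWT chain before committing to hardware, a hedged sketch using PyWavelets is shown below; the db4 wavelet, decomposition level and universal-threshold rule are assumptions, and the paper's adaptive thresholding scheme is not reproduced.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4):
    """DWT -> soft-threshold detail coefficients -> IDWT."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # universal threshold estimated from the finest detail level (a common heuristic)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(signal)))
    denoised = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(denoised, wavelet)[: len(signal)]

# example: noisy synthetic ECG-like trace
t = np.linspace(0, 1, 1024)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.2 * np.random.randn(t.size)
filtered = wavelet_denoise(noisy)
```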

  5. Detection of epileptic seizure in EEG signals using linear least squares preprocessing.

    Science.gov (United States)

    Roshan Zamir, Z

    2016-09-01

    An epileptic seizure is a transient event of abnormal excessive neuronal discharge in the brain. This unwanted event can be obstructed by detection of electrical changes in the brain that happen before the seizure takes place. The automatic detection of seizures is necessary since the visual screening of EEG recordings is a time-consuming task and requires experts to improve the diagnosis. Much of the prior research in detection of seizures has been developed based on artificial neural networks, genetic programming, and wavelet transforms. Although the highest achieved accuracy for classification is 100%, there are drawbacks, such as the existence of unbalanced datasets and the lack of investigation into performance consistency. To address these, four linear least squares-based preprocessing models are proposed to extract key features of an EEG signal in order to detect seizures. The first two models are newly developed. The original signal (EEG) is approximated by a sinusoidal curve. Its amplitude is formed by a polynomial function and compared with the predeveloped spline function. Different statistical measures, namely classification accuracy, true positive and negative rates, false positive and negative rates and precision, are utilised to assess the performance of the proposed models. These metrics are derived from confusion matrices obtained from classifiers. Different classifiers are used over the original dataset and the set of extracted features. The proposed models significantly reduce the dimension of the classification problem and the computational time while the classification accuracy is improved in most cases. The first and third models are promising feature extraction methods with a classification accuracy of 100%. Logistic, LazyIB1, LazyIB5, and J48 are the best classifiers. Their true positive and negative rates are 1 while false positive and negative rates are 0 and the corresponding precision values are 1. Numerical results suggest that these
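
    One plausible reading of the first model (a sinusoid whose amplitude is a polynomial in time, fitted by linear least squares at a fixed frequency) is sketched below; the frequency, polynomial degree and sampling rate are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def sinusoid_poly_fit(signal, freq, poly_degree=2, fs=256.0):
    """Approximate a signal by a sinusoid whose amplitude varies as a polynomial in time,
    solved as a linear least-squares problem (the frequency is assumed fixed here)."""
    t = np.arange(len(signal)) / fs            # sampling rate in Hz is arbitrary here
    cols = []
    for k in range(poly_degree + 1):
        cols.append((t ** k) * np.sin(2 * np.pi * freq * t))
        cols.append((t ** k) * np.cos(2 * np.pi * freq * t))
    design = np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design, signal, rcond=None)
    return coeffs, design @ coeffs             # coefficients become the extracted features

# e.g. features, approximation = sinusoid_poly_fit(eeg_segment, freq=10.0)
```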

  6. New supervised alignment method as a preprocessing tool for chromatographic data in metabolomic studies.

    Science.gov (United States)

    Struck, Wiktoria; Wiczling, Paweł; Waszczuk-Jankowska, Małgorzata; Kaliszan, Roman; Markuszewski, Michał Jan

    2012-09-21

    The purpose of this work was to develop a new aligning algorithm called supervised alignment and to compare its performance with the correlation optimized warping. The supervised alignment is based on a "supervised" selection of a few common peaks present in each chromatogram. The selected peaks are aligned based on the difference in retention time of the selected analytes between the sample and the reference chromatogram. The retention times of the fragments between known peaks are subsequently linearly interpolated. The performance of the proposed algorithm has been tested on a series of simulated and experimental chromatograms. The simulated chromatograms comprised analytes with systematic or random retention time shifts. The experimental chromatographic (RP-HPLC) data have been obtained during the analysis of nucleosides from 208 urine samples and consist of both systematic and random displacements. All the data sets have been aligned using the correlation optimized warping and the supervised alignment. The time required to complete the alignment, the overall complexity of both algorithms, and their performance measured by the average correlation coefficients are compared to assess the performance of the tested methods. In the case of systematic shifts, both methods lead to successful alignment. However, for random shifts, the correlation optimized warping in comparison to the supervised alignment requires more time (few hours versus few minutes) and the quality of the alignment described as the correlation coefficient of the newly aligned matrix is worse (0.8593 versus 0.9629). For the experimental dataset the supervised alignment successfully aligns 208 samples using 10 previously identified peaks. Knowledge of the retention times of a few analytes in the data sets is necessary to perform the supervised alignment for both systematic and random shifts. The supervised alignment method is a faster, more effective and simpler preprocessing method than correlation optimized warping.
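
    Since the algorithm is described concretely (anchor on a few known peaks, then linearly interpolate retention times between them), a minimal sketch is possible; the function and variable names below are illustrative, not taken from the paper.

```python
import numpy as np

def supervised_align(sample_rt, anchor_rt_sample, anchor_rt_reference):
    """Map retention times of a sample chromatogram onto the reference time axis.

    anchor_rt_sample / anchor_rt_reference: retention times of the same few known
    peaks in the sample and in the reference chromatogram (both sorted ascending).
    Times between anchors are linearly interpolated; times outside the anchor range
    are clamped to the first/last reference anchor by np.interp (a simplification).
    """
    return np.interp(sample_rt, anchor_rt_sample, anchor_rt_reference)

# example: three anchor peaks, systematically shifted by ~0.2 min in the sample
sample_anchors = np.array([2.2, 5.3, 9.1])
reference_anchors = np.array([2.0, 5.1, 8.9])
aligned_rt = supervised_align(np.array([3.0, 6.0, 8.0]), sample_anchors, reference_anchors)
```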

  7. Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis.

    Science.gov (United States)

    Garrison, Kathleen A; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J; Aziz-Zadeh, Lisa S

    2015-01-01

    Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant's structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant's non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design.

  8. A non-linear preprocessing for opto-digital image encryption using multiple-parameter discrete fractional Fourier transform

    Science.gov (United States)

    Azoug, Seif Eddine; Bouguezel, Saad

    2016-01-01

    In this paper, a novel opto-digital image encryption technique is proposed by introducing a new non-linear preprocessing and using the multiple-parameter discrete fractional Fourier transform (MPDFrFT). The non-linear preprocessing is performed digitally on the input image in the spatial domain using a piecewise linear chaotic map (PLCM) coupled with the bitwise exclusive OR (XOR). The resulting image is multiplied by a random phase mask before applying the MPDFrFT to whiten the image. Then, a chaotic permutation is performed on the output of the MPDFrFT using another PLCM different from the one used in the spatial domain. Finally, another MPDFrFT is applied to obtain the encrypted image. The parameters of the PLCMs together with the multiple fractional orders of the MPDFrFTs constitute the secret key for the proposed cryptosystem. Computer simulation results and security analysis are presented to show the robustness of the proposed opto-digital image encryption technique and the great importance of the new non-linear preprocessing introduced to enhance the security of the cryptosystem and overcome the problem of linearity encountered in the existing permutation-based opto-digital image encryption schemes.
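
    As a loose illustration of the spatial-domain step (a piecewise linear chaotic map coupled with bitwise XOR), the sketch below assumes the common three-segment PLCM form and a simple byte-wise keystream; the paper's exact map, coupling and key handling are not reproduced.

```python
import numpy as np

def plcm_sequence(x0, p, n):
    """Generate n iterates of a piecewise linear chaotic map with parameter p in (0, 0.5)."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        if x < p:
            x = x / p
        elif x < 0.5:
            x = (x - p) / (0.5 - p)
        else:                                   # use the map's symmetry for x >= 0.5
            x = 1.0 - x
            x = x / p if x < p else (x - p) / (0.5 - p)
        seq[i] = x
    return seq

def xor_preprocess(image, x0=0.3711, p=0.2):
    """XOR every pixel with a chaotic keystream derived from the PLCM (keys are illustrative)."""
    flat = image.astype(np.uint8).ravel()
    keystream = (plcm_sequence(x0, p, flat.size) * 255).astype(np.uint8)
    return np.bitwise_xor(flat, keystream).reshape(image.shape)
```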

  9. Design of radial basis function neural network classifier realized with the aid of data preprocessing techniques: design and analysis

    Science.gov (United States)

    Oh, Sung-Kwun; Kim, Wook-Dong; Pedrycz, Witold

    2016-05-01

    In this paper, we introduce a new architecture of optimized Radial Basis Function neural network classifier developed with the aid of fuzzy clustering and data preprocessing techniques and discuss its comprehensive design methodology. In the preprocessing part, the Linear Discriminant Analysis (LDA) or Principal Component Analysis (PCA) algorithm forms a front end of the network. The transformed data produced here are used as the inputs of the network. In the premise part, the Fuzzy C-Means (FCM) algorithm determines the receptive field associated with the condition part of the rules. The connection weights of the classifier are of functional nature and come as polynomial functions forming the consequent part. The Particle Swarm Optimization algorithm optimizes a number of essential parameters needed to improve the accuracy of the classifier. Those optimized parameters include the type of data preprocessing, the dimensionality of the feature vectors produced by the LDA (or PCA), the number of clusters (rules), the fuzzification coefficient used in the FCM algorithm and the orders of the polynomials of networks. The performance of the proposed classifier is reported for several benchmarking data-sets and is compared with the performance of other classifiers reported in the previous studies.

  10. Signal Feature Extraction and Quantitative Evaluation of Metal Magnetic Memory Testing for Oil Well Casing Based on Data Preprocessing Technique

    Directory of Open Access Journals (Sweden)

    Zhilin Liu

    2014-01-01

    Full Text Available The metal magnetic memory (MMM) technique is an effective method for detecting stress concentration (SC) zones in oil well casing. It can provide an early diagnosis of microdamage for preventive protection. MMM is a natural space-domain signal which is weak and vulnerable to noise interference, so it is difficult to achieve effective feature extraction of the MMM signal, especially under the hostile subsurface environment of high temperature, high pressure, high humidity, and multiple interfering sources. In this paper, a median filter preprocessing method based on data preprocessing techniques is proposed to eliminate outlier points in the MMM signal. In addition, based on the wavelet transform (WT), an adaptive wavelet denoising method and data smoothing arithmetic are applied to the MMM testing system. By using the data preprocessing technique, the data are preserved and the noise in the signal is reduced; therefore, correct localization of the SC zone can be achieved. In the meantime, characteristic parameters in the new diagnostic approach are put forward to ensure reliable determination of the casing danger level through a least squares support vector machine (LS-SVM) and a nonlinear quantitative mapping relationship. The effectiveness and feasibility of this method are verified through experiments.
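
    The outlier-elimination step (median filtering of the raw MMM trace before wavelet denoising) can be sketched as follows; the window length and signal layout are assumptions.

```python
import numpy as np
from scipy.signal import medfilt

def remove_outliers(mmm_signal, window=5):
    """Median-filter the raw metal magnetic memory trace to suppress isolated outlier points."""
    return medfilt(np.asarray(mmm_signal, dtype=float), kernel_size=window)

# example: a slow trend with one spurious spike at index 50
x = np.linspace(0, 1, 101)
trace = 10 * x
trace[50] += 40                      # simulated outlier
cleaned = remove_outliers(trace)     # the spike is replaced by the local median
```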

  11. THE EFFECT OF DECOMPOSITION METHOD AS DATA PREPROCESSING ON NEURAL NETWORKS MODEL FOR FORECASTING TREND AND SEASONAL TIME SERIES

    Directory of Open Access Journals (Sweden)

    Subanar Subanar

    2006-01-01

    Full Text Available Recently, one of the central topics for the neural network (NN) community is the issue of data preprocessing for the use of NN. In this paper, we investigate this topic, particularly the effect of the Decomposition method as data preprocessing and the use of NN for effectively modeling time series with both trend and seasonal patterns. The limited empirical studies on seasonal time series forecasting with neural networks show that some find neural networks are able to model seasonality directly and prior deseasonalization is not necessary, while others conclude just the opposite. In this research, we study in particular the effectiveness of data preprocessing, including detrending and deseasonalization by applying the Decomposition method, on NN modeling and forecasting performance. We use two kinds of data, simulated and real. Simulated data are examined for multiplicative trend and seasonality patterns. The results are compared to those obtained from a classical time series model. Our results show that a combination of detrending and deseasonalization by applying the Decomposition method is an effective data preprocessing step for the use of NN in forecasting trend and seasonal time series.
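
    A hedged sketch of the combined detrending and deseasonalization preprocessing, using a standard classical decomposition from statsmodels rather than the authors' exact implementation; the period and multiplicative model are assumptions:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

def decompose_preprocess(series, period=12, model="multiplicative"):
    """Remove trend and seasonality before feeding the remainder to a neural network.

    Returns the remainder plus the fitted trend/seasonal parts, which are needed to
    reconstruct forecasts on the original scale afterwards.
    """
    result = seasonal_decompose(series, model=model, period=period)
    if model == "multiplicative":
        remainder = series / (result.trend * result.seasonal)
    else:
        remainder = series - result.trend - result.seasonal
    return remainder.dropna(), result.trend, result.seasonal

# example with simulated monthly data carrying both trend and seasonality
idx = pd.date_range("2000-01", periods=120, freq="MS")
y = pd.Series((np.arange(120) + 10) * (1 + 0.3 * np.sin(2 * np.pi * np.arange(120) / 12)), index=idx)
resid, trend, seasonal = decompose_preprocess(y)
```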

  12. Facilitating neuronal connectivity analysis of evoked responses by exposing local activity with principal component analysis preprocessing: simulation of evoked MEG.

    Science.gov (United States)

    Gao, Lin; Zhang, Tongsheng; Wang, Jue; Stephen, Julia

    2013-04-01

    When connectivity analysis is carried out for event-related EEG and MEG, the presence of strong spatial correlations from spontaneous background activity may mask the local neuronal evoked activity and lead to spurious connections. In this paper, we hypothesized that PCA decomposition could be used to diminish the background activity and thereby improve the performance of connectivity analysis in event-related experiments. The idea was tested using simulation, where we found that for the 306-channel Elekta Neuromag system, the first 4 PCs represent the dominant background activity, and the source connectivity pattern after preprocessing is consistent with the true connectivity pattern designed in the simulation. Improving the signal-to-noise ratio of the evoked responses by discarding the first few PCs yields increased coherences at the major physiological frequency bands. Furthermore, the evoked information was maintained after PCA preprocessing. In conclusion, it is demonstrated that the first few PCs represent background activity, and PCA decomposition can be employed to remove it to expose the evoked activity for the channels under investigation. Therefore, PCA can be applied as a preprocessing approach to improve neuronal connectivity analysis for event-related data.
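
    The preprocessing idea (discard the first few principal components, which carry the spatially correlated background, then reconstruct the sensor data) can be sketched as below; the channels-by-time layout and the choice of four removed components are assumptions mirroring the simulation described.

```python
import numpy as np

def remove_leading_pcs(data, n_remove=4):
    """Project channels x time MEG/EEG data onto its principal components and
    reconstruct it with the first n_remove components (background) zeroed out."""
    mean = data.mean(axis=1, keepdims=True)
    centered = data - mean
    # PCA over channels via SVD of the channels x time matrix
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    s_clean = s.copy()
    s_clean[:n_remove] = 0.0            # drop the largest (background) components
    return (u * s_clean) @ vt + mean

# e.g. cleaned = remove_leading_pcs(evoked_channels_by_time, n_remove=4)
```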

  13. Finding differentially expressed genes in two-channel DNA microarray datasets: how to increase reliability of data preprocessing.

    Science.gov (United States)

    Rotter, Ana; Hren, Matjaz; Baebler, Spela; Blejec, Andrej; Gruden, Kristina

    2008-09-01

    Due to the great variety of preprocessing tools in two-channel expression microarray data analysis it is difficult to choose the most appropriate one for a given experimental setup. In our study, two independent two-channel in-house microarray experiments as well as a publicly available dataset were used to investigate the influence of the selection of preprocessing methods (background correction, normalization, and duplicate spots correlation calculation) on the discovery of differentially expressed genes. Here we show that both the list of differentially expressed genes and the expression values of selected genes depend significantly on the preprocessing approach applied. The choice of normalization method had the highest impact on the results. We propose a simple but efficient approach to increase the reliability of obtained results, where two normalization methods which are theoretically distinct from one another are used on the same dataset. Then the intersection of results, that is, the lists of differentially expressed genes, is used in order to get a more accurate estimation of the genes that were de facto differentially expressed.
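
    Once each normalization pipeline has produced its list of differentially expressed genes, the proposed reliability heuristic reduces to a set intersection; the gene names and list variables below are purely illustrative.

```python
def robust_de_genes(genes_norm_a, genes_norm_b):
    """Keep only genes declared differentially expressed under both normalization methods."""
    return sorted(set(genes_norm_a) & set(genes_norm_b))

# e.g. after running two theoretically distinct normalizations separately:
de_lowess = ["PR1", "PAL1", "WRKY33", "PIN2"]
de_quantile = ["PR1", "WRKY33", "ERF1"]
consensus = robust_de_genes(de_lowess, de_quantile)   # ['PR1', 'WRKY33']
```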

  14. Individual savings accounts for social insurance

    DEFF Research Database (Denmark)

    Bovenberg, Lans; Hansen, Martin Ino; Sørensen, Peter Birch

    2008-01-01

    Using Danish data, we find that about three-fourths of the taxes levied to finance public transfers actually finance benefits that redistribute income over the life cycle of individual taxpayers rather than redistribute resources across people. This finding and similar results for other countries...... provide a rationale for financing part of social insurance via mandatory individual savings accounts. We discuss the advantages and disadvantages of mandatory individual savings accounts for social insurance and survey some recent alternative proposals for such accounts...

  15. Financial Literacy and Retirement Savings in Germany

    OpenAIRE

    2012-01-01

    The German pension reforms in 2001 and 2004 increased the importance of private supplemental savings for retirement. Calculating the appropriate retirement income needed and choosing the right product postulates some degree of financial knowledge, also referred to as financial literacy. This paper investigates the relationship between financial literacy and private retirement savings. Germans seem to have a good grasp of basic financial concepts. However, individuals with low education face s...

  16. Potential energy savings and thermal comfort

    DEFF Research Database (Denmark)

    Jensen, Karsten Ingerslev; Rudbeck, Claus Christian; Schultz, Jørgen Munthe

    1996-01-01

    The simulation results on the energy saving potential and influence on indoor thermal comfort by replacement of common windows with aerogel windows as well as commercial low-energy windows are described and analysed.

  17. How America Saved Italy and the World

    Science.gov (United States)

    2015-05-21

    normalcy, specifically the International Monetary Fund and the International Bank for Reconstruction and Development. This meant that America would... A monograph by MAJ Kwame O. Boateng, United States Army School of Advanced Military Studies, July 2014 – May 2015.

  18. Global imbalances, saving glut and investment strike.

    OpenAIRE

    Moëc, G.; Frey, L

    2006-01-01

    The present state of the global economy is characterised by persistent and increasingly polarised current account imbalances, in a context of historically low long-term interest rates, which stand below the equilibrium levels proxied by potential growth and trend inflation. A comprehensive analysis by Ben Bernanke attributes those two phenomena to one common cause: a global saving glut outside the United States. The approach below is more pessimistic than the global saving glut theory as far...

  19. Free Modal Algebras Revisited: The Step-by-Step Method

    NARCIS (Netherlands)

    Bezhanishvili, N.; Ghilardi, Silvio; Jibladze, Mamuka

    2012-01-01

    We review the step-by-step method of constructing finitely generated free modal algebras. First we discuss the global step-by-step method, which works well for rank one modal logics. Next we refine the global step-by-step method to obtain the local step-by-step method, which is applicable beyond rank one.

  20. Creation of Carbon Credits by Water Saving

    Directory of Open Access Journals (Sweden)

    Yasutoshi Shimizu

    2012-07-01

    Full Text Available Until now, as a way of reducing greenhouse gas emissions from Japanese homes, the emphasis has been on reduction of energy consumption for air-conditioning and lighting. In recent years, there has been progress in CO2 emission reduction through research into the water-saving performance of bathroom fixtures such as toilets and showers. Simulations have shown that CO2 emissions associated with water consumption in Japanese homes can be reduced by 25% (1% of Japan’s total CO2 emissions by 2020 through the adoption of the use of water-saving fixtures. In response to this finding, a program to promote the replacement of current fixtures with water-saving toilet bowls and thermally insulated bathtubs has been added to the Government of Japan’s energy-saving policy. Furthermore, CO2 emission reduction through widespread use of water-saving fixtures has been adopted by the domestic credit system promoted by the Government of Japan as a way of achieving CO2 emission-reduction targets; application of this credit system has also begun. As part of a bilateral offset credit mechanism promoted by the Government of Japan, research to evaluate the CO2 reduction potential of the adoption of water-saving fixtures has been done in the city of Dalian, in China.

  1. RECRUITMENT FINANCED BY SAVED LEAVE (RSL PROGRAMME)

    CERN Multimedia

    Division du Personnel; Tel. 73903

    1999-01-01

    Transfer to the saved leave account and saved leave bonus: Staff members participating in the RSL programme may opt to transfer up to 10 days of unused annual leave or unused compensatory leave into their saved leave account, at the end of the leave year, i.e. 30 September (as set out in the implementation procedure dated 27 August 1997). A leave transfer request form, which you should complete, sign and return if you wish to use this possibility, has been addressed to you. To allow the necessary time for the processing of your request, you should return it without delay. As foreseen in the implementation procedure, an additional day of saved leave will be granted for each full period of 20 days remaining in the saved leave account on 31 December 1999, for any staff member participating in the RSL programme until that date. For part-time staff members participating in the RSL programme, the above-mentioned days of leave (annual, compensatory and saved) are adjusted proportionally to their contractual working week as...

  2. Lowering medical costs through the sharing of savings by physicians and patients: inclusive shared savings.

    Science.gov (United States)

    Schmidt, Harald; Emanuel, Ezekiel J

    2014-12-01

    Current approaches to controlling health care costs have strengths and weaknesses. We propose an alternative, "inclusive shared savings," that aims to lower medical costs through savings that are shared by physicians and patients. Inclusive shared savings may be particularly attractive in situations in which treatments, such as those for gastric cancer, are similar in clinical effectiveness and have modest differences in convenience but substantially differ in cost. Inclusive shared savings incorporates features of typical insurance coverage, shared savings, and value-based insurance design but differs from value-based insurance design, which merely seeks to decrease or eliminate out-of-pocket costs. Inclusive shared savings offers financial incentives to physicians and patients to promote the use of lower-cost, but equally effective, interventions and should be evaluated in a rigorous trial or demonstration project.

  3. Diabetes PSA (:60) Step By Step

    Centers for Disease Control (CDC) Podcasts

    2009-10-24

    First steps to preventing diabetes. For Hispanic and Latino American audiences.  Created: 10/24/2009 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 10/24/2009.

  4. Diabetes PSA (:30) Step By Step

    Centers for Disease Control (CDC) Podcasts

    2009-10-24

    First steps to preventing diabetes. For Hispanic and Latino American audiences.  Created: 10/24/2009 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 10/24/2009.

  5. 48 CFR 1843.7101 - Shared Savings Program.

    Science.gov (United States)

    2010-10-01

    ... ADMINISTRATION CONTRACT MANAGEMENT CONTRACT MODIFICATIONS Shared Savings 1843.7101 Shared Savings Program. This..., significant cost reduction initiatives. NASA will benefit as the more efficient business practices that...

  6. Small Town Energy Program (STEP) Final Report revised

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Charles (Chuck) T.

    2014-01-02

    University Park, Maryland (“UP”) is a small town of 2,540 residents, 919 homes, 2 churches, 1 school, 1 town hall, and 1 breakthrough community energy efficiency initiative: the Small Town Energy Program (“STEP”). STEP was developed with a mission to “create a model community energy transformation program that serves as a roadmap for other small towns across the U.S.” STEP first launched in January 2011 in UP and expanded in July 2012 to the neighboring communities of Hyattsville, Riverdale Park, and College Heights Estates, MD. STEP, which concluded in July 2013, was generously supported by a grant from the U.S. Department of Energy (DOE). The STEP model was designed for replication in other resource-constrained small towns similar to University Park - a sector largely neglected to date in federal and state energy efficiency programs. STEP provided a full suite of activities for replication, including: energy audits and retrofits for residential buildings, financial incentives, a community-based social marketing backbone and local community delivery partners. STEP also included the highly innovative use of an “Energy Coach” who worked one-on-one with clients throughout the program. Please see www.smalltownenergy.org for more information. In less than three years, STEP achieved the following results in University Park: • 30% of community households participated voluntarily in STEP; • 25% of homes received a Home Performance with ENERGY STAR assessment; • 16% of households made energy efficiency improvements to their home; • 64% of households proceeded with an upgrade after their assessment; • 9 Full Time Equivalent jobs were created or retained, and 39 contractors worked on STEP over the course of the project. Estimated Energy Savings - Program Totals: Electricity 204,407 kWh; Natural Gas 24,800 therms; Oil 2,581 gallons; Total Estimated MMBTU Saved (Source Energy) 5,474; Total Estimated Annual Energy Cost Savings $61,343. STEP clients who

  7. Miniature, Low Power Gas Chromatograph with Sample Pre-Processing Capability and Enhanced G-Force Survivability for Planetary Missions Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Thorleaf Research, Inc. proposes to develop a miniaturized, low power gas chromatograph (GC) with sample pre-processing capability and enhanced capability for...

  8. Microsoft Office Word 2007 step by step

    CERN Document Server

    Cox, Joyce

    2007-01-01

    Experience learning made easy-and quickly teach yourself how to create impressive documents with Word 2007. With Step By Step, you set the pace-building and practicing the skills you need, just when you need them! Apply styles and themes to your document for a polished look; add graphics and text effects-and see a live preview; organize information with new SmartArt diagrams and charts; insert references, footnotes, indexes, a table of contents; send documents for review and manage revisions; turn your ideas into blogs, Web pages, and more. Your all-in-one learning experience includes: Files for building sk

  9. PreP+07: improvements of a user friendly tool to preprocess and analyse microarray data

    Directory of Open Access Journals (Sweden)

    Claros M Gonzalo

    2009-01-01

    Full Text Available Abstract Background Nowadays, microarray gene expression analysis is a widely used technology that scientists handle but whose final interpretation usually requires the participation of a specialist. The need for this participation is due to the requirement of some background in statistics that most users lack or have a very vague notion of. Moreover, programming skills could also be essential to analyse these data. An interactive, easy to use application seems therefore necessary to help researchers to extract full information from data and analyse them in a simple, powerful and confident way. Results PreP+07 is a standalone Windows XP application that presents a friendly interface for spot filtration, inter- and intra-slide normalization, duplicate resolution, dye-swapping, error removal and statistical analyses. Additionally, it contains two unique implementations of procedures – double scan and Supervised Lowess –, a complete set of graphical representations – MA plot, RG plot, QQ plot, PP plot, PN plot – and can deal with many data formats, such as tabulated text, GenePix GPR and ArrayPRO. PreP+07 performance has been compared with the equivalent functions in Bioconductor using a tomato chip with 13056 spots. The numbers of differentially expressed genes, considering p-values coming from the PreP+07 and Bioconductor Limma packages, were statistically identical when the data set was only normalized; however, a slight variability was observed when the data were both normalized and scaled. Conclusion PreP+07 implementation provides a high degree of freedom in selecting and organizing a small set of widely used data processing protocols, and can handle many data formats. Its reliability has been proven so that a laboratory researcher can afford a statistical pre-processing of his/her microarray results and obtain a list of differentially expressed genes using PreP+07 without any programming skills. All of this gives support to scientists
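
    PreP+07 itself is a Windows application, but the intensity-dependent (lowess) normalization it implements variants of can be sketched in a few lines; the smoothing fraction and array names below are assumptions, not the tool's defaults.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def lowess_normalize(red, green, frac=0.3):
    """Global intensity-dependent normalization of one two-channel slide.

    M = log2(R/G) is corrected by subtracting a lowess fit of M against
    A = 0.5 * log2(R * G), removing dye/intensity bias.
    """
    m = np.log2(red / green)
    a = 0.5 * np.log2(red * green)
    fitted = lowess(m, a, frac=frac, return_sorted=False)  # fitted M for each spot
    return m - fitted, a

# red, green: background-corrected, strictly positive spot intensities for one slide
red = np.random.uniform(200, 5000, size=1000)
green = np.random.uniform(200, 5000, size=1000)
m_normalized, a = lowess_normalize(red, green)
```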

  10. Energy saving opportunities in Jordanian pharmaceutical industries

    Directory of Open Access Journals (Sweden)

    Areen Al-Ali

    2012-05-01

    Full Text Available An investigation of energy consumption saving opportunities in the Jordanian pharmaceutical industry has been carried out in this paper; the current status of energy consumption, possible saving techniques, and recommendations that could be implemented successfully through an Energy Saving Program (ESP) are explored. All variables that influence energy consumption are considered accordingly using a suitable methodology. This methodology integrates simulation programs into the analysis process; thereafter an accurate analysis and a reliable assessment of energy consumption are provided. HAP 4.41 software has been used to simulate and to calculate energy consumption and costs. One of the biggest Jordanian pharmaceutical facilities, which produces Penicillin and Cephalosporin products, is considered as an illustrative example; the facility consists of four main energy consumption centers and includes consuming systems, namely HVAC (Heating, Ventilating and Air Conditioning), lighting, compressed air, steam boilers, and miscellaneous equipment like computers, refrigerators, fume hoods, etc. Historical energy consumption quantities are estimated, details of energy data are tabulated, and accordingly input data files for the computer simulation using HAP 4.41 are created. Energy saving recommendations have been decided and incorporated into the ESP system. The paper concludes with a substantial set of energy saving opportunities that reflect positively on the national energy bill, with a reduction of 13% of the total annual energy consumption.

  11. THE PUZZLE OF SIMULTANEOUS SAVINGS AND DEBTS

    Directory of Open Access Journals (Sweden)

    RODICA IANOLE

    2012-05-01

    Full Text Available „Neither a borrower nor a lender be” recommends Shakespeare in Hamlet. The advice seems particularly interesting in nowadays society, where a person can easily be found in both approximate situations at the same time. It goes without saying that saving and borrowing do not describe mutually exclusive strategies of financial management and thus many people retain savings or carry on saving at the same time as having debts. We add to this fact a more pragmatic wisdom, that of the economist Robert Solow – “We (economists) think of wealth as fungible; we think a dollar is a dollar. Why don't they (the others) do so?” (Solow, 1987) – and we naturally ask ourselves if the mechanism of having simultaneous savings and debts is a rational one, according to traditional economics. Making appeal to the emerging body of behavioral economics literature we reach the mental accounting theory to see if it can explain savings inclination versus debt inclination. The main research question we want to explore is the following: if mental accounting prevents people from spending money from one „mental account” on goods belonging to another one, will people – after using all their money from a given account – be willing to go into debt to buy goods belonging to this account in a situation when they still have money in other accounts?

  12. Energy Savings from Industrial Water Reductions

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Prakash; McKane, Aimee; de Fontaine, Andre

    2015-08-03

    Although it is widely recognized that reducing freshwater consumption is of critical importance, generating interest in industrial water reduction programs can be hindered for a variety of reasons. These include the low cost of water, greater focus on water use in other sectors such as the agriculture and residential sectors, high levels of unbilled and/or unregulated self-supplied water use in industry, and lack of water metering and tracking capabilities at industrial facilities. However, there are many additional components to the resource savings associated with reducing site water use beyond the water savings alone, such as reductions in energy consumption, greenhouse gas emissions, treatment chemicals, and impact on the local watershed. Understanding and quantifying these additional resource savings can expand the community of businesses, NGOs, government agencies, and researchers with a vested interest in water reduction. This paper will develop a methodology for evaluating the embedded energy consumption associated with water use at an industrial facility. The methodology developed will use available data and references to evaluate the energy consumption associated with water supply and wastewater treatment outside of a facility’s fence line for various water sources. It will also include a framework for evaluating the energy consumption associated with water use within a facility’s fence line. The methodology will develop a more complete picture of the total resource savings associated with water reduction efforts and allow industrial water reduction programs to assess the energy and CO2 savings associated with their efforts.

  13. Short-Term Saved Leave Scheme

    CERN Multimedia

    HR Department

    2007-01-01

    As announced at the meeting of the Standing Concertation Committee (SCC) on 26 June 2007 and in Bulletin No. 28/2007, the existing Saved Leave Scheme will be discontinued as of 31 December 2007. Staff participating in the Scheme will shortly receive a contract amendment stipulating the end of financial contributions compensated by saved leave. Leave already accumulated on saved leave accounts can continue to be taken in accordance with the rules applicable to the current scheme. A new system of saved leave will enter into force on 1 January 2008 and will be the subject of a new implementation procedure entitled "Short-term saved leave scheme" dated 1 January 2008. At its meeting on 4 December 2007, the SCC agreed to recommend the Director-General to approve this procedure, which can be consulted on the HR Department’s website at the following address: https://cern.ch/hr-services/services-Ben/sls_shortterm.asp All staff wishing to participate in the new scheme ...

  14. Short-Term Saved Leave Scheme

    CERN Multimedia

    2007-01-01

    As announced at the meeting of the Standing Concertation Committee (SCC) on 26 June 2007 and in Bulletin No. 28/2007, the existing Saved Leave Scheme will be discontinued as of 31 December 2007. Staff participating in the Scheme will shortly receive a contract amendment stipulating the end of financial contributions compensated by saved leave. Leave already accumulated on saved leave accounts can continue to be taken in accordance with the rules applicable to the current scheme. A new system of saved leave will enter into force on 1 January 2008 and will be the subject of a new implementation procedure entitled "Short-term saved leave scheme" dated 1 January 2008. At its meeting on 4 December 2007, the SCC agreed to recommend the Director-General to approve this procedure, which can be consulted on the HR Department’s website at the following address: https://cern.ch/hr-services/services-Ben/sls_shortterm.asp All staff wishing to participate in the new scheme a...

  15. A New Unsupervised Pre-processing Algorithm Based on Artificial Immune System for ERP Assessment in a P300-based GKT

    Directory of Open Access Journals (Sweden)

    S. Shojaeilangari

    2012-09-01

    Full Text Available In recent years, an increasing number of studies have focused on bio-inspired algorithms to solve elaborate engineering problems. The Artificial Immune System (AIS) is an artificial intelligence technique which has the potential to solve problems in various fields. The immune system, due to its self-regulating nature, has been an inspiration source for unsupervised learning methods for pattern recognition tasks. The purpose of this study is to apply the AIS to pre-process the lie-detection dataset to improve the recognition of guilty and innocent subjects. A new Unsupervised AIS (UAIS) was proposed in this study as a pre-processing method before classification. Then, we applied three different classifiers on the pre-processed data for Event Related Potential (ERP) assessment in a P300-based Guilty Knowledge Test (GKT). Experiment results showed that UAIS is a successful pre-processing method which is able to improve the classification rate. In our experiments, we observed that the classification accuracies for three different classifiers, K-Nearest Neighbourhood (KNN), Support Vector Machine (SVM) and Linear Discriminant Analysis (LDA), were increased after applying UAIS pre-processing. Use of a scattering criterion to assess the features before and after pre-processing proved that our proposed method was able to perform data mapping from a primary feature space to a new area where the data separability was improved significantly.

  16. Feeding your piggy bank with intentions: A study on saving behaviour, saving strategies, and happiness

    NARCIS (Netherlands)

    De Francisco Vela, S.; Desmet, P.M.A.; Casais, M.

    2014-01-01

    The act of saving money can connect one’s present state to a meaningful future state, especially if we consider money not as a direct source of happiness, but as a resource for engaging in meaningful activities. To explore how design can contribute to making the act of saving more meaningful, we

  17. Saving for Success: Financial Education and Savings Goal Achievement in Individual Development Accounts

    Science.gov (United States)

    Grinstead, Mary L.; Mauldin, Teresa; Sabia, Joseph J.; Koonce, Joan; Palmer, Lance

    2011-01-01

    Using microdata from the American Dream Demonstration, the current study examines factors associated with savings and savings goal achievement (indicated by a matched withdrawal) among participants of individual development account (IDA) programs. Multinomial logit results show that hours of participation in financial education programs, higher…

  18. Feeding your piggy bank with intentions: A study on saving behaviour, saving strategies, and happiness

    NARCIS (Netherlands)

    De Francisco Vela, S.; Desmet, P.M.A.; Casais, M.

    2014-01-01

    The act of saving money can connect one’s present state to a meaningful future state, especially if we consider money not as a direct source of happiness, but as a resource for engaging in meaningful activities. To explore how design can contribute to making the act of saving more meaningful, we con

  19. 12 CFR 583.20 - Savings and loan holding company.

    Science.gov (United States)

    2010-01-01

    12 CFR, Banks and Banking — REGULATIONS AFFECTING SAVINGS AND LOAN HOLDING COMPANIES, § 583.20 Savings and loan holding company. The term savings and loan holding company means any company that directly or indirectly controls a...

  20. Do Preschoolers Save to Benefit Their Future Selves?

    Science.gov (United States)

    Metcalf, Jennifer L.; Atance, Cristina M.

    2011-01-01

    Using a new paradigm for measuring children's saving behaviors involving two marble games differing in desirability, we assessed whether 3-, 4-, and 5-year-olds saved marbles for future use, saved increasingly on a second trial, saved increasingly with age, and were sensitive to the relative value of future rewards. We also assessed whether…