WorldWideScience

Sample records for preprocessing step saving

  1. The Influence of Preprocessing Steps on Graph Theory Measures Derived from Resting State fMRI.

    Science.gov (United States)

    Gargouri, Fatma; Kallel, Fathi; Delphine, Sebastien; Ben Hamida, Ahmed; Lehéricy, Stéphane; Valabregue, Romain

    2018-01-01

    Resting state functional MRI (rs-fMRI) is an imaging technique that allows the spontaneous activity of the brain to be measured. Measures of functional connectivity highly depend on the quality of the BOLD signal data processing. In this study, our aim was to study the influence of preprocessing steps and their order of application on small-world topology and their efficiency in resting state fMRI data analysis using graph theory. We applied the most standard preprocessing steps: slice-timing, realign, smoothing, filtering, and the tCompCor method. In particular, we were interested in how preprocessing can retain the small-world economic properties and how to maximize the local and global efficiency of a network while minimizing the cost. Tests that we conducted in 54 healthy subjects showed that the choice and ordering of preprocessing steps impacted the graph measures. We found that the csr (where we applied realignment, smoothing, and tCompCor as a final step) and the scr (where we applied realignment, tCompCor and smoothing as a final step) strategies had the highest mean values of global efficiency (eg). Furthermore, we found that the fscr strategy (where we applied realignment, tCompCor, smoothing, and filtering as a final step) had the highest mean local efficiency (el) values. These results confirm that the graph theory measures of functional connectivity depend on the ordering of the processing steps, with the best results being obtained using smoothing and tCompCor as the final steps for global efficiency with additional filtering for local efficiency. PMID:29497372
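
    To make the abstract's quantities concrete, here is a minimal sketch of how global efficiency (eg), local efficiency (el), and cost can be computed from a functional-connectivity matrix with the networkx library. The correlation threshold, ROI count, and toy data are illustrative assumptions, not the authors' pipeline.

    ```python
    # Sketch: efficiency measures of a graph built from a thresholded
    # functional-connectivity matrix. Data and threshold are illustrative.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(0)
    ts = rng.standard_normal((200, 90))     # 200 volumes x 90 ROIs (toy data)
    corr = np.corrcoef(ts.T)                # ROI-by-ROI correlation matrix
    np.fill_diagonal(corr, 0.0)

    adj = (np.abs(corr) > 0.3).astype(int)  # binarize at an assumed threshold
    G = nx.from_numpy_array(adj)

    eg = nx.global_efficiency(G)            # "eg" in the abstract
    el = nx.local_efficiency(G)             # "el" in the abstract
    cost = G.number_of_edges() / (len(G) * (len(G) - 1) / 2)
    print(f"eg={eg:.3f}  el={el:.3f}  cost={cost:.3f}")
    ```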

  2. The Influence of Preprocessing Steps on Graph Theory Measures Derived from Resting State fMRI

    Directory of Open Access Journals (Sweden)

    Fatma Gargouri

    2018-02-01

    Full Text Available Resting state functional MRI (rs-fMRI) is an imaging technique that allows the spontaneous activity of the brain to be measured. Measures of functional connectivity highly depend on the quality of the BOLD signal data processing. In this study, our aim was to study the influence of preprocessing steps and their order of application on small-world topology and their efficiency in resting state fMRI data analysis using graph theory. We applied the most standard preprocessing steps: slice-timing, realign, smoothing, filtering, and the tCompCor method. In particular, we were interested in how preprocessing can retain the small-world economic properties and how to maximize the local and global efficiency of a network while minimizing the cost. Tests that we conducted in 54 healthy subjects showed that the choice and ordering of preprocessing steps impacted the graph measures. We found that the csr (where we applied realignment, smoothing, and tCompCor as a final step) and the scr (where we applied realignment, tCompCor and smoothing as a final step) strategies had the highest mean values of global efficiency (eg). Furthermore, we found that the fscr strategy (where we applied realignment, tCompCor, smoothing, and filtering as a final step) had the highest mean local efficiency (el) values. These results confirm that the graph theory measures of functional connectivity depend on the ordering of the processing steps, with the best results being obtained using smoothing and tCompCor as the final steps for global efficiency with additional filtering for local efficiency.

  4. Discrete pre-processing step effects in registration-based pipelines, a preliminary volumetric study on T1-weighted images.

    Science.gov (United States)

    Muncy, Nathan M; Hedges-Muncy, Ariana M; Kirwan, C Brock

    2017-01-01

    Pre-processing MRI scans prior to performing volumetric analyses is common practice in MRI studies. As pre-processing steps adjust the voxel intensities, the space in which the scan exists, and the amount of data in the scan, it is possible that the steps have an effect on the volumetric output. To date, studies have compared between and not within pipelines, and so the impact of each step is unknown. This study aims to quantify the effects of pre-processing steps on volumetric measures in T1-weighted scans within a single pipeline. It was our hypothesis that pre-processing steps would significantly impact ROI volume estimations. One hundred fifteen participants from the OASIS dataset were used, where each participant contributed three scans. All scans were then pre-processed using a step-wise pipeline. Bilateral hippocampus, putamen, and middle temporal gyrus volume estimations were assessed following each successive step, and all data were processed by the same pipeline 5 times. Repeated-measures analyses tested for main effects of pipeline step, scan-rescan (for MRI scanner consistency) and repeated pipeline runs (for algorithmic consistency). A main effect of pipeline step was detected and, interestingly, an interaction between pipeline step and ROI was present. No effect for either scan-rescan or repeated pipeline run was detected. We then supply a correction for noise in the data resulting from pre-processing.
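
    A minimal sketch of the repeated-measures test described above, using statsmodels' AnovaRM to ask whether ROI volume changes across pipeline steps within subjects. The step names, column names, and toy volumes are assumptions for illustration, not the study's actual data.

    ```python
    # Sketch: within-subject test for a main effect of pipeline step on
    # ROI volume. Step names, columns and volumes are toy assumptions.
    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(1)
    steps = ["raw", "intensity_norm", "skull_strip", "register"]
    rows = [(subj, step, 4000 + 50 * i + rng.normal(0, 40))
            for subj in range(30) for i, step in enumerate(steps)]
    df = pd.DataFrame(rows, columns=["subject", "step", "hippo_vol"])

    # Repeated-measures ANOVA: one volume per subject per pipeline step.
    print(AnovaRM(df, depvar="hippo_vol", subject="subject",
                  within=["step"]).fit())
    ```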

  5. Conversation on data mining strategies in LC-MS untargeted metabolomics: pre-processing and pre-treatment steps

    CSIR Research Space (South Africa)

    Tugizimana, F

    2016-11-01

    Full Text Available Using a liquid chromatography-mass spectrometry (LC-MS)-based untargeted metabolomic dataset, this study explored the influence of collection parameters in the data pre-processing step, scaling and data transformation on the statistical models generated, and feature selection, thereafter. Data obtained in positive mode...

  6. A Conversation on Data Mining Strategies in LC-MS Untargeted Metabolomics: Pre-Processing and Pre-Treatment Steps

    Directory of Open Access Journals (Sweden)

    Fidele Tugizimana

    2016-11-01

    Full Text Available Untargeted metabolomic studies generate information-rich, high-dimensional, and complex datasets that remain challenging to handle and fully exploit. Despite the remarkable progress in the development of tools and algorithms, the “exhaustive” extraction of information from these metabolomic datasets is still a non-trivial undertaking. A conversation on data mining strategies for a maximal information extraction from metabolomic data is needed. Using a liquid chromatography-mass spectrometry (LC-MS)-based untargeted metabolomic dataset, this study explored the influence of collection parameters in the data pre-processing step, scaling and data transformation on the statistical models generated, and feature selection, thereafter. Data obtained in positive mode generated from an LC-MS-based untargeted metabolomic study (sorghum plants responding dynamically to infection by a fungal pathogen) were used. Raw data were pre-processed with MarkerLynx™ software (Waters Corporation, Manchester, UK). Here, two parameters were varied: the intensity threshold (50–100 counts) and the mass tolerance (0.005–0.01 Da). After the pre-processing, the datasets were imported into SIMCA (Umetrics, Umeå, Sweden) for more data cleaning and statistical modeling. In addition, different scaling (unit variance, Pareto, etc.) and data transformation (log and power) methods were explored. The results showed that the pre-processing parameters (or algorithms) influence the output dataset with regard to the number of defined features. Furthermore, the study demonstrates that the pre-treatment of data prior to statistical modeling affects the subspace approximation outcome: e.g., the amount of variation in X-data that the model can explain and predict. The pre-processing and pre-treatment steps subsequently influence the number of statistically significant extracted/selected features (variables). Thus, as informed by the results, to maximize the value of untargeted metabolomic data
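
    The pre-treatment operations named above are simple element-wise transforms; the sketch below shows unit-variance scaling, Pareto scaling, and a log transformation on a toy feature matrix. The offset guarding log(0) is an assumption.

    ```python
    # Sketch of common metabolomics pre-treatment steps (toy data).
    import numpy as np

    X = np.random.default_rng(2).lognormal(size=(20, 500))  # samples x features

    def uv_scale(X):
        # Unit-variance (autoscaling): mean-center, divide by std dev.
        return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

    def pareto_scale(X):
        # Pareto: divide by sqrt(std dev), damping intense features less.
        return (X - X.mean(axis=0)) / np.sqrt(X.std(axis=0, ddof=1))

    def log_transform(X, offset=1.0):
        # offset guards against log(0); its value is an assumption here.
        return np.log10(X + offset)

    X_uv, X_par, X_log = uv_scale(X), pareto_scale(X), log_transform(X)
    ```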

  7. A First Step in Learning Analytics: Pre-Processing Low-Level Alice Logging Data of Middle School Students

    Science.gov (United States)

    Werner, Linda; McDowell, Charlie; Denner, Jill

    2013-01-01

    Educational data mining can miss or misidentify key findings about student learning without a transparent process of analyzing the data. This paper describes the first steps in the process of using low-level logging data to understand how middle school students used Alice, an initial programming environment. We describe the steps that were…

  8. Spectral Difference in the Image Domain for Large Neighborhoods, a GEOBIA Pre-Processing Step for High Resolution Imagery

    Directory of Open Access Journals (Sweden)

    Roeland de Kok

    2012-08-01

    Full Text Available Contrast plays an important role in the visual interpretation of imagery. To mimic visual interpretation and use contrast in a Geographic Object Based Image Analysis (GEOBIA) environment, it is useful to consider an analysis for single pixel objects. This should be done before applying homogeneity criteria in the aggregation of pixels for the construction of meaningful image objects. The habit or “best practice” to start GEOBIA with pixel aggregation into homogeneous objects should come with the awareness that feature attributes for single pixels are at risk of becoming less accessible for further analysis. Single pixel contrast with image convolution on close neighborhoods is a standard technique, also applied in edge detection. This study elaborates on the analysis of close as well as much larger neighborhoods inside the GEOBIA domain. The applied calculations are limited to the first segmentation step for single pixel objects in order to produce additional feature attributes for objects of interest to be generated in further aggregation processes. The equation presented functions at a level that is considered an intermediary product in the sequential processing of imagery. The procedure requires intensive processor and memory capacity. The resulting feature attributes highlight not only contrasting pixels (edges) but also contrasting areas of local pixel groups. The suggested approach can be extended and becomes useful in classifying artificial areas at national scales using high resolution satellite mosaics.

  9. Development and integration of block operations for data invariant automation of digital preprocessing and analysis of biological and biomedical Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Turner, Robin F B

    2015-06-01

    High-throughput information extraction from large numbers of Raman spectra is becoming an increasingly taxing problem due to the proliferation of new applications enabled using advances in instrumentation. Fortunately, in many of these applications, the entire process can be automated, yielding reproducibly good results with significant time and cost savings. Information extraction consists of two stages, preprocessing and analysis. We focus here on the preprocessing stage, which typically involves several steps, such as calibration, background subtraction, baseline flattening, artifact removal, smoothing, and so on, before the resulting spectra can be further analyzed. Because the results of some of these steps can affect the performance of subsequent ones, attention must be given to the sequencing of steps, the compatibility of these sequences, and the propensity of each step to generate spectral distortions. We outline here important considerations to effect full automation of Raman spectral preprocessing: what is considered full automation; putative general principles to effect full automation; the proper sequencing of processing and analysis steps; conflicts and circularities arising from sequencing; and the need for, and approaches to, preprocessing quality control. These considerations are discussed and illustrated with biological and biomedical examples reflecting both successful and faulty preprocessing.

  10. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a pre-processing stage for almost any type of problem statement. Normalization plays an especially important role in fields such as soft computing and cloud computing, where data are manipulated, e.g., scaled down or up in range, before being used at a further stage. There are many normalization techniques, namely Min-Max normalization, Z-score normalization and Decimal scaling normalization. By referring to these normalization techniques we are ...
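
    For concreteness, the three techniques named in the abstract can be sketched in a few lines of Python (toy data):

    ```python
    # Min-Max, Z-score and Decimal scaling normalization (toy data).
    import numpy as np

    x = np.array([120.0, 45.0, 300.0, 87.0, 910.0])

    x_minmax = (x - x.min()) / (x.max() - x.min())    # rescales to [0, 1]
    x_zscore = (x - x.mean()) / x.std(ddof=1)         # mean 0, std dev 1
    j = int(np.floor(np.log10(np.abs(x).max()))) + 1  # smallest j: |x|/10^j < 1
    x_decimal = x / 10 ** j                           # decimal scaling
    ```

    Decimal scaling simply moves the decimal point by the smallest power of ten that brings every magnitude below 1; here j = 3, so 910 becomes 0.91.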

  11. Practical Secure Computation with Pre-Processing

    DEFF Research Database (Denmark)

    Zakarias, Rasmus Winther

    Secure Multiparty Computation has been divided between protocols best suited for binary circuits and protocols best suited for arithmetic circuits. With their MiniMac protocol in [DZ13], Damgård and Zakarias take an important step towards bridging these worlds with an arithmetic protocol tuned...... space for pre-processing material than computing the non-linear parts online (depending on the quality of the circuit, of course). Surprisingly, even for our optimized AES-circuit this is not the case. We further improve the design of the pre-processing material and end up with only 10 megabytes of pre...... a protocol for small field arithmetic to do fast large integer multiplications. This is achieved by devising pre-processing material that allows the Toom-Cook multiplication algorithm to run between the parties with linear communication complexity. With this result computation on the CPU by the parties...

  12. Reliable RANSAC Using a Novel Preprocessing Model

    Directory of Open Access Journals (Sweden)

    Xiaoyan Wang

    2013-01-01

    Full Text Available Geometric assumption and verification with RANSAC has become a crucial step for establishing correspondences between local features, due to its wide applications in biomedical feature analysis and vision computing. However, conventional RANSAC is very time-consuming due to redundant sampling, especially when dealing with numerous matching pairs. This paper presents a novel preprocessing model to extract a reduced set of reliable correspondences from the initial matching dataset. Both geometric model generation and verification are carried out on this reduced set, which leads to considerable speedups. Afterwards, this paper proposes a reliable RANSAC framework using the preprocessing model, which was implemented and verified using Harris and SIFT features, respectively. Compared with traditional RANSAC, experimental results show that our method is more efficient.
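
    The abstract does not give the preprocessing model itself, so the sketch below only illustrates the general pattern: prefilter putative matches down to a reliable subset (here, by an assumed descriptor-distance quantile) before running RANSAC, so fewer samples are wasted on outliers. The affine model, thresholds, and data are all illustrative.

    ```python
    # Sketch: prefilter putative correspondences, then RANSAC-fit a 2-D
    # affine model on the reduced set. All thresholds are illustrative.
    import numpy as np

    def ransac_affine(src, dst, iters=500, tol=3.0, seed=0):
        """Fit dst ~ [src, 1] @ P by random 3-point samples; return inlier mask."""
        rng = np.random.default_rng(seed)
        src1 = np.hstack([src, np.ones((len(src), 1))])
        best = np.zeros(len(src), dtype=bool)
        for _ in range(iters):
            idx = rng.choice(len(src), size=3, replace=False)
            try:
                P = np.linalg.solve(src1[idx], dst[idx])  # 6 affine parameters
            except np.linalg.LinAlgError:
                continue                                  # degenerate sample
            inliers = np.linalg.norm(src1 @ P - dst, axis=1) < tol
            if inliers.sum() > best.sum():
                best = inliers
        return best

    rng = np.random.default_rng(1)
    src = rng.uniform(0, 100, (300, 2))
    dst = src @ np.array([[0.9, -0.1], [0.1, 0.9]]) + 5.0  # true transform
    score = rng.uniform(0, 1, 300)                  # fake descriptor distances
    bad = score > 0.5
    dst[bad] += rng.normal(0, 30, (bad.sum(), 2))   # corrupt the weak matches

    keep = score < np.quantile(score, 0.4)  # preprocessing: reduced, reliable set
    inliers = ransac_affine(src[keep], dst[keep])
    print(f"{inliers.sum()} inliers out of {keep.sum()} prefiltered matches")
    ```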

  13. Data preprocessing in data mining

    CERN Document Server

    García, Salvador; Herrera, Francisco

    2015-01-01

    Data Preprocessing for Data Mining addresses one of the most important issues within the well-known Knowledge Discovery from Data process. Data taken directly from the source will likely have inconsistencies and errors or, most importantly, will not be ready to be considered for a data mining process. Furthermore, the increasing amount of data in recent science, industry and business applications calls for more complex tools to analyze it. Thanks to data preprocessing, it is possible to convert the impossible into possible, adapting the data to fulfill the input demands of each data mining algorithm. Data preprocessing includes the data reduction techniques, which aim at reducing the complexity of the data, detecting or removing irrelevant and noisy elements from the data. This book is intended to review the tasks that fill the gap between the data acquisition from the source and the data mining process. A comprehensive look from a practical point of view, including basic concepts and surveying t...

  14. Compact Circuit Preprocesses Accelerometer Output

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1993-01-01

    Compact electronic circuit transfers dc power to, and preprocesses ac output of, accelerometer and associated preamplifier. Incorporated into accelerometer case during initial fabrication or retrofit onto commercial accelerometer. Made of commercial integrated circuits and other conventional components; made smaller by use of micrologic and surface-mount technology.

  15. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting...... performance in time series forecasting. It is demonstrated in our experiment that effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time series forecasting models....

  16. A New Indicator for Optimal Preprocessing and Wavelengths Selection of Near-Infrared Spectra

    NARCIS (Netherlands)

    Skibsted, E.; Boelens, H.F.M.; Westerhuis, J.A.; Witte, D.T.; Smilde, A.K.

    2004-01-01

    Preprocessing of near-infrared spectra to remove unwanted, i.e., non-related spectral variation and selection of informative wavelengths is considered to be a crucial step prior to the construction of a quantitative calibration model. The standard methodology when comparing various preprocessing

  18. Evaluating the impact of image preprocessing on iris segmentation

    Directory of Open Access Journals (Sweden)

    José F. Valencia-Murillo

    2014-08-01

    Full Text Available Segmentation is one of the most important stages in iris recognition systems. In this paper, image preprocessing algorithms are applied in order to evaluate their impact on successful iris segmentation. The preprocessing algorithms are based on histogram adjustment, Gaussian filters and suppression of specular reflections in human eye images. The segmentation method introduced by Masek is applied to 199 images acquired under unconstrained conditions, belonging to the CASIA-irisV3 database, before and after applying the preprocessing algorithms. Then, the impact of the image preprocessing algorithms on the percentage of successful iris segmentation is evaluated by means of a visual inspection of images in order to determine whether the circumferences of iris and pupil were detected correctly. An increase from 59% to 73% in the percentage of successful iris segmentation is obtained with an algorithm that combines elimination of specular reflections, followed by the implementation of a Gaussian filter having a 5x5 kernel. The results highlight the importance of a preprocessing stage as a previous step in order to improve the performance during the edge detection and iris segmentation processes.
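
    A minimal sketch of the winning combination reported above, assuming OpenCV: suppress specular reflections (here via inpainting over a brightness mask; the saturation threshold and input path are assumptions) and then apply a 5x5 Gaussian filter before segmentation.

    ```python
    # Sketch: specular-reflection suppression + 5x5 Gaussian filter
    # before iris segmentation. Path and threshold are assumptions.
    import cv2
    import numpy as np

    eye = cv2.imread("eye.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical input

    # Mask near-saturated pixels; dilate to cover the rims of highlights.
    mask = np.uint8(eye > 230) * 255
    mask = cv2.dilate(mask, np.ones((3, 3), np.uint8), iterations=2)

    no_glare = cv2.inpaint(eye, mask, inpaintRadius=5, flags=cv2.INPAINT_TELEA)
    smoothed = cv2.GaussianBlur(no_glare, (5, 5), 0)   # 5x5 kernel, as above
    # `smoothed` would then go to the segmentation stage (e.g., Masek's method).
    ```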

  19. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

    A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationship among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate), regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus, novel, platform independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not allowed in μ-CS. The Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  20. Retinal Image Preprocessing: Background and Noise Segmentation

    Directory of Open Access Journals (Sweden)

    Usman Akram

    2012-09-01

    Full Text Available Retinal images are used for the automated screening and diagnosis of diabetic retinopathy. The retinal image quality must be improved for the detection of features and abnormalities, and for this purpose preprocessing of retinal images is vital. In this paper, we present a novel automated approach for preprocessing of colored retinal images. The proposed technique improves the quality of the input retinal image by separating the background and noisy area from the overall image. It contains coarse segmentation and fine segmentation. The standard retinal image databases Diaretdb0, Diaretdb1, DRIVE and STARE are used to validate our preprocessing technique. The experimental results show the validity of the proposed preprocessing technique.

  1. Facilitating Watermark Insertion by Preprocessing Media

    Directory of Open Access Journals (Sweden)

    Matt L. Miller

    2004-10-01

    Full Text Available There are several watermarking applications that require the deployment of a very large number of watermark embedders. These applications often have severe budgetary constraints that limit the computation resources that are available. Under these circumstances, only simple embedding algorithms can be deployed, which have limited performance. In order to improve performance, we propose preprocessing the original media. It is envisaged that this preprocessing occurs during content creation and has no budgetary or computational constraints. Preprocessing combined with simple embedding creates a watermarked Work, the performance of which exceeds that of simple embedding alone. However, this performance improvement is obtained without any increase in the computational complexity of the embedder. Rather, the additional computational burden is shifted to the preprocessing stage. A simple example of this procedure is described and experimental results confirm our assertions.

  2. On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery.

    Science.gov (United States)

    Qi, Baogui; Shi, Hao; Zhuang, Yin; Chen, He; Chen, Liang

    2018-04-25

    With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited. PMID:29693585

  4. Save Energy: Save Money!

    Science.gov (United States)

    Eccli, Eugene; And Others

    This publication is a collection of inexpensive energy saving tips and home improvements for home owners, particularly in low-income areas or in older homes. Section titles are: (1) Keeping Warm; (2) Getting Heat Where You Need It; (3) Using the Sun; (4) Furnaces, Stoves, and Fireplaces; (5) Insulation and Other Energy Needs; (6) Do-It-Yourself…

  5. Optimization of miRNA-seq data preprocessing.

    Science.gov (United States)

    Tam, Shirley; Tsao, Ming-Sound; McPherson, John D

    2015-11-01

    The past two decades of microRNA (miRNA) research have solidified the role of these small non-coding RNAs as key regulators of many biological processes and promising biomarkers for disease. The concurrent development in high-throughput profiling technology has further advanced our understanding of the impact of their dysregulation on a global scale. Currently, next-generation sequencing is the platform of choice for the discovery and quantification of miRNAs. Despite this, there is no clear consensus on how the data should be preprocessed before conducting downstream analyses. Often overlooked, data preprocessing is an essential step in data analysis: the presence of unreliable features and noise can affect the conclusions drawn from downstream analyses. Using a spike-in dilution study, we evaluated the effects of several general-purpose aligners (BWA, Bowtie, Bowtie 2 and Novoalign), and normalization methods (counts-per-million, total count scaling, upper quartile scaling, Trimmed Mean of M-values (TMM), DESeq, linear regression, cyclic loess and quantile) with respect to the final miRNA count data distribution, variance, bias and accuracy of differential expression analysis. We make practical recommendations on the optimal preprocessing methods for the extraction and interpretation of miRNA count data from small RNA-sequencing experiments. © The Author 2015. Published by Oxford University Press.
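
    Two of the compared normalization methods are easy to sketch on a count matrix; counts-per-million and upper-quartile scaling are shown below on toy data. Rescaling the upper-quartile factors to their mean is one common convention, not necessarily the paper's.

    ```python
    # Counts-per-million and upper-quartile normalization of a toy
    # miRNA count matrix (features x samples).
    import numpy as np

    counts = np.random.default_rng(3).poisson(50, (800, 12)).astype(float)

    # CPM: scale each sample (column) by its library size.
    cpm = counts / counts.sum(axis=0) * 1e6

    # Upper quartile: divide by the 75th percentile of nonzero counts,
    # a size factor less dominated by a few abundant miRNAs.
    uq = np.array([np.percentile(col[col > 0], 75) for col in counts.T])
    uq_scaled = counts / uq * uq.mean()   # rescale to a common reference
    ```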

  6. Textural Analysis of Fatigue Crack Surfaces: Image Pre-processing

    Directory of Open Access Journals (Sweden)

    H. Lauschmann

    2000-01-01

    Full Text Available For the fatigue crack history reconstitution, new methods of quantitative microfractography are being developed based on image processing and textural analysis. SEM magnifications between micro- and macrofractography are used. Two image pre-processing operations were suggested and proved to prepare the crack surface images for analytical treatment: 1. Normalization is used to transform the image to a stationary form. Compared to the generally used equalization, it conserves the shape of the brightness distribution and preserves the character of the texture. 2. Binarization is used to transform the grayscale image to a system of thick fibres. An objective criterion for the threshold brightness value was found as that resulting in the maximum number of objects. Both methods were successfully applied together with the following textural analysis.
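
    The binarization criterion described above, choosing the threshold that maximizes the number of objects, can be sketched with scipy's connected-component labelling (toy 8-bit grayscale image assumed):

    ```python
    # Choose the binarization threshold that maximizes the object count.
    import numpy as np
    from scipy import ndimage

    img = np.random.default_rng(4).integers(0, 256, (256, 256)).astype(np.uint8)

    def object_count(image, t):
        _, n = ndimage.label(image > t)   # connected components above t
        return n

    best_t = max(range(1, 255), key=lambda t: object_count(img, t))
    binary = img > best_t                 # the "system of thick fibres"
    ```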

  7. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Directory of Open Access Journals (Sweden)

    Nathan W Churchill

    Full Text Available BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.

  8. The 1989 ENDF pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.; McLaughlin, P.K.

    1989-12-01

    This document summarizes the 1989 version of the ENDF pre-processing codes which are required for processing evaluated nuclear data coded in the format ENDF-4, ENDF-5, or ENDF-6. The codes are available from the IAEA Nuclear Data Section, free of charge upon request. (author)

  9. Preprocessing Moist Lignocellulosic Biomass for Biorefinery Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Neal Yancey; Christopher T. Wright; Craig Conner; J. Richard Hess

    2009-06-01

    Biomass preprocessing is one of the primary operations in the feedstock assembly system of a lignocellulosic biorefinery. Preprocessing is generally accomplished using industrial grinders to format biomass materials into a suitable biorefinery feedstock for conversion to ethanol and other bioproducts. Many factors affect machine efficiency and the physical characteristics of preprocessed biomass. For example, moisture content of the biomass as received from the point of production has a significant impact on overall system efficiency and can significantly affect the characteristics (particle size distribution, flowability, storability, etc.) of the size-reduced biomass. Many different grinder configurations are available on the market, each with advantages under specific conditions. Ultimately, the capacity and/or efficiency of the grinding process can be enhanced by selecting the grinder configuration that optimizes grinder performance based on moisture content and screen size. This paper discusses the relationships of biomass moisture with respect to preprocessing system performance and product physical characteristics and compares data obtained on corn stover, switchgrass, and wheat straw as model feedstocks during Vermeer HG 200 grinder testing. During the tests, grinder screen configuration and biomass moisture content were varied and tested to provide a better understanding of their relative impact on machine performance and the resulting feedstock physical characteristics and uniformity relative to each crop tested.

  10. Protein from preprocessed waste activated sludge as a nutritional supplement in chicken feed.

    Science.gov (United States)

    Chirwa, Evans M N; Lebitso, Moses T

    2014-01-01

    Five groups of broiler chickens were raised on feed in which single-cell protein from preprocessed waste activated sludge (pWAS) was substituted for fishmeal in mass ratios of 0:100, 25:75, 50:50, 75:25, and 100:0 pWAS:fishmeal. Forty chickens per batch were evaluated for growth rate, mortality rate, and feed conversion efficiency (η). The initial mass gain rate, mortality rate, and initial and operational cost analyses showed that protein from pWAS could successfully replace commercial feed supplements with a significant cost saving without adversely affecting the health of the birds. The chickens raised on preprocessed WAS weighed 19% more than those raised on fishmeal protein supplement over a 45-day test period. Growing chickens on pWAS translated into a 46% cost saving due to the fast growth rate and minimal death losses before maturity.

  11. Preprocessing for Optimization of Probabilistic-Logic Models for Sequence Analysis

    DEFF Research Database (Denmark)

    Christiansen, Henning; Lassen, Ole Torp

    2009-01-01

    and approximation are needed. The first steps are taken towards a methodology for optimizing such models by approximations using auxiliary models for preprocessing or splitting them into submodels. Evaluation of such approximating models is challenging as authoritative test data may be sparse. On the other hand...

  12. The 1996 ENDF pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1996-01-01

    The codes are named 'the Pre-processing' codes, because they are designed to pre-process ENDF/B data, for later, further processing for use in applications. This is a modular set of computer codes, each of which reads and writes evaluated nuclear data in the ENDF/B format. Each code performs one or more independent operations on the data, as described below. These codes are designed to be computer independent, and are presently operational on every type of computer from large mainframe computer to small personal computers, such as IBM-PC and Power MAC. The codes are available from the IAEA Nuclear Data Section, free of charge upon request. (author)

  13. Boosting reversible pushdown machines by preprocessing

    DEFF Research Database (Denmark)

    Axelsen, Holger Bock; Kutrib, Martin; Malcher, Andreas

    2016-01-01

    languages, whereas for reversible pushdown automata the accepted family of languages lies strictly in between the reversible deterministic context-free languages and the real-time deterministic context-free languages. Moreover, it is shown that the computational power of both types of machines is not changed by allowing the preprocessing sequential transducer to work irreversibly. Finally, we examine the closure properties of the family of languages accepted by such machines....

  14. Household Savings

    DEFF Research Database (Denmark)

    Browning, Martin; Lusardi, Annamaria

    In this survey, we review the recent theoretical and empirical literature on household saving and consumption. The discussion is structured around a list of motives for saving and how well the standard theory captures these motives. We show that almost all of the motives for saving that have been suggested in the informal saving literature can be captured in the standard optimizing model. Particular attention is given to recent work on the precautionary motive and its implications for saving and consumption behavior. We also discuss the "behavioral" or "psychological" approach that eschews the use...

  15. Optimal preprocessing of serum and urine metabolomic data fusion for staging prostate cancer through design of experiment

    International Nuclear Information System (INIS)

    Zheng, Hong; Cai, Aimin; Zhou, Qi; Xu, Pengtao; Zhao, Liangcai; Li, Chen; Dong, Baijun; Gao, Hongchang

    2017-01-01

    Accurate classification of cancer stages will achieve precision treatment for cancer. Metabolomics presents biological phenotypes at the metabolite level and holds a great potential for cancer classification. Since metabolomic data can be obtained from different samples or analytical techniques, data fusion has been applied to improve classification accuracy. Data preprocessing is an essential step during metabolomic data analysis. Therefore, we developed an innovative optimization method to select a proper data preprocessing strategy for metabolomic data fusion using a design of experiment approach for improving the classification of prostate cancer (PCa) stages. In this study, urine and serum samples were collected from participants at five phases of PCa and analyzed using a ¹H NMR-based metabolomic approach. Partial least squares-discriminant analysis (PLS-DA) was used as a classification model and its performance was assessed by goodness of fit (R²) and predictive ability (Q²). Results show that data preprocessing significantly affects classification performance and depends on data properties. Using the fused metabolomic data from urine and serum, the PLS-DA model with the optimal data preprocessing (R² = 0.729, Q² = 0.504, P < 0.0001) can effectively improve model performance and achieve a better classification result for PCa stages as compared with that without data preprocessing (R² = 0.139, Q² = 0.006, P = 0.450). Therefore, we propose that metabolomic data fusion integrated with an optimal data preprocessing strategy can significantly improve the classification of cancer stages for precision treatment. - Highlights: • NMR metabolomic analysis of body fluids can be used for staging prostate cancer. • Data preprocessing is an essential step for metabolomic analysis. • Data fusion improves information recovery for cancer classification. • Design of experiment achieves optimal preprocessing of metabolomic data fusion.
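
    A loose sketch of the underlying idea: treat preprocessing choices as factors and score each combination by the cross-validated predictive ability of a PLS model (a stand-in for Q²). The factor levels and toy data are assumptions; the paper uses a formal design-of-experiment rather than this exhaustive grid.

    ```python
    # Grid over pre-treatment choices, scored by cross-validated R2 of a
    # PLS model (toy stand-in for Q2 of PLS-DA). Illustrative only.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(5)
    X = rng.lognormal(size=(60, 200))          # 60 samples x 200 metabolites
    y = rng.integers(0, 2, 60).astype(float)   # two stages, dummy-coded

    transforms = {"none": lambda A: A, "log": lambda A: np.log10(A + 1)}
    scalings = {"uv": lambda A: (A - A.mean(0)) / A.std(0),
                "pareto": lambda A: (A - A.mean(0)) / np.sqrt(A.std(0))}

    for tn, t in transforms.items():
        for sn, s in scalings.items():
            q2 = cross_val_score(PLSRegression(n_components=2),
                                 s(t(X)), y, cv=5, scoring="r2").mean()
            print(f"{tn:>4} + {sn:<6}: CV R2 = {q2:.3f}")
    ```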

  16. The 1992 ENDF Pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1992-01-01

    This document summarizes the 1992 version of the ENDF pre-processing codes which are required for processing evaluated nuclear data coded in the format ENDF-4, ENDF-5, or ENDF-6. Included are the codes CONVERT, MERGER, LINEAR, RECENT, SIGMA1, LEGEND, FIXUP, GROUPIE, DICTION, MIXER, VIRGIN, COMPLOT, EVALPLOT, RELABEL. Some of the functions of these codes are: to calculate cross-sections from resonance parameters; to calculate angular distributions, group averages, mixtures of cross-sections, etc.; and to produce graphical plots and data comparisons. The codes are designed to operate on virtually any type of computer, including PCs. They are available from the IAEA Nuclear Data Section, free of charge upon request, on magnetic tape or a set of HD diskettes. (author)

  17. The Effect of Preprocessing on Arabic Document Categorization

    Directory of Open Access Journals (Sweden)

    Abdullah Ayedh

    2016-04-01

    Full Text Available Preprocessing is one of the main components in a conventional document categorization (DC) framework. This paper aims to highlight the effect of preprocessing tasks on the efficiency of the Arabic DC system. In this study, three classification techniques are used, namely, naive Bayes (NB), k-nearest neighbor (KNN), and support vector machine (SVM). Experimental analysis on Arabic datasets reveals that preprocessing techniques have a significant impact on the classification accuracy, especially with the complicated morphological structure of the Arabic language. Choosing appropriate combinations of preprocessing tasks provides significant improvement in the accuracy of document categorization depending on the feature size and classification techniques. Findings of this study show that the SVM technique has outperformed the KNN and NB techniques. The SVM technique achieved a 96.74% micro-F1 value by using the combination of normalization and stemming as preprocessing tasks.

  18. CSS Preprocessing: Tools and Automation Techniques

    Directory of Open Access Journals (Sweden)

    Ricardo Queirós

    2018-01-01

    Full Text Available Cascading Style Sheets (CSS) is a W3C specification for a style sheet language used for describing the presentation of a document written in a markup language, more precisely, for styling Web documents. However, in the last few years, the landscape for CSS development has changed dramatically with the appearance of several languages and tools aiming to help developers build clean, modular and performance-aware CSS. These new approaches give developers mechanisms to preprocess CSS rules through the use of programming constructs, defined as CSS preprocessors, with the ultimate goal to bring those missing constructs to the CSS realm and to foster stylesheet structured programming. At the same time, a new set of tools appeared, defined as postprocessors, for extension and automation purposes, covering a broad set of features ranging from identifying unused and duplicate code to applying vendor prefixes. With all these tools and techniques in hand, developers need a consistent workflow to foster CSS modular coding. This paper aims to present an introductory survey of the CSS processors. The survey gathers information on a specific set of processors, categorizes them and compares their features regarding a set of predefined criteria such as: maturity, coverage and performance. Finally, we propose a basic set of best practices in order to set up a simple and pragmatic styling code workflow.

  19. The Evaluation of Preprocessing Choices in Single-Subject BOLD fMRI Using NPAIRS Performance Metrics

    DEFF Research Database (Denmark)

    Stephen, LaConte; Rottenberg, David; Strother, Stephen

    2003-01-01

    to obtain cross-validation-based model performance estimates of prediction accuracy and global reproducibility for various degrees of model complexity. We rely on the concept of an analysis chain meta-model in which all parameters of the preprocessing steps along with the final statistical model are treated...

  20. Gravity gradient preprocessing at the GOCE HPF

    Science.gov (United States)

    Bouman, J.; Rispens, S.; Gruber, T.; Schrama, E.; Visser, P.; Tscherning, C. C.; Veicherts, M.

    2009-04-01

    Among the products derived from the GOCE observations are the gravity gradients. These gravity gradients are provided in the Gradiometer Reference Frame (GRF) and are calibrated in-flight using satellite shaking and star sensor data. In order to use these gravity gradients for application in Earth sciences and gravity field analysis, additional pre-processing needs to be done, including corrections for temporal gravity field signals to isolate the static gravity field part, screening for outliers, calibration by comparison with existing external gravity field information, and error assessment. The temporal gravity gradient corrections consist of tidal and non-tidal corrections. These are all generally below the gravity gradient error level, which is predicted to show a 1/f behaviour for low frequencies. In the outlier detection, the 1/f error is compensated for by subtracting a local median from the data, while the data error is assessed using the median absolute deviation. The local median acts as a high-pass filter and it is robust, as is the median absolute deviation. Three different methods have been implemented for the calibration of the gravity gradients. All three methods use a high-pass filter to compensate for the 1/f gravity gradient error. The baseline method uses state-of-the-art global gravity field models, and the most accurate results are obtained if star sensor misalignments are estimated along with the calibration parameters. A second calibration method uses GOCE GPS data to estimate a low degree gravity field model as well as gravity gradient scale factors. Both methods allow gravity gradient scale factors to be estimated down to the 10⁻³ level. The third calibration method uses highly accurate terrestrial gravity data in selected regions to validate the gravity gradient scale factors, focussing on the measurement band. Gravity gradient scale factors may be estimated down to the 10⁻² level with this method.
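
    A minimal sketch of the outlier screening described above: subtract a running local median (the robust high-pass step against the 1/f error) and flag points deviating by more than k median absolute deviations. The window length, k, and toy series are assumptions.

    ```python
    # Median/MAD screening of a gravity-gradient time series (toy values).
    import numpy as np
    from scipy.ndimage import median_filter

    def screen_outliers(g, window=101, k=5.0):
        residual = g - median_filter(g, size=window)  # robust high-pass
        mad = np.median(np.abs(residual - np.median(residual)))
        return np.abs(residual) > k * 1.4826 * mad    # MAD -> sigma estimate

    g = np.cumsum(np.random.default_rng(6).standard_normal(5000))  # 1/f-like drift
    g[[500, 2500]] += 50.0                                         # planted spikes
    print(np.flatnonzero(screen_outliers(g)))
    ```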

  1. A survey of visual preprocessing and shape representation techniques

    Science.gov (United States)

    Olshausen, Bruno A.

    1988-01-01

    Many recent theories and methods proposed for visual preprocessing and shape representation are summarized. The survey brings together research from the fields of biology, psychology, computer science, electrical engineering, and most recently, neural networks. It was motivated by the need to preprocess images for a sparse distributed memory (SDM), but the techniques presented may also prove useful for applying other associative memories to visual pattern recognition. The material of this survey is divided into three sections: an overview of biological visual processing; methods of preprocessing (extracting parts of shape, texture, motion, and depth); and shape representation and recognition (form invariance, primitives and structural descriptions, and theories of attention).

  2. Impact of data transformation and preprocessing in supervised ...

    African Journals Online (AJOL)

    Impact of data transformation and preprocessing in supervised learning ... Nowadays, the ideas of integrating machine learning techniques in power system has ... The proposed algorithm used Python-based split train and k-fold model ...

  3. Preprocessing Algorithm for Deciphering Historical Inscriptions Using String Metric

    Directory of Open Access Journals (Sweden)

    Lorand Lehel Toth

    2016-07-01

    Full Text Available The article presents improvements in the preprocessing part of the deciphering method (shortly, the preprocessing algorithm) for historical inscriptions of unknown origin. Glyphs used in historical inscriptions changed through time; therefore, various versions of the same script may contain different glyphs for each grapheme. The purpose of the preprocessing algorithm is to reduce the running time of the deciphering process by filtering out the less probable interpretations of the examined inscription. However, the first version of the preprocessing algorithm led to an incorrect outcome or no result in the output in certain cases. Therefore, an improved version was developed to find the most similar words in the dictionary by specifying the search conditions more accurately, while remaining computationally efficient. Moreover, a sophisticated similarity metric used to determine the possible meaning of the unknown inscription is introduced. The results of the evaluations are also detailed.

  4. Preprocessing of emotional visual information in the human piriform cortex.

    Science.gov (United States)

    Schulze, Patrick; Bestgen, Anne-Kathrin; Lech, Robert K; Kuchinke, Lars; Suchan, Boris

    2017-08-23

    This study examines the processing of visual information by the olfactory system in humans. Recent data point to the processing of visual stimuli by the piriform cortex, a region mainly known as part of the primary olfactory cortex. Moreover, the piriform cortex generates predictive templates of olfactory stimuli to facilitate olfactory processing. This study addresses the question of whether this region is also capable of preprocessing emotional visual information. To gain insight into the preprocessing and transfer of emotional visual information into olfactory processing, we recorded hemodynamic responses during affective priming using functional magnetic resonance imaging (fMRI). Odors of different valence (pleasant, neutral and unpleasant) were primed by images of emotional facial expressions (happy, neutral and disgust). Our findings are the first to demonstrate that the piriform cortex preprocesses emotional visual information prior to any olfactory stimulation and that the emotional connotation of this preprocessing is subsequently transferred and integrated into an extended olfactory network for olfactory processing.

  5. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio

    2013-02-01

    A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks direct the learning of models with several interrelated variables to be forecasted simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the posterior learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with an MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. Correct forecasting of three fish species (anchovy, sardine and hake) simultaneously is doubled (from 17.3% to 29.5%) using the multi-dimensional approach in comparison to mono-species models. The probability assessments also show a marked improvement, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these differences are superior to the forecasting of species by pairs. © 2012 Elsevier Ltd.

  6. Validation of DWI pre-processing procedures for reliable differentiation between human brain gliomas.

    Science.gov (United States)

    Vellmer, Sebastian; Tonoyan, Aram S; Suter, Dieter; Pronin, Igor N; Maximov, Ivan I

    2018-02-01

    Diffusion magnetic resonance imaging (dMRI) is a powerful tool in clinical applications, in particular, in oncology screening. dMRI demonstrated its benefit and efficiency in the localisation and detection of different types of human brain tumours. Clinical dMRI data suffer from multiple artefacts such as motion and eddy-current distortions, contamination by noise, outliers etc. In order to increase the image quality of the derived diffusion scalar metrics and the accuracy of the subsequent data analysis, various pre-processing approaches are actively developed and used. In the present work we assess the effect of different pre-processing procedures such as a noise correction, different smoothing algorithms and spatial interpolation of raw diffusion data, with respect to the accuracy of brain glioma differentiation. As a set of sensitive biomarkers of the glioma malignancy grades we chose the derived scalar metrics from diffusion and kurtosis tensor imaging as well as the neurite orientation dispersion and density imaging (NODDI) biophysical model. Our results show that the application of noise correction, anisotropic diffusion filtering, and cubic-order spline interpolation resulted in the highest sensitivity and specificity for glioma malignancy grading. Thus, these pre-processing steps are recommended for the statistical analysis in brain tumour studies. Copyright © 2017. Published by Elsevier GmbH.

  7. Data pre-processing for web log mining: Case study of commercial bank website usage analysis

    Directory of Open Access Journals (Sweden)

    Jozef Kapusta

    2013-01-01

    Full Text Available We use data cleaning, integration, reduction and data conversion methods in the pre-processing level of data analysis. Data processing techniques improve the overall quality of the patterns mined. The paper describes the use of standard pre-processing methods for preparing data of a commercial bank website, in the form of the log file obtained from the web server. Data cleaning, as the simplest step of data pre-processing, is non-trivial as the analysed content is highly specific. We had to deal with the problem of frequent changes of the content and even frequent changes of the structure. Regular changes in the structure make use of the sitemap impossible. We present approaches for dealing with this problem. We were able to create the sitemap dynamically just based on the content of the log file. In this case study, we also examined just one part of the website, rather than the standard analysis of an entire website, as we did not have access to all log files for security reasons. As a result, the traditional practices had to be adapted for this special case. Analysing just a small fraction of the website resulted in short session times for regular visitors. We were not able to use the recommended methods to determine the optimal value of the session time. Therefore, in this paper we propose new methods based on outlier identification for raising the accuracy of the session length.
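
    The sessionization step under discussion can be sketched as follows; the 30-minute default is the conventional timeout that the case study argues may be inappropriate for a single-section analysis, and the data layout is an assumption.

    ```python
    # Split each visitor's ordered requests into sessions by inactivity.
    from datetime import datetime, timedelta

    def sessionize(requests, timeout=timedelta(minutes=30)):
        """requests: iterable of (visitor_id, timestamp), sorted by timestamp."""
        last_seen, current, sessions = {}, {}, []
        for visitor, ts in requests:
            if visitor not in last_seen or ts - last_seen[visitor] > timeout:
                current[visitor] = []      # gap too long: start a new session
                sessions.append(current[visitor])
            current[visitor].append(ts)
            last_seen[visitor] = ts
        return sessions

    log = [("a", datetime(2013, 1, 1, 10, 0)), ("a", datetime(2013, 1, 1, 10, 5)),
           ("a", datetime(2013, 1, 1, 11, 0))]   # third hit opens session 2
    print(len(sessionize(log)))                  # -> 2
    ```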

  8. Automated Pre-processing for NMR Assignments with Reduced Tedium

    Energy Technology Data Exchange (ETDEWEB)

    2004-05-11

    An important rate-limiting step in the resonance assignment process is accurate identification of resonance peaks in NMR spectra. NMR spectra are noisy. Hence, automatic peak-picking programs must navigate between the Scylla of reliable but incomplete picking, and the Charybdis of noisy but complete picking. Each of these extremes complicates the assignment process: incomplete peak-picking results in the loss of essential connectivities, while noisy picking conceals the true connectivities under a combinatorial explosion of false positives. Intermediate processing can simplify the assignment process by preferentially removing false peaks from noisy peak lists. This is accomplished by requiring consensus between multiple NMR experiments, exploiting a priori information about NMR spectra, and drawing on empirical statistical distributions of chemical shifts extracted from the BioMagResBank. Experienced NMR practitioners currently apply many of these techniques "by hand", which is tedious, and may appear arbitrary to the novice. To increase efficiency, we have created a systematic and automated approach to this process, known as APART. Automated pre-processing has three main advantages: reduced tedium, standardization, and pedagogy. In the hands of experienced spectroscopists, the main advantage is reduced tedium (a rapid increase in the ratio of true peaks to false peaks with minimal effort). When a project is passed from hand to hand, the main advantage is standardization. APART automatically documents the peak filtering process by archiving its original recommendations, the accompanying justifications, and whether a user accepted or overrode a given filtering recommendation. In the hands of a novice, this tool can reduce the stumbling block of learning to differentiate between real peaks and noise, by providing real-time examples of how such decisions are made.
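
    The consensus idea can be sketched in a few lines: keep only peaks that recur, within a chemical-shift tolerance, across the peak lists of several experiments. The tolerance, data layout, and toy peak values below are assumptions, not APART's actual interface.

    ```python
    def consensus_filter(peak_lists, tol=0.05):
        """Keep peaks (chemical shifts, in ppm) that appear, within a
        tolerance, in every experiment's peak list."""
        reference, *others = peak_lists
        kept = []
        for peak in reference:
            if all(any(abs(peak - q) <= tol for q in lst) for lst in others):
                kept.append(peak)
        return kept

    hsqc = [8.21, 7.95, 120.4]                   # toy peak positions
    noesy = [8.23, 7.50, 120.38]
    print(consensus_filter([hsqc, noesy]))       # -> [8.21, 120.4]
    ```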

  9. Effects of preprocessing method on TVOC emission of car mat

    Science.gov (United States)

    Wang, Min; Jia, Li

    2013-02-01

    The effects of the preprocessing method on the total volatile organic compound (TVOC) emission of car mats are studied in this paper, and an appropriate TVOC emission test period for car mats is suggested. The emission factors for TVOC from three kinds of new car mats are discussed. The car mats were preprocessed by washing, baking and ventilation. When car mats were preprocessed by washing, the TVOC emissions of all tested samples were lower than with the other methods, and the emission remained stable for a minimum of 4 days. The TVOC emitted from some samples may exceed 2500 μg/kg, but the TVOC emitted from the washed polyamide (PA) and wool mats is less than 2500 μg/kg. The emission factors of TVOC are experimentally investigated for the different preprocessing methods. The air temperature in the environment chamber and the water temperature used for washing are important factors influencing the emission of car mats.

  10. Automated pre-processing and multivariate vibrational spectra analysis software for rapid results in clinical settings

    Science.gov (United States)

    Bhattacharjee, T.; Kumar, P.; Fillipe, L.

    2018-02-01

    Vibrational spectroscopy, especially FTIR and Raman, has shown enormous potential in disease diagnosis, especially in cancers. Its potential for detecting varied pathological conditions is regularly reported. However, to prove its applicability in clinics, large multi-center, multi-national studies need to be undertaken, and these will result in an enormous amount of data. A parallel effort to develop analytical methods, including user-friendly software that can quickly pre-process data and subject them to the required multivariate analysis, is warranted in order to obtain results in real time. This study reports a MATLAB-based script that can automatically import data, preprocess spectra (interpolation, derivatives, normalization), and then carry out Principal Component Analysis (PCA) followed by Linear Discriminant Analysis (LDA) of the first 10 PCs, all with a single click. The software has been verified on data obtained from cell lines, animal models, and in vivo patient datasets, and gives results comparable to the Minitab 16 software. The software can import a variety of file formats (.asc, .txt, .xls, and many others). Options to ignore noisy data, plot all possible graphs with PCA factors 1 to 5, and save loading factors, confusion matrices and other parameters are also present. The software can provide results for a dataset of 300 spectra within 0.01 s. We believe that the software will be vital not only in clinical trials using vibrational spectroscopic data, but also in obtaining rapid results when these tools are translated into clinics.
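
    The preprocess-PCA-LDA chain can be sketched with SciPy and scikit-learn in place of the authors' MATLAB script; the derivative window, component count, and synthetic spectra below are assumptions.

    ```python
    import numpy as np
    from scipy.signal import savgol_filter
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def preprocess(spectra):
        """First derivative (Savitzky-Golay) followed by vector normalization."""
        deriv = savgol_filter(spectra, window_length=9, polyorder=3, deriv=1, axis=1)
        return deriv / np.linalg.norm(deriv, axis=1, keepdims=True)

    rng = np.random.default_rng(0)
    spectra = rng.normal(size=(60, 500))       # 60 toy spectra, 500 wavenumbers
    labels = np.repeat([0, 1], 30)             # two hypothetical classes

    scores = PCA(n_components=10).fit_transform(preprocess(spectra))
    lda = LinearDiscriminantAnalysis().fit(scores, labels)
    print("training accuracy:", lda.score(scores, labels))
    ```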

  11. Automated cleaning and pre-processing of immunoglobulin gene sequences from high-throughput sequencing

    Directory of Open Access Journals (Sweden)

    Miri eMichaeli

    2012-12-01

    Full Text Available High throughput sequencing (HTS) yields tens of thousands to millions of sequences that require a large amount of pre-processing work to clean various artifacts. Such cleaning cannot be performed manually. Existing programs are not suitable for immunoglobulin (Ig) genes, which are variable and often highly mutated. This paper describes Ig-HTS-Cleaner (Ig High Throughput Sequencing Cleaner), a program containing a simple cleaning procedure that successfully deals with pre-processing of Ig sequences derived from HTS, and Ig-Indel-Identifier (Ig Insertion-Deletion Identifier), a program for identifying legitimate and artifact insertions and/or deletions (indels). Our programs were designed for analyzing Ig gene sequences obtained by 454 sequencing, but they are applicable to all types of sequences and sequencing platforms. Ig-HTS-Cleaner and Ig-Indel-Identifier have been implemented in Java and saved as executable JAR files, supported on Linux and MS Windows. No special requirements are needed in order to run the programs, except for correctly constructing the input files as explained in the text. The programs' performance has been tested and validated on real and simulated data sets.

  12. Effect of microaerobic fermentation in preprocessing fibrous lignocellulosic materials.

    Science.gov (United States)

    Alattar, Manar Arica; Green, Terrence R; Henry, Jordan; Gulca, Vitalie; Tizazu, Mikias; Bergstrom, Robby; Popa, Radu

    2012-06-01

    Amending soil with organic matter is common in agricultural and logging practices. Such amendments have benefits to soil fertility and crop yields. These benefits may be increased if material is preprocessed before introduction into soil. We analyzed the efficiency of microaerobic fermentation (MF), also referred to as Bokashi, in preprocessing fibrous lignocellulosic (FLC) organic materials using varying produce amendments and leachate treatments. Adding produce amendments increased leachate production and fermentation rates and decreased the biological oxygen demand of the leachate. Continuously draining leachate without returning it to the fermentors led to acidification and decreased concentrations of polysaccharides (PS) in leachates. PS fragmentation and the production of soluble metabolites and gases stabilized in fermentors in about 2-4 weeks. About 2% of the carbon content was lost as CO2. PS degradation rates, upon introduction of processed materials into soil, were similar to unfermented FLC. Our results indicate that MF is insufficient for adequate preprocessing of FLC material.

  13. Real-time topic-aware influence maximization using preprocessing.

    Science.gov (United States)

    Chen, Wei; Lin, Tian; Yang, Cheng

    2016-01-01

    Influence maximization is the task of finding a set of seed nodes in a social network such that the influence spread of these seed nodes based on a certain influence diffusion model is maximized. Topic-aware influence diffusion models have recently been proposed to address the issue that influence between a pair of users is often topic-dependent, and that the information, ideas, innovations, etc. being propagated in networks are typically mixtures of topics. In this paper, we focus on the topic-aware influence maximization task. In particular, we study preprocessing methods that avoid redoing influence maximization for each topic mixture from scratch. We explore two preprocessing algorithms with theoretical justifications. Our empirical results on data obtained in a couple of existing studies demonstrate that one of our algorithms stands out as a strong candidate, providing microsecond online response time and competitive influence spread with reasonable preprocessing effort.

  14. Image Processing of Welding Procedure Specification and Pre-process program development for Finite Element Modelling

    International Nuclear Information System (INIS)

    Kim, K. S.; Lee, H. J.

    2009-11-01

    PRE-WELD, a program which automatically generates the input file for finite element analysis of 2D butt welding at dissimilar metal weld parts, was developed. This program is a pre-processor for the FEM code used for analyzing the residual stress at welds. Even if users do not have detailed knowledge of FEM modelling, they can easily create the ABAQUS input by entering the shape data of the welding part and welding parameters such as weld current and voltage. By using the PRE-WELD program, we can greatly save time and effort in preparing the ABAQUS input for residual stress analysis at welds, and produce an exact input without human error.

  15. REMINDER: Saved Leave Scheme (SLS)

    CERN Multimedia

    2003-01-01

    Transfer of leave to saved leave accounts Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days'* annual and compensatory leave (excluding saved leave accumulated in accordance with the provisions of Administrative Circular No 22B) can be transferred to the saved leave account at the end of the leave year (30 September). We remind you that unused leave of all those taking part in the saved leave scheme at the closure of the leave year accounts is transferred automatically to the saved leave account on that date. Therefore, staff members have no administrative steps to take. In addition, the transfer, which eliminates the risk of omitting to request leave transfers and rules out calculation errors in transfer requests, will be clearly shown in the list of leave transactions that can be consulted in EDH from October 2003 onwards. Furthermore, this automatic leave transfer optimizes staff members' chances of benefiting from a saved leave bonus provided that they ar...

  16. Net savings

    International Nuclear Information System (INIS)

    Roche, P.

    2001-01-01

    The state of e-commerce in the Canadian upstream oil and natural gas sector is examined in an effort to discover the extent to which the .com economy has penetrated the marketplace. The overall assessment is that although the situation varies from producer to producer and process to process, a bustling digital marketplace in the Canadian oil business has yet to emerge. Nevertheless, there are several examples of companies using e-business tools to minimize technology staffing and to eliminate wasteful practices. Initiatives cited include streamlining of supply chains to cut handling costs, using application service providers to trim information technology budgets, and adopting electronic joint interest billing to save on printing, postage and re-entering data. Most notable efforts have been made by companies such as BXL Energy Limited and Genesis Exploration Limited, both of which are boosting efficiency on the inside by contracting out data storage and software applications. For example, BXL has replaced its microfilm log library occupying six cabinets, and totalling about 9,000 lbs., by a fibre optic line. All applications can now be run from a laptop which weighs three to four pounds. In a similar vein, Genesis Exploration started using application service providers (ASPs) to avoid the cost and hassle of buying and maintaining major software applications in-house. By accessing the ASPs, Genesis staff can run software without buying or installing it on their own computers. In yet another example of cutting information technology costs, Pengrowth Corporation has its network administration done remotely over the Internet by Northwest Digital Systems (NWD). As far as the industry at large is concerned, the answer appears to be in a digital marketplace specifically tailored to the upstream sector's unique profile. As a start, a study is underway by Deloitte Consulting to explore producer interest in joining or founding an upstream digital marketplace. The study was

  18. Preprocessing of 18F-DMFP-PET Data Based on Hidden Markov Random Fields and the Gaussian Distribution

    Directory of Open Access Journals (Sweden)

    Fermín Segovia

    2017-10-01

    Full Text Available 18F-DMFP-PET is an emerging neuroimaging modality used to diagnose Parkinson's disease (PD) that allows us to examine postsynaptic dopamine D2/3 receptors. As in other neuroimaging modalities used for PD diagnosis, most of the total intensity of 18F-DMFP-PET images is concentrated in the striatum. However, other regions can also be useful for diagnostic purposes. An appropriate delimitation of the regions of interest contained in 18F-DMFP-PET data is crucial to improve the automatic diagnosis of PD. In this manuscript we propose a novel methodology to preprocess 18F-DMFP-PET data that improves the accuracy of computer-aided diagnosis systems for PD. First, the data were segmented using an algorithm based on Hidden Markov Random Fields. As a result, each neuroimage was divided into 4 maps according to the intensity and the neighborhood of the voxels. The maps were then individually normalized so that the shape of their histograms could be modeled by a Gaussian distribution with equal parameters for all the neuroimages. This approach was evaluated using a dataset with neuroimaging data from 87 parkinsonian patients. After these preprocessing steps, a Support Vector Machine classifier was used to separate idiopathic and non-idiopathic PD. Data preprocessed by the proposed method provided higher accuracy results than data preprocessed with previous approaches.
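
    A minimal sketch of the per-map normalization step, under the simplifying assumption that matching each map's mean and standard deviation to a common target Gaussian is enough (the full method models the histogram shape itself; the target parameters below are assumptions):

    ```python
    import numpy as np

    def normalize_map(voxels, target_mean=0.0, target_std=1.0):
        """Rescale a segmented intensity map so that all neuroimages share
        the same first two histogram moments (assumed target Gaussian)."""
        z = (voxels - voxels.mean()) / voxels.std()
        return z * target_std + target_mean

    rng = np.random.default_rng(1)
    striatum_map = rng.gamma(shape=2.0, scale=3.0, size=10_000)  # toy intensities
    normalized = normalize_map(striatum_map)
    print(round(float(normalized.mean()), 3), round(float(normalized.std()), 3))
    ```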

  19. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.

  20. Orthogonal feature selection method. [For preprocessing of mass spectral data]

    Energy Technology Data Exchange (ETDEWEB)

    Kowalski, B R [Univ. of Washington, Seattle; Bender, C F

    1976-01-01

    A new method of preprocessing spectral data for the extraction of molecular structural information is described. This SELECT method generates orthogonal features that are important for classification purposes and that also retain their identity to the original measurements. A brief introduction to chemical pattern recognition is presented. A brief description of the method and an application to mass spectral data analysis follow. (BLM)

  1. Image preprocessing study on KPCA-based face recognition

    Science.gov (United States)

    Li, Xuan; Li, Dehua

    2015-12-01

    Face recognition, as an important biometric identification method with friendly, natural and convenient advantages, has received more and more attention. This paper investigates a face recognition system including face detection, feature extraction and face recognition, mainly by researching the related theory and key technology of various preprocessing methods in the face detection process; using the KPCA method, it focuses on the different recognition results obtained with different preprocessing methods. We choose the YCbCr color space for skin segmentation and integral projection for face location. We use erosion and dilation (the opening and closing operations) and an illumination compensation method to preprocess face images, and then apply a face recognition method based on kernel principal component analysis (KPCA). The experiments were carried out on a typical face database, with the algorithms implemented on the MATLAB platform. Experimental results show that, under certain conditions, integrating the kernel method into the PCA algorithm makes the extracted features represent the original image information better, since a nonlinear feature extraction method is used, and a higher recognition rate can be obtained. In the image preprocessing stage, we found that different operations on the images can produce different results, and hence different recognition rates in the recognition stage. At the same time, in kernel principal component analysis, the power of the polynomial kernel function can affect the recognition result.
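
    A minimal KPCA feature-extraction sketch with a polynomial kernel, echoing the observation that the kernel's polynomial power affects recognition; the toy face vectors, degree sweep, and component count are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.decomposition import KernelPCA

    rng = np.random.default_rng(0)
    faces = rng.random((40, 32 * 32))          # 40 toy "face" vectors (32x32 pixels)

    for degree in (2, 3, 4):                   # sweep the polynomial power
        kpca = KernelPCA(n_components=20, kernel="poly", degree=degree)
        features = kpca.fit_transform(faces)
        print(degree, features.shape)          # (40, 20) features per degree
    ```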

  2. An Effective Measured Data Preprocessing Method in Electrical Impedance Tomography

    Directory of Open Access Journals (Sweden)

    Chenglong Yu

    2014-01-01

    Full Text Available As an advanced process detection technology, electrical impedance tomography (EIT) has been widely studied and paid attention to in the industrial fields. However, EIT techniques are greatly limited by low spatial resolution. This problem may result from incorrect preprocessing of the measured data and the lack of a general criterion to evaluate different preprocessing processes. In this paper, an EIT data preprocessing method based on taking roots of all measured data is proposed, and it is evaluated by two indexes constructed from the rooted measured data. By finding the optimums of the two indexes, the proposed method can be applied to improve the EIT imaging spatial resolution. For a theoretical model, the optimal rooting exponents for the two indexes range in [0.23, 0.33] and [0.22, 0.35], respectively. Moreover, the factors that affect the correctness of the proposed method are analyzed. Measured data preprocessing is necessary and helpful for any imaging process; thus, the proposed method can be generally and widely used in any imaging process. Experimental results validate the two proposed indexes.

  3. Preprocessing of A-scan GPR data based on energy features

    Science.gov (United States)

    Dogan, Mesut; Turhan-Sayan, Gonul

    2016-05-01

    There is an increasing demand for noninvasive real-time detection and classification of buried objects in various civil and military applications. The problem of detection and annihilation of landmines is particularly important due to strong safety concerns. The requirement for a fast real-time decision process is as important as the requirements for high detection rates and low false alarm rates. In this paper, we introduce and demonstrate a computationally simple, time-efficient, energy-based preprocessing approach that can be used in ground penetrating radar (GPR) applications to eliminate reflections from the air-ground boundary and to locate the buried objects, simultaneously, in one easy step. The instantaneous power signals, the total energy values and the cumulative energy curves are extracted from the A-scan GPR data. The cumulative energy curves, in particular, are shown to be useful for detecting the presence and location of buried objects in a fast and simple way while preserving the spectral content of the original A-scan data for further steps of physics-based target classification. The proposed method is demonstrated using GPR data collected at outdoor test lanes at the facilities of IPA Defense, Ankara. Cylindrically shaped plastic containers were buried in fine-medium sand to simulate buried landmines. These plastic containers were half-filled with ammonium nitrate including metal pins. The results of this pilot study are highly promising and motivate further research on the use of energy-based preprocessing features in the landmine detection problem.
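
    The energy features are simple to compute; the sketch below builds the instantaneous power and cumulative energy curve for a synthetic A-scan and marks where half of the total energy has arrived (the trace, time window, and half-energy criterion are illustrative assumptions).

    ```python
    import numpy as np

    t = np.linspace(0.0, 20e-9, 512)                     # 20 ns time window
    ascan = np.exp(-((t - 8e-9) / 1e-9) ** 2) * np.sin(2e9 * 2 * np.pi * t)
    ascan += 0.05 * np.random.default_rng(2).normal(size=t.size)

    power = ascan ** 2                                   # instantaneous power
    cumulative_energy = np.cumsum(power)
    total_energy = cumulative_energy[-1]

    # A sharp rise in the normalized curve marks a strong reflector, e.g.
    # the air-ground boundary or a buried object.
    normalized = cumulative_energy / total_energy
    onset_index = int(np.argmax(normalized > 0.5))
    print("half-energy sample:", onset_index)
    ```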

  4. Federal Aviation Administration retained savings program proposal

    International Nuclear Information System (INIS)

    Hostick, D.J.; Larson, L.L.; Hostick, C.J.

    1998-03-01

    Federal legislation allows federal agencies to retain up to 50% of the savings associated with implementing energy efficiency and water conservation measures and practices. Given budget pressures to reduce expenditures, the use of retained savings to fund additional projects represents a source of funds outside of the traditional budget cycle. The Southwest Region Federal Aviation Administration (FAA) has tasked Pacific Northwest National Laboratory (PNNL) with developing a model retained savings program for Southwest Region FAA use and as a prototype for consideration by the FAA. PNNL recommends the following steps be taken in developing a Southwest Region FAA retained savings program: establish a retained savings mechanism, and determine the level at which the retained savings should be consolidated into a fund. The preliminary recommendation is to establish a revolving efficiency loan fund at the regional level. Such a mechanism allows some consolidation of savings to fund larger projects, while maintaining a sense of facility ownership in that the funds will remain within the region.

  5. Image preprocessing for improving computational efficiency in implementation of restoration and superresolution algorithms.

    Science.gov (United States)

    Sundareshan, Malur K; Bhattacharjee, Supratik; Inampudi, Radhika; Pang, Ho-Yuen

    2002-12-10

    Computational complexity is a major impediment to the real-time implementation of image restoration and superresolution algorithms in many applications. Although powerful restoration algorithms have been developed within the past few years utilizing sophisticated mathematical machinery (based on statistical optimization and convex set theory), these algorithms are typically iterative in nature and require a sufficient number of iterations to be executed to achieve the desired resolution improvement that may be needed to meaningfully perform postprocessing image exploitation tasks in practice. Additionally, recent technological breakthroughs have facilitated novel sensor designs (focal plane arrays, for instance) that make it possible to capture megapixel imagery data at video frame rates. A major challenge in the processing of these large-format images is to complete the execution of the image processing steps within the frame capture times and to keep up with the output rate of the sensor so that all data captured by the sensor can be efficiently utilized. Consequently, development of novel methods that facilitate real-time implementation of image restoration and superresolution algorithms is of significant practical interest and is the primary focus of this study. The key to designing computationally efficient processing schemes lies in strategically introducing appropriate preprocessing steps together with the superresolution iterations to tailor optimized overall processing sequences for imagery data of specific formats. For substantiating this assertion, three distinct methods for tailoring a preprocessing filter and integrating it with the superresolution processing steps are outlined. These methods consist of a region-of-interest extraction scheme, a background-detail separation procedure, and a scene-derived information extraction step for implementing a set-theoretic restoration of the image that is less demanding in computation compared with the
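
    As one illustration of such preprocessing, the sketch below implements a simple region-of-interest extraction so that the iterative restoration then runs only on a small crop of the large-format frame; the threshold rule, padding, and synthetic target are assumptions.

    ```python
    import numpy as np

    def extract_roi(image, k=3.0, pad=8):
        """Bounding box of pixels brighter than mean + k*std (assumed target)."""
        mask = image > image.mean() + k * image.std()
        if not mask.any():
            return image                        # nothing bright: keep full frame
        rows, cols = np.nonzero(mask)
        r0, r1 = max(rows.min() - pad, 0), min(rows.max() + pad, image.shape[0])
        c0, c1 = max(cols.min() - pad, 0), min(cols.max() + pad, image.shape[1])
        return image[r0:r1, c0:c1]

    frame = np.zeros((1024, 1024))
    frame[500:520, 700:730] = 1.0               # synthetic bright target
    roi = extract_roi(frame)
    print(frame.shape, "->", roi.shape)         # iterations now run on the ROI
    ```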

  6. Research on pre-processing of QR Code

    Science.gov (United States)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR codes encode many kinds of information and offer several advantages: large storage capacity, high reliability, ultra-high-speed reading from any direction, small printing size and highly efficient representation of Chinese characters, etc. In order to obtain a clearer binarized image from a complex background, and to improve the recognition rate of QR codes, this paper researches pre-processing methods for QR codes (Quick Response Codes) and presents algorithms and results of image pre-processing for QR code recognition. The conventional method is improved by adapting Sauvola's adaptive thresholding method for text recognition. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
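
    A sketch of the adaptive binarization step using Sauvola thresholding as provided by scikit-image; the window size, k, and synthetic test image are assumptions.

    ```python
    import numpy as np
    from skimage.filters import threshold_sauvola

    rng = np.random.default_rng(11)
    # Synthetic stand-in for a QR photo: dark modules on an uneven background.
    background = np.linspace(0.4, 0.9, 200)[None, :] * np.ones((200, 200))
    modules = (rng.random((25, 25)) > 0.5).astype(float)
    image = background * (0.3 + 0.7 * np.kron(modules, np.ones((8, 8))))
    image += rng.normal(0.0, 0.02, size=image.shape)

    threshold = threshold_sauvola(image, window_size=25, k=0.2)
    binary = image > threshold        # True = background, False = dark modules
    print(binary.mean())              # fraction of background pixels
    ```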

  7. Linguistic Preprocessing and Tagging for Problem Report Trend Analysis

    Science.gov (United States)

    Beil, Robert J.; Malin, Jane T.

    2012-01-01

    Mr. Robert Beil, Systems Engineer at Kennedy Space Center (KSC), requested the NASA Engineering and Safety Center (NESC) develop a prototype tool suite that combines complementary software technology used at Johnson Space Center (JSC) and KSC for problem report preprocessing and semantic tag extraction, to improve input to data mining and trend analysis. This document contains the outcome of the assessment and the Findings, Observations and NESC Recommendations.

  8. Learning and Generalisation in Neural Networks with Local Preprocessing

    OpenAIRE

    Kutsia, Merab

    2007-01-01

    We study the learning and generalisation ability of a specific two-layer feed-forward neural network and compare its properties to those of a simple perceptron. The input patterns are mapped nonlinearly onto a hidden layer, much larger than the input layer, and this mapping is either fixed or may result from an unsupervised learning process. Such preprocessing of initially uncorrelated random patterns results in correlated patterns in the hidden layer. The hidden-to-output mapping of the net...

  9. Summary of ENDF/B pre-processing codes

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1981-12-01

    This document contains the summary documentation for the ENDF/B pre-processing codes: LINEAR, RECENT, SIGMA1, GROUPIE, EVALPLOT, MERGER, DICTION, CONVERT. This summary documentation is merely a copy of the comment cards that appear at the beginning of each programme; these comment cards always reflect the latest status of input options, etc. For the latest published documentation on the methods used in these codes see UCRL-50400, Vol.17 parts A-E, Lawrence Livermore Laboratory (1979)

  10. Pre-processing by data augmentation for improved ellipse fitting.

    Science.gov (United States)

    Kumar, Pankaj; Belchamber, Erika R; Miklavcic, Stanley J

    2018-01-01

    Ellipse fitting is a highly researched and mature topic. Surprisingly, however, no existing method has thus far considered the data point eccentricity in its ellipse fitting procedure. Here, we introduce the concept of the eccentricity of a data point, in analogy with the idea of ellipse eccentricity. We then show empirically that, irrespective of the ellipse fitting method used, the root mean square error (RMSE) of a fit increases with the eccentricity of the data point set. The main contribution of the paper is based on the hypothesis that if the data point set were pre-processed to strategically add additional data points in regions of high eccentricity, then the quality of a fit could be improved. Conditional validity of this hypothesis is demonstrated mathematically using a model scenario. Based on this confirmation we propose an algorithm that pre-processes the data so that data points with high eccentricity are replicated. The improvement in ellipse fitting is then demonstrated empirically in a real-world application: 3D reconstruction of a plant root system for phenotypic analysis. The degree of improvement for different underlying ellipse fitting methods as a function of data noise level is also analysed. We show that almost every method tested, irrespective of whether it minimizes algebraic error or geometric error, shows improvement in the fit following data augmentation using the proposed pre-processing algorithm.

  11. A Stereo Music Preprocessing Scheme for Cochlear Implant Users.

    Science.gov (United States)

    Buyens, Wim; van Dijk, Bas; Wouters, Jan; Moonen, Marc

    2015-10-01

    Listening to music is still one of the more challenging aspects of using a cochlear implant (CI) for most users. Simple musical structures, a clear rhythm/beat, and lyrics that are easy to follow are among the top factors contributing to music appreciation for CI users. Modifying the audio mix of complex music potentially improves music enjoyment in CI users. A stereo music preprocessing scheme is described in which vocals, drums, and bass are emphasized based on the representation of the harmonic and the percussive components in the input spectrogram, combined with the spatial allocation of instruments in typical stereo recordings. The scheme is assessed with postlingually deafened CI subjects (N = 7) using pop/rock music excerpts with different complexity levels. The scheme is capable of modifying relative instrument level settings, with the aim of improving music appreciation in CI users, and allows individual preference adjustments. The assessment with CI subjects confirms the preference for more emphasis on vocals, drums, and bass as offered by the preprocessing scheme, especially for songs with higher complexity. The stereo music preprocessing scheme has the potential to improve music enjoyment in CI users by modifying the audio mix in widespread (stereo) music recordings. Since music enjoyment in CI users is generally poor, this scheme can assist the music listening experience of CI users as a training or rehabilitation tool.
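
    A hedged sketch of a remix in the spirit of the scheme above: librosa's harmonic/percussive separation stands in for the paper's spectrogram decomposition, and mid/side processing exploits the typical centre-panning of vocals, drums, and bass in stereo recordings; the synthetic mix and gain values are assumptions, not the authors' settings.

    ```python
    import numpy as np
    import librosa

    sr = 22050
    t = np.linspace(0, 2.0, 2 * sr, endpoint=False)
    vocal = np.sin(2 * np.pi * 220 * t) * (1 + 0.3 * np.sin(2 * np.pi * 3 * t))
    guitar = 0.4 * np.sin(2 * np.pi * 330 * t)
    drums = (np.random.default_rng(12).random(t.size) - 0.5) \
            * (np.sin(2 * np.pi * 2 * t) > 0.95)
    left, right = vocal + drums + guitar, vocal + drums - guitar  # toy stereo mix

    mid = 0.5 * (left + right)                 # centre channel: vocals, drums, bass
    side = 0.5 * (left - right)                # side-panned accompaniment
    harmonic, percussive = librosa.effects.hpss(mid)

    vocal_gain, drum_gain, side_gain = 1.4, 1.3, 0.6   # user-adjustable preferences
    remix_mid = vocal_gain * harmonic + drum_gain * percussive
    out = np.stack([remix_mid + side_gain * side, remix_mid - side_gain * side])
    print(out.shape)                           # re-emphasized stereo signal
    ```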

  12. ITSG-Grace2016 data preprocessing methodologies revisited: impact of using Level-1A data products

    Science.gov (United States)

    Klinger, Beate; Mayer-Gürr, Torsten

    2017-04-01

    For the ITSG-Grace2016 release, the gravity field recovery is based on the use of official GRACE (Gravity Recovery and Climate Experiment) Level-1B data products, generated by the Jet Propulsion Laboratory (JPL). Before gravity field recovery, the Level-1B instrument data are preprocessed. This data preprocessing step includes the combination of Level-1B star camera (SCA1B) and angular acceleration (ACC1B) data for an improved attitude determination (sensor fusion), instrument data screening and ACC1B data calibration. Based on a Level-1A test dataset, provided for individual months throughout the GRACE period by the Center for Space Research at the University of Texas at Austin (UTCSR), the impact of using Level-1A instead of Level-1B data products within the ITSG-Grace2016 processing chain is analyzed. We discuss (1) the attitude determination through an optimal combination of SCA1A and ACC1A data using our sensor fusion approach, (2) the impact of the new attitude product on temporal gravity field solutions, and (3) possible benefits of using Level-1A data for instrument data screening and calibration. As the GRACE mission is currently reaching its end of life, the presented work aims not only at a better understanding of GRACE science data to reduce the impact of possible error sources on the gravity field recovery, but also at preparing Level-1A data handling capabilities for the GRACE Follow-On mission.

  13. Applying Enhancement Filters in the Pre-processing of Images of Lymphoma

    International Nuclear Information System (INIS)

    Silva, Sérgio Henrique; Do Nascimento, Marcelo Zanchetta; Neves, Leandro Alves; Batista, Valério Ramos

    2015-01-01

    Lymphoma is a type of cancer that affects the immune system, and is classified as Hodgkin or non-Hodgkin. It is one of the ten most common types of cancer in the world, accounting for three to four percent of all malignant neoplasms diagnosed worldwide. Our work presents a study of filters devoted to enhancing images of lymphoma at the pre-processing step, where enhancement is useful for removing noise from the digital images. We analysed the noise caused by different sources, such as room vibration, scraps and defocusing, in the following classes of lymphoma: follicular, mantle cell and B-cell chronic lymphocytic leukemia. The Gaussian, Median and Mean-Shift filters were applied in different colour models (RGB, Lab and HSV). Afterwards, we performed a quantitative analysis of the images by means of the Structural Similarity Index, in order to evaluate the similarity between the images. In all cases we obtained a certainty of at least 75%, which rises to 99% if one considers only HSV. We therefore conclude that HSV is an important choice of colour model for pre-processing histological images of lymphoma, because in this case the resulting image achieves the best enhancement.
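
    The comparison can be sketched as follows: apply candidate filters in a chosen colour model and score each result against the original with the Structural Similarity Index. Mean-shift is omitted here, and the sigma/window settings, sample image, and a recent scikit-image API (channel_axis) are assumptions.

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage import color, data, filters
    from skimage.metrics import structural_similarity

    rgb = data.astronaut() / 255.0                # stand-in for a histology image
    hsv = color.rgb2hsv(rgb)

    gauss = filters.gaussian(hsv, sigma=1.0, channel_axis=-1)
    medi = np.stack([ndimage.median_filter(hsv[..., c], size=3) for c in range(3)],
                    axis=-1)

    for name, filtered in (("gaussian", gauss), ("median", medi)):
        score = structural_similarity(hsv, filtered, channel_axis=-1, data_range=1.0)
        print(name, round(score, 3))              # similarity to the original
    ```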

  14. Change detection using landsat time series: A review of frequencies, preprocessing, algorithms, and applications

    Science.gov (United States)

    Zhu, Zhe

    2017-08-01

    The free and open access to all archived Landsat images in 2008 has completely changed the way Landsat data are used. Many novel change detection algorithms based on Landsat time series have been developed. We present a comprehensive review of four important aspects of change detection studies based on Landsat time series: frequencies, preprocessing, algorithms, and applications. We observe the trend that the more recent the study, the higher the frequency of the Landsat time series used. We review a series of image preprocessing steps, including atmospheric correction, cloud and cloud shadow detection, and composite/fusion/metrics techniques. We divide all change detection algorithms into six categories: thresholding, differencing, segmentation, trajectory classification, statistical boundary, and regression. Within each category, six major characteristics of the different algorithms, such as frequency, change index, univariate/multivariate, online/offline, abrupt/gradual change, and sub-pixel/pixel/spatial, are analyzed. Moreover, some of the widely used change detection algorithms are discussed. Finally, we review different change detection applications by dividing them into two categories: change target and change agent detection.
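
    A toy example of the "differencing" category reviewed above: difference two co-registered, atmospherically corrected band images and flag pixels beyond k standard deviations; the simulated reflectances and the value of k are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    t1 = rng.normal(0.25, 0.05, size=(100, 100))   # NIR reflectance, date 1
    t2 = t1.copy()
    t2[40:60, 40:60] -= 0.15                       # simulated disturbance patch
    t2 += rng.normal(0.0, 0.01, size=t2.shape)     # sensor/atmosphere noise

    diff = t2 - t1
    k = 3.0
    change_mask = np.abs(diff - diff.mean()) > k * diff.std()
    print("changed pixels:", int(change_mask.sum()))
    ```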

  15. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    Science.gov (United States)

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  16. Impact of functional MRI data preprocessing pipeline on default-mode network detectability in patients with disorders of consciousness

    Directory of Open Access Journals (Sweden)

    Adrian eAndronache

    2013-08-01

    Full Text Available An emerging application of resting-state functional MRI is the study of patients with disorders of consciousness (DoC), where the integrity of default-mode network (DMN) activity is associated with the clinical level of preservation of consciousness. Due to the inherent inability to follow verbal instructions, arousal induced by scanning noise, and postural pain, these patients tend to exhibit substantial levels of movement. This results in spurious, non-neural fluctuations of the blood-oxygen level-dependent (BOLD) signal, which impair the evaluation of residual functional connectivity. Here, the effect of data preprocessing choices on the detectability of the DMN was systematically evaluated in a representative cohort of 30 clinically and etiologically heterogeneous DoC patients and 33 healthy controls. Starting from a standard preprocessing pipeline, additional steps were gradually inserted, namely band-pass filtering, removal of co-variance with the movement vectors, removal of co-variance with the global brain parenchyma signal, rejection of realignment outlier volumes and ventricle masking. Both independent-component analysis (ICA) and seed-based analysis (SBA) were performed, and DMN detectability was assessed quantitatively as well as visually. The results of the present study strongly show that the detection of DMN activity in the sub-optimal fMRI series acquired in DoC patients is contingent on the use of adequate filtering steps. ICA and SBA are differently affected but give convergent findings for high-grade preprocessing. We propose that future studies in this area should adopt the described preprocessing procedures as a minimum standard to reduce the probability of wrongly inferring that DMN activity is absent.
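
    Two of the inserted steps can be sketched on a single voxel/ROI time series: regression of the movement vectors followed by band-pass filtering. The TR, band edges, and synthetic data below are illustrative assumptions.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    rng = np.random.default_rng(4)
    n_vols, tr = 240, 2.0                          # 240 volumes, TR = 2 s
    motion = rng.normal(size=(n_vols, 6))          # 6 realignment parameters
    signal = rng.normal(size=n_vols) + motion @ rng.normal(size=6)

    # Step 1: remove co-variance with the movement vectors (OLS residuals).
    design = np.column_stack([np.ones(n_vols), motion])
    beta, *_ = np.linalg.lstsq(design, signal, rcond=None)
    residual = signal - design @ beta

    # Step 2: band-pass to the resting-state band (0.01-0.1 Hz here).
    nyquist = 0.5 / tr
    b, a = butter(2, [0.01 / nyquist, 0.1 / nyquist], btype="bandpass")
    cleaned = filtfilt(b, a, residual)
    print(cleaned.shape)
    ```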

  17. Unmixing-Based Denoising as a Pre-Processing Step for Coral Reef Analysis

    Science.gov (United States)

    Cerra, D.; Traganos, D.; Gege, P.; Reinartz, P.

    2017-05-01

    Coral reefs, among the world's most biodiverse and productive submerged habitats, have faced several mass bleaching events due to climate change during the past 35 years. In the course of this century, global warming and ocean acidification are expected to cause corals to become increasingly rare on reef systems. This will result in a sharp decrease in the biodiversity of reef communities and carbonate reef structures. Coral reefs may be mapped, characterized and monitored through remote sensing. Hyperspectral images in particular excel at coral monitoring, as they are characterized by very rich spectral information, which results in a strong discrimination power to characterize a target of interest and to separate healthy corals from bleached ones. Being submerged habitats, coral reef systems are difficult to analyse in airborne or satellite images, as the relevant information is conveyed in bands in the blue range, which exhibit a lower signal-to-noise ratio (SNR) than other spectral ranges; furthermore, water absorbs most of the incident solar radiation, further decreasing the SNR. Derivative features, which are important in coral analysis, are greatly affected by the noise present in the relevant spectral bands, justifying the need for new denoising techniques able to preserve local spatial and spectral features. In this paper, Unmixing-based Denoising (UBD) is used to enable analysis of a hyperspectral image acquired over a coral reef system in the Red Sea based on derivative features. UBD reconstructs the dataset pixelwise, with reduced noise effects, by forcing each spectrum to be a linear combination of other reference spectra, exploiting the high dimensionality of hyperspectral datasets. Results show clear enhancements with respect to traditional denoising methods based on spatial and spectral smoothing, facilitating the coral detection task.
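
    A hedged sketch of the core idea: each noisy pixel spectrum is replaced by its best combination of reference spectra. Here a nonnegative least-squares projection stands in for the paper's reconstruction, and the random endmember matrix is purely for shape.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(5)
    n_bands, n_refs = 80, 6
    references = rng.random((n_bands, n_refs))       # columns: reference spectra

    def ubd_denoise(pixel_spectrum, refs):
        """Project one spectrum onto the cone spanned by reference spectra."""
        weights, _ = nnls(refs, pixel_spectrum)
        return refs @ weights

    clean = references @ np.array([0.5, 0.3, 0.0, 0.2, 0.0, 0.0])
    noisy = clean + rng.normal(0.0, 0.02, size=n_bands)
    denoised = ubd_denoise(noisy, references)
    print(float(np.linalg.norm(noisy - clean)),      # error before ...
          float(np.linalg.norm(denoised - clean)))   # ... and after denoising
    ```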

  18. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    International Nuclear Information System (INIS)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-01-01

    In an electronic tongue, preprocessing on raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  19. Simultaneous data pre-processing and SVM classification model selection based on a parallel genetic algorithm applied to spectroscopic data of olive oils.

    Science.gov (United States)

    Devos, Olivier; Downey, Gerard; Duponchel, Ludovic

    2014-04-01

    Classification is an important task in chemometrics. For several years now, support vector machines (SVMs) have proven to be powerful for infrared spectral data classification. However, such methods require the optimisation of parameters in order to control the risk of overfitting and the complexity of the decision boundary. Furthermore, it is established that the prediction ability of classification models can be improved using pre-processing in order to remove unwanted variance in the spectra. In this paper we propose a new methodology based on a genetic algorithm (GA) for the simultaneous optimisation of SVM parameters and pre-processing (GENOPT-SVM). The method has been tested for the discrimination of the geographical origin of Italian olive oil (Ligurian and non-Ligurian) on the basis of near infrared (NIR) or mid infrared (FTIR) spectra. Different classification models (PLS-DA, SVM with mean-centred data, GENOPT-SVM) have been tested and statistically compared using McNemar's statistical test. For the two datasets, SVM with optimised pre-processing gives models with higher accuracy than the ones obtained with PLS-DA on pre-processed data. In the case of the NIR dataset, most of this accuracy improvement (86.3% compared with 82.8% for PLS-DA) occurred using only a single pre-processing step. For the FTIR dataset, three optimised pre-processing steps are required to obtain an SVM model with a significant accuracy improvement (82.2%) compared to the one obtained with PLS-DA (78.6%). Furthermore, this study demonstrates that even SVM models have to be developed on the basis of well-corrected spectral data in order to obtain higher classification rates. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. A data preprocessing strategy for metabolomics to reduce the mask effect in data analysis

    Directory of Open Access Journals (Sweden)

    Jun eYang

    2015-02-01

    Full Text Available Metabolomics is a booming research field. Its success highly relies on the discovery of differential metabolites by comparing different data sets (for example, patients vs. controls). One of the challenges is that differences in the low-abundance metabolites between groups are often masked by the high variation of abundant metabolites. In order to address this challenge, a novel data preprocessing strategy consisting of three steps is proposed in this study. In step 1, a 'modified 80%' rule is used to reduce the effect of missing values; in step 2, unit-variance and Pareto scaling methods are used to reduce the mask effect from the abundant metabolites; in step 3, in order to correct the adverse effect of scaling, stability information of the variables, deduced from intensity information and the class information, is used to assign suitable weights to the variables. When applied to an LC/MS-based metabolomics dataset from a chronic hepatitis B patient study and two simulated datasets, the mask effect was found to be partially eliminated and several new low-abundance differential metabolites were rescued.
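
    Steps 1-2 can be sketched directly: the "modified 80%" rule keeps a feature if it is measured in at least 80% of the samples of some group, and Pareto scaling (centre, then divide by the square root of the standard deviation) damps the dominance of abundant metabolites. The synthetic data and missing-value rate are assumptions.

    ```python
    import numpy as np

    def modified_80_rule(X, groups, threshold=0.8):
        """X: samples x features with NaN for missing; keep features measured
        in >= threshold fraction of samples within at least one group."""
        keep = np.zeros(X.shape[1], dtype=bool)
        for g in np.unique(groups):
            frac = np.mean(~np.isnan(X[groups == g]), axis=0)
            keep |= frac >= threshold
        return X[:, keep]

    def pareto_scale(X):
        centred = X - np.nanmean(X, axis=0)
        return centred / np.sqrt(np.nanstd(X, axis=0))

    rng = np.random.default_rng(6)
    X = rng.lognormal(size=(20, 50))
    X[rng.random(X.shape) < 0.3] = np.nan          # 30% missing values
    groups = np.repeat(["patient", "control"], 10)
    print(pareto_scale(modified_80_rule(X, groups)).shape)
    ```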

  1. Parallel pipeline algorithm of real time star map preprocessing

    Science.gov (United States)

    Wang, Hai-yong; Qin, Tian-mu; Liu, Jia-qi; Li, Zhi-feng; Li, Jian-hua

    2016-03-01

    To improve the preprocessing speed of star maps and reduce the resource consumption of the embedded system of a star tracker, a parallel pipeline real-time preprocessing algorithm is presented. Two characteristics, the mean and the noise standard deviation of the background grey level of a star map, are first obtained dynamically, by removing in advance the interference of the star image itself with the background. A criterion for whether the subsequent noise filtering is needed is established, and the extraction threshold value is assigned according to the level of background noise, so that the centroiding accuracy is guaranteed. In the processing algorithm, as few as two lines of pixel data are buffered, and only 100 shift registers are used to record the connected-domain labels, by which the problems of resource waste and connected-domain overflow are solved. The simulation results show that the necessary data of the selected bright stars can be accessed in a delay time as short as 10 μs after the pipeline processing of a 496×496 star map at 50 Mb/s is finished, and the needed memory and register resources total less than 80 kb. To verify the accuracy of the proposed algorithm, different levels of background noise were added to the processed ideal star map; the statistical centroiding error is smaller than 1/23 pixel under the condition that the signal-to-noise ratio is greater than 1. The parallel pipeline algorithm of real-time star map preprocessing helps to increase the data output speed and the anti-dynamic performance of star trackers.
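
    A simplified, non-pipelined sketch of the extraction logic: estimate background statistics robustly, threshold at mean plus k sigma, and centroid each bright connected region. The value of k and the synthetic frame are assumptions, and the hardware-friendly two-line buffering is not reproduced here.

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(7)
    frame = rng.normal(100.0, 5.0, size=(496, 496))     # background + noise
    frame[200:203, 300:303] += 400.0                    # one synthetic star

    background_mean = np.median(frame)                  # robust to the star itself
    background_std = np.median(np.abs(frame - background_mean)) * 1.4826

    threshold = background_mean + 5.0 * background_std
    labels, n_stars = ndimage.label(frame > threshold)
    centroids = ndimage.center_of_mass(frame - background_mean, labels,
                                       list(range(1, n_stars + 1)))
    print(n_stars, centroids)
    ```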

  2. Contour extraction of echocardiographic images based on pre-processing

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, Zinah Rajab; Rahmat, Rahmita Wirza; Abdullah, Lili Nurliyana [Department of Multimedia, Faculty of Computer Science and Information Technology, Department of Computer and Communication Systems Engineering, Faculty of Engineering University Putra Malaysia 43400 Serdang, Selangor (Malaysia); Zamrin, D M [Department of Surgery, Faculty of Medicine, National University of Malaysia, 56000 Cheras, Kuala Lumpur (Malaysia); Saripan, M Iqbal

    2011-02-15

    In this work we present a technique to extract the heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contour detection, to reduce heavy noise and obtain better image quality. To this end, we combine several pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance the low contrast of echocardiograph images. After applying these techniques we can obtain reliable detection of heart boundaries and valve movement with traditional edge detection methods.
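
    A sketch of such a pre-processing chain with SciPy and scikit-image before classical edge detection; the filter sizes, structuring elements, and synthetic frame are assumptions, not the authors' settings.

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage import exposure, feature, morphology

    rng = np.random.default_rng(8)
    echo = ndimage.gaussian_filter(rng.random((256, 256)), 4)  # toy noisy frame

    smooth = ndimage.median_filter(echo, size=5)               # speckle reduction
    opened = morphology.opening(smooth, morphology.disk(3))    # drop bright specks
    closed = morphology.closing(opened, morphology.disk(3))    # fill dark gaps
    adjusted = exposure.equalize_adapthist(closed)             # boost low contrast

    edges = feature.canny(adjusted, sigma=2.0)                 # boundary edges
    print(int(edges.sum()), "edge pixels")
    ```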

  4. Parallel preprocessing in a nuclear data acquisition system

    International Nuclear Information System (INIS)

    Pichot, G.; Auriol, E.; Lemarchand, G.; Millaud, J.

    1977-01-01

    The appearance of microprocessors and large memory chips has somewhat modified the spectrum of tools usable by the data acquisition system designer. This is particularly true in the nuclear research field, where the data flow has been continuously growing as a consequence of the increasing capabilities of new detectors. This paper deals with the insertion, between a data acquisition system and a computer, of a preprocessing structure based on microprocessors and large-capacity high-speed memories. The results show a significant improvement in several aspects of the operation of the system, with returns paying back the investment in 18 months.

  5. Private Sector Savings

    Directory of Open Access Journals (Sweden)

    Pitonáková Renáta

    2018-03-01

    Full Text Available The majority of household savings are held in the form of bank deposits. It is therefore of interest for credit institutions to tailor their deposit policy to attract finances from non-banking entities and to provide the private sector with the loans that are necessary for investment activities and consumption. This paper deals with the determinants of the saving rate of the private sector of Slovakia. Economic, financial and demographic variables influence savings. Growth of income per capita, private disposable income, the elderly dependency ratio, the real interest rate and inflation have a positive impact on savings, while increases in public savings indicate a crowding-out effect. The effect of the inflation rate implies precautionary saving, and that of the dependency ratio saving for bequest. There are also implications for governing institutions deciding on the implementation of appropriate fiscal and monetary operations.

  6. A base composition analysis of natural patterns for the preprocessing of metagenome sequences.

    Science.gov (United States)

    Bonham-Carter, Oliver; Ali, Hesham; Bastola, Dhundy

    2013-01-01

    On the premise that sequence reads and contigs often exhibit the same kinds of base usage that is also observed in the sequences from which they are derived, we offer a base composition analysis tool. Our tool uses these natural patterns to determine relatedness across sequence data. We introduce spectrum sets (sets of motifs), which are permutations of bacterial restriction sites, and a base composition analysis framework to measure their proportional content in sequence data. We suggest that this framework will increase efficiency during the pre-processing stages of metagenome sequencing and assembly projects. Our method is able to differentiate organisms and their reads or contigs. The framework shows how to successfully determine the relatedness between these reads or contigs by comparison of base composition. In particular, we show that two types of organismal sequence data are fundamentally different by analyzing their spectrum set motif proportions (coverage). By applying one of the four possible spectrum sets, encompassing all known restriction sites, we provide evidence that each set has a different ability to differentiate sequence data. Furthermore, we show that selecting a spectrum set that is relevant to one organism, but not to the others in the data set, greatly improves the performance of sequence differentiation, even if the fragment size of the read, contig or sequence is not lengthy. We show proof of concept of our method by applying it to ten trials of two or three freshly selected sequence fragments (reads and contigs) for each experiment across the six organisms of our set. Here we describe a novel and computationally effective pre-processing step for metagenome sequencing and assembly tasks. Furthermore, our base composition method has applications in phylogeny, where it can be used to infer evolutionary distances between organisms based on the notion that related organisms often have much conserved code.
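
    A toy illustration of spectrum-set coverage: build the permutations of a restriction site and count the fraction of sequence windows matching any motif. The EcoRI site and the short fragment are examples, not the paper's actual spectrum sets.

    ```python
    from itertools import permutations

    def spectrum_set(site):
        """All distinct permutations of a restriction site's bases."""
        return {"".join(p) for p in permutations(site)}

    def coverage(sequence, motifs):
        """Fraction of windows starting an occurrence of any motif."""
        k = len(next(iter(motifs)))
        windows = len(sequence) - k + 1
        hits = sum(sequence[i:i + k] in motifs for i in range(windows))
        return hits / windows

    ecori = spectrum_set("GAATTC")                  # EcoRI site permutations
    genome_fragment = "GAATTCGGATTACGAATCTTCAGGAATTC"
    print(round(coverage(genome_fragment, ecori), 4))
    ```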

  7. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer identify the processing field at the top of the sequence and send to the computing module only the data related to the requested result; the remaining data are not relevant and would slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  8. Preprocessing in a Tiered Sensor Network for Habitat Monitoring

    Directory of Open Access Journals (Sweden)

    Hanbiao Wang

    2003-03-01

    Full Text Available We investigate task decomposition and collaboration in a two-tiered sensor network for habitat monitoring. The system recognizes and localizes a specified type of birdcall. The system has a few powerful macronodes in the first tier, and many less powerful micronodes in the second tier. Each macronode combines data collected by multiple micronodes for target classification and localization. We describe two types of lightweight preprocessing which significantly reduce data transmission from micronodes to macronodes. Micronodes classify events according to their zero-crossing rates and discard irrelevant events; data about events of interest are reduced and compressed before being transmitted to macronodes for target localization. Preliminary experiments illustrate the effectiveness of event filtering and data reduction at the micronodes.
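
    The micronode-side filter can be sketched as a zero-crossing-rate gate: events whose rate falls outside a band typical of the target birdcall are discarded before transmission. The band limits and the toy signals are assumptions for illustration.

    ```python
    import numpy as np

    def zero_crossing_rate(signal):
        """Fraction of consecutive sample pairs that change sign."""
        signs = np.sign(signal)
        return np.mean(signs[1:] != signs[:-1])

    def is_candidate_birdcall(signal, zcr_band=(0.05, 0.25)):
        zcr = zero_crossing_rate(signal)
        return zcr_band[0] <= zcr <= zcr_band[1]

    rng = np.random.default_rng(9)
    t = np.linspace(0, 1, 8000)
    chirp = np.sin(2 * np.pi * 600 * t ** 2)        # toy "birdcall"
    noise = rng.normal(size=t.size)                 # wind/noise event
    print(is_candidate_birdcall(chirp), is_candidate_birdcall(noise))
    ```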

  9. Piecewise Polynomial Aggregation as Preprocessing for Data Numerical Modeling

    Science.gov (United States)

    Dobronets, B. S.; Popova, O. A.

    2018-05-01

    The present study reviews data aggregation issues for numerical modeling and discusses aggregation procedures as preprocessing for subsequent numerical modeling. To compute the aggregation, the authors propose using numerical probabilistic analysis (NPA). An important feature of the study is how the aggregated data are represented: the proposed approach can be interpreted as the frequency distribution of a variable, whose properties are studied through its density function. For this purpose the authors propose piecewise polynomial models, a suitable example of which is the spline. They show that this approach to data aggregation reduces the level of data uncertainty and significantly increases the efficiency of numerical calculations. To demonstrate how well the proposed methods correspond to reality, the authors developed a theoretical framework and considered numerical examples devoted to time series aggregation.
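
    A minimal sketch of the general idea, not the authors' NPA machinery: raw data are aggregated into a frequency distribution, and the density is then represented by a piecewise polynomial (spline) model that downstream numerics can use in place of the raw observations.

        import numpy as np
        from scipy.interpolate import CubicSpline

        rng = np.random.default_rng(1)
        data = rng.normal(0.0, 1.0, size=100_000)       # raw observations

        # Aggregate into a normalized histogram (frequency distribution) ...
        counts, edges = np.histogram(data, bins=40, density=True)
        centers = 0.5 * (edges[:-1] + edges[1:])

        # ... and model the density as a piecewise cubic polynomial; the compact
        # spline now stands in for the 100k raw points.
        density = CubicSpline(centers, counts)
        print(float(density(0.0)))                      # ~0.4 for a standard normal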

  10. Pre-processing of input files for the AZTRAN code

    International Nuclear Information System (INIS)

    Vargas E, S.; Ibarra, G.

    2017-09-01

    The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Because generating an input file for the code is complex, a script based on the D language was developed to make its preparation easier. It relies on a new input file format with specific cards divided into two blocks, mandatory cards and optional cards, and includes a pre-processing of the input file to identify possible errors within it, as well as an image generator for the specific problem based on the Python interpreter. (Author)

  11. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins, which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, the focus is on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels, and methods for data preprocessing are covered.

  12. Digital soil mapping: strategy for data pre-processing

    Directory of Open Access Journals (Sweden)

    Alexandre ten Caten

    2012-08-01

    Full Text Available The region of greatest variability on soil maps is along the edges of their polygons, causing disagreement among pedologists about the appropriate description of soil classes at these locations. The objective of this work was to propose a strategy for data pre-processing applied to digital soil mapping (DSM). Soil polygons on a training map were shrunk by 100 and 160 m, which prevented covariates located near the edges of the soil classes from being used by the Decision Tree (DT) models. Three DT models, derived from eight predictive covariates related to relief and organism factors sampled on the original polygons of a soil map and on polygons shrunk by 100 and 160 m, were used to predict soil classes. The DT model derived from observations 160 m away from the edges of the polygons on the original map is less complex and has better predictive performance.
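
    A hedged sketch of the shrinking strategy using shapely (an assumption; the authors' GIS toolchain is not named in the abstract): polygons are buffered inward so that training covariates near class edges are excluded.

        from shapely.geometry import Polygon

        # Hypothetical 1 km x 1 km soil polygon, coordinates in meters.
        soil_polygon = Polygon([(0, 0), (1000, 0), (1000, 1000), (0, 1000)])
        inner_100 = soil_polygon.buffer(-100)   # edges shrunk by 100 m
        inner_160 = soil_polygon.buffer(-160)   # edges shrunk by 160 m
        print(soil_polygon.area, inner_100.area, inner_160.area)
        # 1000000.0 640000.0 462400.0: only interior samples train the DT models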

  13. Energy saving certificates

    International Nuclear Information System (INIS)

    2005-11-01

    The French ministry of economy, finances and industry and the French agency for environment and energy management (Ademe) organized, on November 8, 2005, a colloquium presenting energy saving certificates, a new tool that obliges energy suppliers to encourage their clients to make energy savings. This document gathers the transparencies presented at this colloquium on the following topics: state of the art and presentation of the energy saving certificates system (presentation of the EEC system, presentation of the EEC standard operations); the energy saving certificates in Europe today (energy efficiency commitment in the UK, the Italian white certificate scheme, perspectives of the different European systems). (J.S.)

  14. Spending to save

    DEFF Research Database (Denmark)

    Larsen, Anders

    2013-01-01

    the energy distribution companies meet their overall saving obligation, the net savings impact is about a third of the savings reported by the obligated parties. Further, it was found that while energy savings in the public and business sectors have a high net impact, some subsidies given under the EEO...... perspective. The evaluation has resulted in noticeable adjustments to the design of the Danish EEO, e.g. the introduction of a 1-year payback-time limit for projects receiving subsidies, a minimum baseline for insulation products, and specification of documentation requirements....

  15. A new approach to pre-processing digital image for wavelet-based watermark

    Science.gov (United States)

    Agreste, Santa; Andaloro, Guido

    2008-11-01

    The growth of the Internet has increased the phenomenon of digital piracy of multimedia objects such as software, images, video, audio and text. It is therefore strategic to develop methods and numerical algorithms, stable and of low computational cost, that can address these problems. We describe a digital watermarking algorithm for color image protection and authenticity: robust, non-blind, and wavelet-based. The use of the Discrete Wavelet Transform is motivated by its good time-frequency features and good match with Human Visual System directives; these two combined elements are important for building an invisible and robust watermark. Moreover, our algorithm can work with any image, thanks to a pre-processing step that resizes the original image as required by the wavelet transform. The watermark signal is calculated in correlation with the image features and statistical properties. In the detection step we apply a re-synchronization between the original and watermarked image according to the Neyman-Pearson statistical criterion. Experimentation on a large set of different images has shown the method to be resistant against geometric, filtering, and StirMark attacks with a low false-alarm rate.
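
    One hedged reading of the resize step (the paper's exact scheme is not described in the abstract): make both image dimensions divisible by 2**levels before a multi-level wavelet decomposition.

        import numpy as np

        def pad_for_dwt(img, levels=3):
            """Edge-pad a 2D image so both sides are multiples of 2**levels."""
            m = 2 ** levels
            pad_h = (-img.shape[0]) % m
            pad_w = (-img.shape[1]) % m
            return np.pad(img, ((0, pad_h), (0, pad_w)), mode="edge")

        img = np.zeros((250, 301))
        print(pad_for_dwt(img, levels=3).shape)   # (256, 304)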

  16. Energy Savings Lifetimes and Persistence

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, Ian M. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schiller, Steven R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Todd, Annika [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Billingsley, Megan A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Goldman, Charles A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schwartz, Lisa C. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-02-01

    This technical brief explains the concepts of energy savings lifetimes and savings persistence and discusses how program administrators use these factors to calculate savings for efficiency measures, programs and portfolios. Savings lifetime is the length of time that one or more energy efficiency measures or activities save energy, and savings persistence is the change in savings throughout the functional life of a given efficiency measure or activity. Savings lifetimes are essential for assessing the lifecycle benefits and cost effectiveness of efficiency activities and for forecasting loads in resource planning. The brief also provides estimates of savings lifetimes derived from a national collection of costs and savings for electric efficiency programs and portfolios.

  17. Contract saving schemes

    NARCIS (Netherlands)

    Ronald, R.; Smith, S.J.; Elsinga, M.; Eng, O.S.; Fox O'Mahony, L.; Wachter, S.

    2012-01-01

    Contractual saving schemes for housing are institutionalised savings programmes normally linked to rights to loans for home purchase. They are of diverse types, having developed differently in each national context, but normally fall into the categories of open, closed, compulsory, and ‘free

  18. Measuring industrial energy savings

    International Nuclear Information System (INIS)

    Kelly Kissock, J.; Eger, Carl

    2008-01-01

    Accurate measurement of energy savings from industrial energy efficiency projects can reduce uncertainty about the efficacy of the projects, guide the selection of future projects, improve future estimates of expected savings, promote financing of energy efficiency projects through shared-savings agreements, and improve utilization of capital resources. Many efforts to measure industrial energy savings, or simply track progress toward efficiency goals, have had difficulty incorporating changing weather and production, which are frequently major drivers of plant energy use. This paper presents a general method for measuring plant-wide industrial energy savings that takes into account changing weather and production between the pre- and post-retrofit periods. In addition, the method can disaggregate savings into components, which provides additional resolution for understanding the effectiveness of individual projects when several projects are implemented together. The method uses multivariable piecewise regression models to characterize baseline energy use, and disaggregates savings by taking the total derivative of the energy use equation. Although the method incorporates search techniques, multivariable least-squares regression and calculus, it is easily implemented using data analysis software and can use readily available temperature, production and utility billing data. This is important, since more complicated methods may be too complex for widespread use. The method is demonstrated using case studies of actual energy assessments. The case studies demonstrate the importance of adjusting for weather and production between the pre- and post-retrofit periods, how plant-wide savings can be disaggregated to evaluate the effectiveness of individual retrofits, how the method can identify the time-dependence of savings, and the limitations of engineering models when used to estimate future savings.
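
    A simplified sketch of the weather/production adjustment: a change-point (piecewise) baseline is fitted to pre-retrofit data, and savings are the baseline predictions minus measured post-retrofit use. The model form, the change-point grid and the data below are illustrative, not the paper's.

        import numpy as np

        def features(temp, prod, t_break):
            # E = b0 + b1*max(T - t_break, 0) + b2*production: a simple
            # three-parameter cooling change-point model plus production.
            return np.column_stack([np.ones_like(temp),
                                    np.maximum(temp - t_break, 0.0), prod])

        rng = np.random.default_rng(2)
        temp = rng.uniform(0, 35, 365)
        prod = rng.uniform(50, 100, 365)
        energy_pre = (120 + 4.0 * np.maximum(temp - 18, 0) + 1.5 * prod
                      + rng.normal(0, 5, 365))

        # Grid-search the change point; solve the linear part by least squares.
        best_sse, best = np.inf, None
        for t in np.arange(10, 26, 0.5):
            X = features(temp, prod, t)
            coef, *_ = np.linalg.lstsq(X, energy_pre, rcond=None)
            sse = float(np.sum((X @ coef - energy_pre) ** 2))
            if sse < best_sse:
                best_sse, best = sse, (t, coef)
        t_break, coef = best

        # Savings = baseline prediction under post-period conditions - measured
        # use (the retrofit is simulated as a flat 20-unit/day reduction).
        energy_post = energy_pre - 20
        baseline = features(temp, prod, t_break) @ coef
        print(round(float(np.mean(baseline - energy_post)), 1))   # ~20.0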

  19. Ensemble preprocessing of near-infrared (NIR) spectra for multivariate calibration

    International Nuclear Information System (INIS)

    Xu Lu; Zhou Yanping; Tang Lijuan; Wu Hailong; Jiang Jianhui; Shen Guoli; Yu Ruqin

    2008-01-01

    Preprocessing of raw near-infrared (NIR) spectral data is indispensable in multivariate calibration when the measured spectra are subject to significant noise, baselines and other undesirable factors. However, due to the lack of sufficient prior information and incomplete knowledge of the raw data, NIR spectra preprocessing in multivariate calibration is still trial and error. How to select a proper method depends largely on both the nature of the data and the expertise and experience of the practitioners. This might limit the application of multivariate calibration in many fields where researchers are not very familiar with the characteristics of the many preprocessing methods unique to chemometrics and have difficulty selecting the most suitable ones. Another problem is that many preprocessing methods, when used alone, might degrade the data in certain respects or lose some useful information while improving certain qualities of the data. To tackle these problems, this paper proposes a new concept of data preprocessing, the ensemble preprocessing method, in which partial least squares (PLS) models built on differently preprocessed data are combined by Monte Carlo cross-validation (MCCV) stacked regression. Little or no prior information about the data and little expertise are required. Moreover, the fusion of complementary information obtained by different preprocessing methods often leads to a more stable and accurate calibration model. The investigation of two real data sets has demonstrated the advantages of the proposed method.
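
    A simplified sketch of the ensemble idea (plain k-fold stacking with non-negative weights stands in for the paper's MCCV stacked regression, and the data are synthetic): PLS models trained on differently preprocessed copies of the spectra are combined through their out-of-sample predictions.

        import numpy as np
        from scipy.optimize import nnls
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import cross_val_predict

        def snv(X):               # standard normal variate, per spectrum
            return (X - X.mean(1, keepdims=True)) / X.std(1, keepdims=True)

        def first_derivative(X):  # finite-difference derivative
            return np.diff(X, axis=1)

        rng = np.random.default_rng(3)
        X = rng.normal(size=(60, 200)).cumsum(axis=1)   # smooth synthetic "spectra"
        y = X[:, 50] - 0.5 * X[:, 150] + rng.normal(0, 0.1, 60)

        variants = [X, snv(X), first_derivative(X)]     # three preprocessings

        # Out-of-sample predictions per variant, then stacking weights by
        # non-negative least squares.
        Z = np.column_stack([cross_val_predict(PLSRegression(5), V, y, cv=5).ravel()
                             for V in variants])
        weights, _ = nnls(Z, y)
        print(np.round(weights / weights.sum(), 2))     # contribution of each variant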

  20. Software for Preprocessing Data from Rocket-Engine Tests

    Science.gov (United States)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
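
    A hedged sketch of the conversion step EUGEN performs; the channel names and the linear (offset, gain) calibration form are assumptions for illustration, not the SSC file format.

        import numpy as np

        # Hypothetical per-channel calibration coefficients: (offset, gain).
        calibration = {"chamber_pressure": (0.0, 250.0),
                       "fuel_flow": (1.2, 18.5)}

        def to_engineering_units(channel, volts):
            offset, gain = calibration[channel]
            return offset + gain * np.asarray(volts, dtype=float)

        raw_volts = np.array([0.1, 2.5, 4.9])   # digitized sensor output
        print(to_engineering_units("chamber_pressure", raw_volts))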

  1. Zseq: An Approach for Preprocessing Next-Generation Sequencing Data.

    Science.gov (United States)

    Alkhateeb, Abedalrhman; Rueda, Luis

    2017-08-01

    Next-generation sequencing technology generates a huge number of reads (short sequences), which contain a vast amount of genomic data. The sequencing process, however, comes with artifacts. Preprocessing of sequences is mandatory for further downstream analysis. We present Zseq, a linear method that identifies the most informative genomic sequences and reduces the number of biased sequences, sequence duplications, and ambiguous nucleotides. Zseq finds the complexity of the sequences by counting the number of unique k-mers in each sequence as its corresponding score, and also takes into account other factors such as ambiguous nucleotides or a high GC-content percentage in k-mers. Based on a z-score threshold, Zseq sweeps through the sequences again and filters out those with a z-score less than the user-defined threshold. The Zseq algorithm achieves a better mapping rate and significantly reduces the number of ambiguous bases in comparison with other methods. Evaluation of the filtered reads has been conducted by aligning the reads and assembling the transcripts using the reference genome, as well as by de novo assembly. The assembled transcripts show a better discriminative ability to separate cancer and normal samples than another state-of-the-art method. Moreover, de novo assembled transcripts from the reads filtered by Zseq have longer genomic sequences than those from other tested methods. Estimation of the cutoff threshold is also introduced, using labeling rules, with optimistic results.
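
    A minimal sketch of the scoring idea described above (ambiguity and GC handling are reduced to a simple 'N' check; the real Zseq scoring is richer): reads are scored by their number of unique k-mers and filtered on a z-score threshold.

        import numpy as np

        def unique_kmer_score(read, k=8):
            return len({read[i:i + k] for i in range(len(read) - k + 1)
                        if "N" not in read[i:i + k]})   # skip ambiguous k-mers

        def zseq_like_filter(reads, k=8, z_threshold=0.0):
            scores = np.array([unique_kmer_score(r, k) for r in reads], float)
            z = (scores - scores.mean()) / scores.std()
            return [r for r, zi in zip(reads, z) if zi >= z_threshold]

        reads = ["ACGTACGTACGTACGTACGT",   # low complexity (period-4 repeat)
                 "ACGGTCAAGTTCGATCGGAT",   # higher complexity
                 "ACGTNNNNNNNNNNNNACGT"]   # many ambiguous bases
        print(zseq_like_filter(reads))     # keeps only the complex read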

  2. Road Sign Recognition with Fuzzy Adaptive Pre-Processing Models

    Science.gov (United States)

    Lin, Chien-Chuan; Wang, Ming-Shi

    2012-01-01

    A road sign recognition system based on adaptive image pre-processing models using two fuzzy inference schemes has been proposed. The first fuzzy inference scheme checks changes in light illumination and in rich red color within designated checking areas of a frame image. The other checks the vehicle's speed and steering-wheel angle to select an adaptive size and position for the detection area. The AdaBoost classifier was employed to detect road sign candidates in an image, and the support vector machine technique was employed to recognize the content of the candidates. Prohibitory and warning road traffic signs are the processing targets of this research. The detection rate in the detection phase is 97.42%; in the recognition phase, the recognition rate is 93.04%, and the total accuracy rate of the system is 92.47%. For video sequences, the best accuracy rate is 90.54% and the average accuracy rate is 80.17%, with an average computing time of 51.86 milliseconds per frame. The proposed system not only overcomes problems of low illumination and rich red color around road signs but also offers high detection rates and high computing performance. PMID:22778650

  3. Neural Online Filtering Based on Preprocessed Calorimeter Data

    CERN Document Server

    Torres, R C; The ATLAS collaboration; Simas Filho, E F; De Seixas, J M

    2009-01-01

    Among the LHC detectors, ATLAS copes with the high event rate by means of a three-level online triggering system. The first-level trigger output will be ~75 kHz; this level marks the regions where relevant events were found. The second level validates the LVL1 decision by looking only at the approved data using full granularity; at the level-two output, the event rate is reduced to ~2 kHz. Finally, the third level looks at the full event information, and a rate of ~200 Hz of events is expected to be approved and stored on persistent media for further offline analysis. Many interesting events decay into electrons, which have to be identified against the huge background noise (jets). This work proposes a highly efficient LVL2 electron/jet discrimination system based on neural networks fed with preprocessed calorimeter information. The feature extraction part of the proposed system performs a ring-structured description of the data. A set of concentric rings centered at the highest-energy cell is generated ...

  4. Data preprocessing methods for robust Fourier ptychographic microscopy

    Science.gov (United States)

    Zhang, Yan; Pan, An; Lei, Ming; Yao, Baoli

    2017-12-01

    Fourier ptychographic microscopy (FPM) is a recently developed computational imaging technique that achieves gigapixel images with both high resolution and a large field of view. In the current FPM experimental setup, the dark-field images acquired under high-angle illuminations are easily overwhelmed by stray light and background noise due to the low signal-to-noise ratio, thus significantly degrading the achievable resolution of the FPM approach. We provide an overall and systematic data preprocessing scheme to enhance FPM's performance, which involves sampling analysis, treatment of underexposed and overexposed images, background noise suppression, and stray-light elimination. It is demonstrated experimentally, with both a US Air Force (USAF) 1951 resolution target and biological samples, that the benefit of noise removal by these methods far outweighs the drawback of the accompanying signal loss, as part of the lost signal can be compensated by the improved consistency among the captured raw images. In addition, the reported nonparametric scheme can be combined flexibly with existing state-of-the-art algorithms, facilitating a stronger noise-robust capability of the FPM approach in various applications.

  5. Arabic text preprocessing for the natural language processing applications

    International Nuclear Information System (INIS)

    Awajan, A.

    2007-01-01

    A new approach for processing vowelized and unvowelized Arabic texts in order to prepare them for Natural Language Processing (NLP) purposes is described. The developed approach is rule-based and made up of four phases: text tokenization, word light stemming, word morphological analysis and text annotation. The first phase preprocesses the input text in order to isolate the words and represent them in a formal way. The second phase applies a light stemmer in order to extract the stem of each word by eliminating the prefixes and suffixes. The third phase is a rule-based morphological analyzer that determines the root and the morphological pattern for each extracted stem. The last phase produces an annotated text where each word is tagged with its morphological attributes. The preprocessor presented in this paper is capable of dealing with vowelized and unvowelized words, and provides the input words along with the relevant linguistic information needed by different applications. It is designed to be used with different NLP applications such as machine translation, text summarization, text correction, information retrieval and automatic vowelization of Arabic text. (author)

  6. ASAP: an environment for automated preprocessing of sequencing data

    Directory of Open Access Journals (Sweden)

    Torstenson Eric S

    2013-01-01

    Full Text Available Abstract Background Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  7. ASAP: an environment for automated preprocessing of sequencing data.

    Science.gov (United States)

    Torstenson, Eric S; Li, Bingshan; Li, Chun

    2013-01-04

    Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  8. ASAP: an environment for automated preprocessing of sequencing data

    Science.gov (United States)

    2013-01-01

    Background Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP. PMID:23289815

  9. Joint Preprocessor-Based Detectors for One-Way and Two-Way Cooperative Communication Networks

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2014-05-01

    Efficient receiver designs for cooperative communication networks are becoming increasingly important. In previous work, cooperative networks communicated with the use of L relays. As the receiver is constrained, channel shortening and reduced-rank techniques were employed to design the preprocessing matrix that reduces the length of the received vector from L to U. In the first part of the work, a receiver structure is proposed which combines our proposed threshold selection criteria with the joint iterative optimization (JIO) algorithm that is based on the mean square error (MSE). Our receiver assists in determining the optimal U. Furthermore, this receiver provides the freedom to choose U for each frame depending on the tolerable difference allowed for MSE. Our study and simulation results show that by choosing an appropriate threshold, it is possible to gain in terms of complexity savings while having no or minimal effect on the BER performance of the system. Furthermore, the effect of channel estimation on the performance of the cooperative system is investigated. In the second part of the work, a joint preprocessor-based detector for cooperative communication networks is proposed for one-way and two-way relaying. This joint preprocessor-based detector operates on the principles of minimizing the symbol error rate (SER) instead of minimizing MSE. For a realistic assessment, pilot symbols are used to estimate the channel. From our simulations, it can be observed that our proposed detector achieves the same SER performance as that of the maximum likelihood (ML) detector with all participating relays. Additionally, our detector outperforms selection combining (SC), channel shortening (CS) scheme and reduced-rank techniques when using the same U. Finally, our proposed scheme has the lowest computational complexity.

  10. Invisible costs, visible savings.

    Science.gov (United States)

    Lefever, G

    1999-08-01

    By identifying hidden inventory costs, nurse managers can save money for the organization. Some measures include tracking and standardizing supplies, accurately evaluating patients' needs, and making informed purchasing decisions.

  11. Realized Cost Savings 2016

    Data.gov (United States)

    Department of Veterans Affairs — This dataset is provided as a requirement of OMB’s Integrated Data Collection (IDC) and links to VA’s Realized Cost Savings and Avoidances data in JSON format. Cost...

  12. Voice preprocessing system incorporating a real-time spectrum analyzer with programmable switched-capacitor filters

    Science.gov (United States)

    Knapp, G.

    1984-01-01

    As part of a speaker verification program for BISS (Base Installation Security System), a test system is being designed with a flexible preprocessing system for the evaluation of problems related to the voice spectrum and verification algorithms. The main part of this report covers the design, construction, and testing of a voice analyzer with 16 integrating real-time frequency channels ranging from 300 Hz to 3 kHz. The bandpass filter response of each channel is programmable by NMOS switched-capacitor quad filter arrays. Presently, the accuracy of these units is limited to moderate precision by the finite steps of programming; however, the repeatability of characteristics between filter units and sections appears excellent for the implemented fourth-order Butterworth bandpass responses. We obtained a 0.1 dB linearity error of signal detection and measured a signal-to-noise ratio of approximately 70 dB. The preprocessing system discussed includes the preemphasis filter design, gain normalizer design, and data acquisition system design, as well as test results.
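
    A software analogue of the analyzer described, sketched under assumptions: 16 adjacent Butterworth bandpass channels between 300 Hz and 3 kHz (logarithmic spacing and the sampling rate are assumptions), each followed by an energy (integration) stage.

        import numpy as np
        from scipy.signal import butter, sosfilt

        fs = 8000                                   # assumed sampling rate, Hz
        edges = np.geomspace(300, 3000, 17)         # 16 adjacent bands
        # Order-2 design -> fourth-order Butterworth bandpass per channel.
        bank = [butter(2, [lo, hi], btype="band", fs=fs, output="sos")
                for lo, hi in zip(edges[:-1], edges[1:])]

        rng = np.random.default_rng(7)
        audio = rng.standard_normal(fs)             # 1 s of stand-in input
        energies = np.array([np.mean(sosfilt(sos, audio) ** 2) for sos in bank])
        print(energies.round(5))                    # 16 per-channel energies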

  13. Switched Flip-Flop based Preprocessing Circuit for ISFETs

    Directory of Open Access Journals (Sweden)

    Martin Kollár

    2005-03-01

    Full Text Available In this paper, a preprocessing circuit for ISFETs (ion-sensitive field-effect transistors) to measure hydrogen-ion concentration in an electrolyte is presented. A modified flip-flop is the main part of the circuit. The modification consists in replacing the standard transistors by ISFETs and periodically switching the supply voltage on and off. The concentration of hydrogen ions to be measured breaks the flip-flop's value symmetry, which means that when the supply voltage is switched on, the flip-flop goes to one of two stable states, ‘one’ or ‘zero’. The value symmetry can be recovered by changing a balancing voltage, incorporated into the flip-flop, to bring the flip-flop to a 50% position (probability of ‘one’ equal to probability of ‘zero’). Thus, the balancing voltage reflects the measured concentration of hydrogen ions. Its magnitude is set automatically by a feedback circuit whose input is connected to the flip-flop output. The preprocessing circuit as a whole is the well-known δ modulator, in which the switched flip-flop serves as a comparator and a sampling circuit. The advantages of this approach in comparison to standard approaches are discussed. Finally, theoretical results are verified by simulations with TSPICE and good agreement is reported.

  14. Net-Zero Building Technologies Create Substantial Energy Savings

    Science.gov (United States)

    only an estimated 1% of commercial buildings are built to net-zero energy criteria. One reason for this ... Researchers work to package and share step

  15. Thinning: A Preprocessing Technique for an OCR System for the Brahmi Script

    Directory of Open Access Journals (Sweden)

    H. K. Anasuya Devi

    2006-12-01

    Full Text Available In this paper we study the methodology employed for preprocessing archaeological images. We present the various algorithms used in the low-level processing stage of image analysis for an Optical Character Recognition system for the Brahmi script. The image preprocessing techniques covered in this paper include the thinning method. We also analyze the results obtained by the pixel-level processing algorithms.
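
    A hedged illustration of thinning as an OCR pre-processing step, using scikit-image's skeletonize; the paper's own thinning algorithm may differ.

        import numpy as np
        from skimage.morphology import skeletonize

        glyph = np.zeros((32, 32), dtype=bool)
        glyph[8:24, 12:20] = True          # thick stroke of a synthetic glyph
        thin = skeletonize(glyph)          # reduce to a one-pixel-wide skeleton
        print(int(glyph.sum()), int(thin.sum()))   # stroke area shrinks to a line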

  16. Comparison of pre-processing methods for multiplex bead-based immunoassays.

    Science.gov (United States)

    Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter

    2016-08-11

    High-throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed. However, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology, and evaluated the performance of each pre-processing approach with 6 different performance criteria. Three of the performance criteria were plots; all plots were evaluated by 15 independent and blinded readers. Four combinations of transformation and normalization methods performed well as pre-processing procedures for this bead-based protein immunoassay. The following combinations were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization, and Box-Cox followed by rsn.
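
    A hedged sketch of one of the suitable combinations named above, asinh transformation followed by normalization; a simple quantile normalization stands in for the study's loess/robust-spline implementations, and the data are synthetic.

        import numpy as np

        def quantile_normalize(X):
            """Give every sample (row) the same empirical distribution."""
            ranks = np.argsort(np.argsort(X, axis=1), axis=1)
            mean_sorted = np.sort(X, axis=1).mean(axis=0)
            return mean_sorted[ranks]

        rng = np.random.default_rng(4)
        mfi = rng.lognormal(5, 1, size=(42, 384))    # raw bead intensities
        pre = quantile_normalize(np.arcsinh(mfi))    # transform, then normalize
        print(pre.shape, np.allclose(pre[0].mean(), pre[1].mean()))  # (42, 384) True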

  17. Pre-processing of Fourier transform infrared spectra by means of multivariate analysis implemented in the R environment.

    Science.gov (United States)

    Banas, Krzysztof; Banas, Agnieszka; Gajda, Mariusz; Pawlicki, Bohdan; Kwiatek, Wojciech M; Breese, Mark B H

    2015-04-21

    Pre-processing of Fourier transform infrared (FTIR) spectra is typically the first and crucial step in data analysis. Very often hyperspectral datasets include regions characterized by spectra of very low intensity, for example two-dimensional (2D) maps containing areas with only support materials (such as Mylar foil). In that case, segmentation of the complete dataset is required before subsequent evaluation. The method proposed in this contribution is based on a multivariate approach (hierarchical cluster analysis) and shows its superiority when compared to the standard method of cutting off by mean spectral intensity alone. Both techniques were implemented and their performance tested in the R statistical environment, an open-source platform that is a favourable solution where repeatability and transparency are the key aspects.
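
    The paper works in the R environment; the sketch below transliterates the idea to Python for consistency with the other examples here, with synthetic spectra: pixels are segmented by hierarchical clustering rather than by a mean-intensity cutoff alone.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        rng = np.random.default_rng(5)
        tissue = rng.normal(1.0, 0.1, size=(300, 50))    # sample spectra
        mylar = rng.normal(0.05, 0.02, size=(200, 50))   # low-intensity support
        spectra = np.vstack([tissue, mylar])

        labels = fcluster(linkage(spectra, method="ward"), t=2,
                          criterion="maxclust")          # two segments
        keep_label = max((1, 2), key=lambda c: spectra[labels == c].mean())
        print(int((labels == keep_label).sum()))         # ~300 sample pixels kept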

  18. Sign Up for Savings.

    Science.gov (United States)

    Kennedy, Mike

    2002-01-01

    Discusses performance service contracts between educational facilities and energy services companies, in which the company provides the money for energy-efficiency improvements and the school pays the company an annual fee. The company guarantees the savings will meet or exceed the fee. (EV)

  19. Saving Malta's music memory

    OpenAIRE

    Sant, Toni

    2013-01-01

    Maltese music is being lost. Along with it, Malta loses its culture, way of life, and memories. Dr Toni Sant is trying to change this trend through the Malta Music Memory Project (M3P) http://www.um.edu.mt/think/saving-maltas-music-memory-2/

  20. Save Our Water Resources.

    Science.gov (United States)

    Bromley, Albert W.

    The purpose of this booklet, developed as part of Project SOAR (Save Our American Resources), is to give Scout leaders some facts about the world's resources, the sources of water pollution, and how people can help in obtaining solutions. Among the topics discussed are the world's water resources, the water cycle, water quality, sources of water…

  1. Gun control saves lives

    African Journals Online (AJOL)

    gun control legislation. One study estimated that more than 4 500 lives were saved across five SA cities from 2001 to 2005.[5] Pro-gun interest groups seeking to promote gun ownership and diffusion have attacked these findings, suggesting that stricter gun control was only enacted in 2004 following the publication of ...

  2. Detailed Investigation and Comparison of the XCMS and MZmine 2 Chromatogram Construction and Chromatographic Peak Detection Methods for Preprocessing Mass Spectrometry Metabolomics Data.

    Science.gov (United States)

    Myers, Owen D; Sumner, Susan J; Li, Shuzhao; Barnes, Stephen; Du, Xiuxia

    2017-09-05

    XCMS and MZmine 2 are two widely used software packages for preprocessing untargeted LC/MS metabolomics data. Both construct extracted ion chromatograms (EICs) and detect peaks from the EICs, the first two steps in the data preprocessing workflow. While both packages have performed admirably in peak picking, they also detect a problematic number of false positive EIC peaks and can also fail to detect real EIC peaks. The former and latter translate downstream into spurious and missing compounds and present significant limitations with most existing software packages that preprocess untargeted mass spectrometry metabolomics data. We seek to understand the specific reasons why XCMS and MZmine 2 find the false positive EIC peaks that they do and in what ways they fail to detect real compounds. We investigate differences of EIC construction methods in XCMS and MZmine 2 and find several problems in the XCMS centWave peak detection algorithm which we show are partly responsible for the false positive and false negative compound identifications. In addition, we find a problem with MZmine 2's use of centWave. We hope that a detailed understanding of the XCMS and MZmine 2 algorithms will allow users to work with them more effectively and will also help with future algorithmic development.
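
    A deliberately simplified sketch of EIC construction for a single target m/z (not the centWave or MZmine 2 implementations): take the most intense ion within a ppm tolerance from each scan.

        import numpy as np

        def build_eic(scans, target_mz, ppm=25.0):
            tol = target_mz * ppm * 1e-6
            eic = []
            for rt, mz, intensity in scans:   # (rt, m/z array, intensity array)
                hit = np.abs(mz - target_mz) <= tol
                eic.append((rt, intensity[hit].max() if hit.any() else 0.0))
            return np.array(eic)

        rng = np.random.default_rng(8)
        scans = [(rt, rng.uniform(100, 1000, 500), rng.uniform(1e3, 1e6, 500))
                 for rt in np.arange(0.0, 10.0, 0.1)]
        eic = build_eic(scans, target_mz=445.12)
        print(eic.shape)                      # (100, 2): (rt, intensity) rows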

  3. Save energy - for industry

    International Nuclear Information System (INIS)

    Anon.

    1983-01-01

    The article is an interview with Glenn Bjorklund, Vice President of SCalEd (Southern California Edison). The variations in Californian power demand and public electricity consumption habits are explained, together with types of power source used in electricity production. Questions are posed concerning SCalEd's energy saving strategy. The political implications of electricity charge changes are discussed. The planned energy resources for 1982-1992 are given with nuclear power being the largest contributor. (H.J.P./G.T.H.)

  4. REMINDER Saved Leave Scheme (SLS) : Transfer of leave to saved leave accounts

    CERN Multimedia

    HR Division

    2002-01-01

    Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days'*) annual and compensatory leave (excluding saved leave accumulated in accordance with the provisions of Administrative Circular No. 22B) can be transferred to the saved leave account at the end of the leave year (30 September). We remind you that, since last year, unused leave of all those taking part in the saved leave scheme at the closure of the leave-year accounts is transferred automatically to the saved leave account on that date. Therefore, staff members have no administrative steps to take. In addition, the transfer, which eliminates the risk of omitting to request leave transfers and rules out calculation errors in transfer requests, will be clearly shown in the list of leave transactions that can be consulted in EDH from October 2002 onwards. Furthermore, this automatic leave transfer optimizes staff members' chances of benefiting from a saved leave bonus provided that they are still participants in the schem...

  5. Effect of packaging on physicochemical characteristics of irradiated pre-processed chicken

    International Nuclear Information System (INIS)

    Jiang Xiujie; Zhang Dongjie; Zhang Dequan; Li Shurong; Gao Meixu; Wang Zhidong

    2011-01-01

    To explore the effect of modified atmosphere packaging and antioxidants on the physicochemical characteristics of irradiated pre-processed chicken, antioxidants were first added to the pre-processed chicken, which was then packaged in ordinary, vacuum and gas packaging, respectively, and finally irradiated at a dose of 5 kGy. All samples were stored at 4 ℃. The pH, TBA, TVB-N and color deviation were evaluated after 0, 3, 7, 10, 14, 18 and 21 d of storage. The results showed that the pH value of pre-processed chicken with antioxidants and vacuum packaging increased with storage time, but not significantly among the different treatments. The TBA value also increased, but not significantly (P > 0.05), which indicated that vacuum packaging inhibited lipid oxidation. The TVB-N value increased with storage time; for vacuum-packaged samples it reached 14.29 mg/100 g after 21 d of storage, which did not exceed the reference index for fresh meat. The a* value of the vacuum-packaged and oxygen-free-packaged samples increased significantly during storage (P > 0.05), and the chicken color kept bright red after 21 d of storage with vacuum packaging. It is concluded that vacuum packaging of irradiated pre-processed chicken is effective in preserving its physical and chemical properties during storage. (authors)

  6. Examination of Speed Contribution of Parallelization for Several Fingerprint Pre-Processing Algorithms

    Directory of Open Access Journals (Sweden)

    GORGUNOGLU, S.

    2014-05-01

    Full Text Available In the analysis of minutiae-based fingerprint systems, fingerprints need to be pre-processed. The pre-processing is carried out to enhance the quality of the fingerprint and to obtain more accurate minutiae points. Reducing the pre-processing time is important for identification and verification in real-time systems, and especially for databases holding large amounts of fingerprint information. Parallel processing and parallel CPU computing can be considered as the distribution of processes over multi-core processors, done by using parallel programming techniques. Reducing the execution time is the main objective in parallel processing. In this study, the pre-processing of a minutiae-based fingerprint system is implemented by parallel processing on multi-core computers using OpenMP, and on a graphics processor using CUDA, to improve the execution time. The execution times and speedup ratios are compared with those of a single-core processor. The results show that execution time is substantially improved by parallel processing. The improvement ratios obtained for different pre-processing algorithms allowed us to make suggestions on the more suitable approaches for parallelization.
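
    The paper parallelizes with OpenMP and CUDA; as a language-neutral illustration of the same decomposition, the sketch below distributes a per-fingerprint enhancement step (a stand-in, not a real minutiae pipeline) across CPU cores.

        import numpy as np
        from multiprocessing import Pool

        def enhance(fp):
            # Stand-in enhancement: zero-mean, unit-variance normalization.
            return (fp - fp.mean()) / (fp.std() + 1e-9)

        if __name__ == "__main__":
            rng = np.random.default_rng(9)
            prints = [rng.random((256, 256)) for _ in range(64)]
            with Pool() as pool:                 # one image per worker task
                enhanced = pool.map(enhance, prints)
            print(len(enhanced), enhanced[0].shape)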

  7. Entrepreneurial Saving Practices and Reinvestment

    NARCIS (Netherlands)

    Beck, Thorsten; Pamuk, Haki; Uras, Burak R.

    2017-01-01

    We use a novel enterprise survey to gauge the relationship between saving instruments and entrepreneurial reinvestment. We show that while most informal saving practices are not associated with a lower likelihood of reinvestment when compared with formal saving practices, there is a significantly

  8. Social Capital and Savings Behavior

    DEFF Research Database (Denmark)

    Newman, Carol; Tarp, Finn; Khai, Luu Duc

    In this paper, we analyze household savings in rural Vietnam paying particular attention to the factors that determine the proportion of savings held as formal deposits. Our aim is to explore the extent to which social capital can play a role in promoting formal savings behavior. Social capital...

  9. Saving water through global trade

    NARCIS (Netherlands)

    Chapagain, Ashok; Hoekstra, Arjen Ysbert; Savenije, H.H.G.

    2005-01-01

    Many nations save domestic water resources by importing water-intensive products and exporting commodities that are less water intensive. National water saving through the import of a product can imply saving water at a global level if the flow is from sites with high to sites with low water

  10. Achieving Accurate Automatic Sleep Staging on Manually Pre-processed EEG Data Through Synchronization Feature Extraction and Graph Metrics.

    Science.gov (United States)

    Chriskos, Panteleimon; Frantzidis, Christos A; Gkivogkli, Polyxeni T; Bamidis, Panagiotis D; Kourtidou-Papadeli, Chrysoula

    2018-01-01

    Sleep staging, the process of assigning labels to epochs of sleep according to the stage to which they belong, is an arduous, time-consuming and error-prone process, as the initial recordings are quite often polluted by noise from different sources. To properly analyze such data and extract clinical knowledge, noise components must be removed or alleviated. In this paper a pre-processing and subsequent sleep-staging pipeline for the sleep analysis of electroencephalographic signals is described. Two novel methods of functional connectivity estimation (Synchronization Likelihood, SL, and Relative Wavelet Entropy, RWE) are comparatively investigated for automatic sleep staging through manually pre-processed electroencephalographic recordings. A multi-step process that renders the signals suitable for further analysis is described first. Two methods are then proposed that rely on extracting synchronization features from the recordings to achieve computerized sleep staging; they are based on bivariate features that provide a functional overview of the brain network, contrary to most proposed methods, which rely on univariate time and frequency features. Annotation of sleep epochs is achieved by training classifiers on the extracted features, which are in turn able to accurately classify new epochs. Analysis of data from sleep experiments in a randomized, controlled bed-rest study, organized by the European Space Agency and conducted in the "ENVIHAB" facility of the Institute of Aerospace Medicine at the German Aerospace Center (DLR) in Cologne, Germany, attains high accuracy rates, over 90%, based on ground truth resulting from manual sleep staging by two experienced sleep experts. It can therefore be concluded that the above feature extraction methods are suitable for semi-automatic sleep staging.
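
    A hedged sketch of one of the two synchronization features named above, Relative Wavelet Entropy, using the PyWavelets package: the wavelet energy distributions of two channels are compared with a Kullback-Leibler-style divergence. The wavelet, decomposition level and signals are illustrative choices.

        import numpy as np
        import pywt

        def wavelet_energy_distribution(x, wavelet="db4", level=5):
            coeffs = pywt.wavedec(x, wavelet, level=level)
            e = np.array([np.sum(c ** 2) for c in coeffs])
            return e / e.sum()

        def relative_wavelet_entropy(x, y):
            p, q = (wavelet_energy_distribution(s) for s in (x, y))
            return float(np.sum(p * np.log(p / q)))

        rng = np.random.default_rng(6)
        t = np.linspace(0, 4, 1024)
        ch_a = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
        ch_b = np.sin(2 * np.pi * 2 * t) + 0.3 * rng.standard_normal(t.size)
        print(relative_wavelet_entropy(ch_a, ch_a))  # 0: identical distributions
        print(relative_wavelet_entropy(ch_a, ch_b))  # > 0: energy in other bands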

  11. Comparative performance evaluation of transform coding in image pre-processing

    Science.gov (United States)

    Menon, Vignesh V.; NB, Harikrishnan; Narayanan, Gayathri; CK, Niveditha

    2017-07-01

    We are in the midst of a communication transformation that drives both the development and the dissemination of pioneering communication systems with ever-increasing fidelity and resolution. Research in image processing techniques has been driven by a growing demand for faster and easier encoding, storage and transmission of visual information. In this paper, the researchers intend to throw light on techniques that can be used at the transmitter end to ease the transmission and reconstruction of images. They investigate the performance of different image transform coding schemes used in pre-processing, comparing their effectiveness, the necessary and sufficient conditions, their properties and their implementation complexity. Motivated by prior advancements in image processing techniques, the researchers compare the performance of several contemporary image pre-processing frameworks: Compressed Sensing, Singular Value Decomposition, and the Integer Wavelet Transform. The paper exposes the potential of the Integer Wavelet Transform to be an efficient pre-processing scheme.

  12. Performance of Pre-processing Schemes with Imperfect Channel State Information

    DEFF Research Database (Denmark)

    Christensen, Søren Skovgaard; Kyritsi, Persa; De Carvalho, Elisabeth

    2006-01-01

    Pre-processing techniques have several benefits when the CSI is perfect. In this work we investigate three linear pre-processing filters, assuming imperfect CSI caused by noise degradation and channel temporal variation. Results indicate that the LMMSE filter achieves the lowest BER and the highest SINR when the CSI is perfect, whereas the simple matched filter may be a good choice when the CSI is imperfect. Additionally, the results give insight into the inherent trade-off between robustness against CSI imperfections and spatial focusing ability.

  13. Energy. Saving 'Private' Areva

    International Nuclear Information System (INIS)

    Dupin, Ludovic

    2015-01-01

    While Areva keeps on losing money (billions of euros for 2014), the rescue of the company is at stake. Staff reductions are already planned at La Hague, and others might follow the failure of a previous strategic plan. Various activities could be sold (dismantling, mining). The article outlines the difficult relationship between Areva and EDF and the problems also faced by EDF. Some actors think that Areva should remain independent from EDF in order to be free to compete in international bidding. A rapprochement between the two companies is said to be necessary by the Ministry but seems very difficult to achieve.

  14. Water Saving for Development

    Science.gov (United States)

    Zacharias, Ierotheos

    2013-04-01

    The project "Water Saving for Development (WaS4D)" is financed by European Territorial Cooperational Programme, Greece-Italy 2007-2013, and aims at developing issues on water saving related to improvement of individual behaviors and implementing innovative actions and facilities in order to harmonize policies and start concrete actions for a sustainable water management, making also people and stakeholders awake to water as a vital resource, strategic for quality of life and territory competitiveness. Drinkable water saving culture & behavior, limited water resources, water supply optimization, water resources and demand management, water e-service & educational e-tools are the key words of WaS4D. In this frame the project objectives are: • Definition of water need for domestic and other than domestic purposes: regional and territorial hydro-balance; • promotion of locally available resources not currently being used - water recycling or reuse and rainwater harvesting; • scientific data implementation into Informative Territorial System and publication of geo-referred maps into the institutional web sites, to share information for water protection; • participated review of the regulatory framework for the promotion of water-efficient devices and practices by means of the definition of Action Plans, with defined targets up to brief (2015) and medium (2020) term; • building up water e-services, front-office for all the water issues in building agricultural, industrial and touristic sectors, to share information, procedures and instruments for the water management; • creation and publication of a user friendly software, a game, to promote sustainability for houses also addressed to young people; • creation of water info point into physical spaces called "Water House" to promote education, training, events and new advisory services to assist professionals involved in water uses and consumers; • implementation of participatory approach & networking for a

  15. Savings for the Poor

    OpenAIRE

    Ignacio Mas

    2010-01-01

    This paper reviews the relevance of formal financial services – in particular, savings – to poor people, the economic factors that have hindered the mass-scale delivery of such services in developing countries, and the technology-based opportunities that exist today to make massive gains in financial inclusion. It also highlights the benefits to government from universal financial access, as well as the key policy enablers that would need to be put in place to allow the necessary innovati...

  16. Locomotive energy savings possibilities

    Directory of Open Access Journals (Sweden)

    Leonas Povilas LINGAITIS

    2009-01-01

    Full Text Available Economic indicators of electrodynamic braking have not been properly estimated. Vehicles with alternative power trains are a transitional stage in the development of pollution-free vehicles. Accordingly, the article investigates conventional hybrid drives and their control systems. Equations that allow the effectiveness of regenerative braking to be evaluated for different variants of hybrid drive are given, and different types of locomotive energy-saving power systems that use regenerative braking energy in hybrid traction vehicles are presented, together with circuit diagrams and electrical parameter curves.

  17. Comparison of classification algorithms for various methods of preprocessing radar images of the MSTAR base

    Science.gov (United States)

    Borodinov, A. A.; Myasnikov, V. V.

    2018-04-01

    The present work is devoted to comparing the accuracy of known classification algorithms in the task of recognizing local objects in radar images under various image preprocessing methods. Preprocessing involves speckle-noise filtering and normalization of the object orientation in the image, either by the method of image moments or by a method based on the Hough transform. The following classification algorithms are compared: decision tree, support vector machine, AdaBoost, and random forest. Principal component analysis is used to reduce the dimensionality. The research is carried out on objects from the MSTAR radar image database. The paper presents the results of the conducted studies.
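
    A hedged sketch of the moment-based orientation normalization named above (speckle filtering and the Hough-based variant are omitted): the principal-axis angle is estimated from central second moments, after which the image chip would be rotated by its negative.

        import numpy as np

        def orientation_from_moments(img):
            """Principal-axis angle in radians (y-down image convention)."""
            y, x = np.mgrid[:img.shape[0], :img.shape[1]]
            m = img.sum()
            cx, cy = (x * img).sum() / m, (y * img).sum() / m
            mu20 = ((x - cx) ** 2 * img).sum()
            mu02 = ((y - cy) ** 2 * img).sum()
            mu11 = ((x - cx) * (y - cy) * img).sum()
            return 0.5 * np.arctan2(2 * mu11, mu20 - mu02)

        # Synthetic elongated target at 30 degrees; normalization would rotate
        # the chip by the negative of the measured angle before classification.
        yy, xx = np.mgrid[:101, :101] - 50.0
        along = xx * np.cos(np.radians(30)) + yy * np.sin(np.radians(30))
        across = -xx * np.sin(np.radians(30)) + yy * np.cos(np.radians(30))
        img = ((np.abs(along) < 40) & (np.abs(across) < 4)).astype(float)
        print(round(np.degrees(orientation_from_moments(img)), 1))   # ~30.0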

  18. Learning to save lives!

    CERN Document Server

    2003-01-01

    They're all around you and watch over you, but you won't be aware of them unless you look closely at their office doors. There are 308 of them and they have all been given 12 hours of training with the CERN Fire Brigade. Who are they? Quite simply, those who could one day save your life at work, the CERN first-aiders. First-aiders are recruited on a volunteer basis. "Training is in groups of 10 to 12 people and a lot of emphasis is placed on the practical to ensure that they remember the life-saving techniques we show them", explains Patrick Berlinghi, a CERN first-aid instructor from TIS Division. He is looking forward to the arrival of four new instructors, which will bring the total number to twelve (eleven firemen and one member of the Medical Service). "The new instructors were trained at CERN from 16 to 24 May by Marie-Christine Boucher Da Ros (a member of the Commission Pédagogie de l'Observatoire National Français du Secourisme, the education commission of France's national first-aid body). This in...

  19. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps.

    Science.gov (United States)

    Varikuti, Deepthi P; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T; Eickhoff, Simon B

    2017-04-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that gray matter masking improved the reliability of connectivity estimates, whereas denoising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources.

  20. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    NARCIS (Netherlands)

    Varikuti, D.P.; Hoffstaedter, F.; Genon, S.; Schwender, H.; Reid, A.T.; Eickhoff, S.B.

    2017-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional

  1. Resting-state test-retest reliability of a priori defined canonical networks over different preprocessing steps

    Science.gov (United States)

    Varikuti, Deepthi P.; Hoffstaedter, Felix; Genon, Sarah; Schwender, Holger; Reid, Andrew T.; Eickhoff, Simon B.

    2016-01-01

    Resting-state functional connectivity analysis has become a widely used method for the investigation of human brain connectivity and pathology. The measurement of neuronal activity by functional MRI, however, is impeded by various nuisance signals that reduce the stability of functional connectivity. Several methods exist to address this predicament, but little consensus has yet been reached on the most appropriate approach. Given the crucial importance of reliability for the development of clinical applications, we here investigated the effect of various confound removal approaches on the test-retest reliability of functional-connectivity estimates in two previously defined functional brain networks. Our results showed that grey matter masking improved the reliability of connectivity estimates, whereas de-noising based on principal components analysis reduced it. We additionally observed that refraining from using any correction for global signals provided the best test-retest reliability, but failed to reproduce anti-correlations between what have been previously described as antagonistic networks. This suggests that improved reliability can come at the expense of potentially poorer biological validity. Consistent with this, we observed that reliability was proportional to the retained variance, which presumably included structured noise, such as reliable nuisance signals (for instance, noise induced by cardiac processes). We conclude that compromises are necessary between maximizing test-retest reliability and removing variance that may be attributable to non-neuronal sources. PMID:27550015

  2. Parallelizing flow-accumulation calculations on graphics processing units—From iterative DEM preprocessing algorithm to recursive multiple-flow-direction algorithm

    Science.gov (United States)

    Qin, Cheng-Zhi; Zhan, Lijun

    2012-06-01

    As one of the important tasks in digital terrain analysis, the calculation of flow accumulations from gridded digital elevation models (DEMs) usually involves two steps in a real application: (1) using an iterative DEM preprocessing algorithm to remove the depressions and flat areas commonly contained in real DEMs, and (2) using a recursive flow-direction algorithm to calculate the flow accumulation for every cell in the DEM. Because both algorithms are computationally intensive, quick calculation of the flow accumulations from a DEM (especially for a large area) presents a practical challenge to personal computer (PC) users. In recent years, rapid increases in hardware capacity of the graphics processing units (GPUs) provided in modern PCs have made it possible to meet this challenge in a PC environment. Parallel computing on GPUs using a compute-unified-device-architecture (CUDA) programming model has been explored to speed up the execution of the single-flow-direction algorithm (SFD). However, the parallel implementation on a GPU of the multiple-flow-direction (MFD) algorithm, which generally performs better than the SFD algorithm, has not been reported. Moreover, GPU-based parallelization of the DEM preprocessing step in the flow-accumulation calculations has not been addressed. This paper proposes a parallel approach to calculate flow accumulations (including both iterative DEM preprocessing and a recursive MFD algorithm) on a CUDA-compatible GPU. For the parallelization of an MFD algorithm (MFD-md), two different parallelization strategies using a GPU are explored. The first parallelization strategy, which has been used in the existing parallel SFD algorithm on GPU, has the problem of computing redundancy. Therefore, we designed a parallelization strategy based on graph theory. The application results show that the proposed parallel approach to calculate flow accumulations on a GPU performs much faster than either sequential algorithms or other parallel GPU
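
    The recursive structure that the paper parallelizes can be seen in a sequential reference implementation. The sketch below is a single-flow-direction (D8) variant, not the paper's GPU multiple-flow-direction code, and the toy DEM is an assumption; it computes each cell's accumulation by recursing over the cells that drain into it, assuming a depression-free (already preprocessed) DEM.

        import numpy as np

        def d8_flow_accumulation(dem):
            """Sequential D8 flow accumulation on a depression-free DEM:
            each cell drains to its steepest-descent neighbor, and the
            accumulation of a cell is itself plus all upstream cells."""
            rows, cols = dem.shape
            nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                    (0, 1), (1, -1), (1, 0), (1, 1)]
            acc = np.full(dem.shape, -1, dtype=np.int64)

            def downstream(r, c):
                drops = [(dem[r, c] - dem[r + dr, c + dc], (r + dr, c + dc))
                         for dr, dc in nbrs
                         if 0 <= r + dr < rows and 0 <= c + dc < cols]
                drop, cell = max(drops)
                return cell if drop > 0 else None  # None: pit or outlet

            upstream = {}  # cell -> list of cells draining into it
            for r in range(rows):
                for c in range(cols):
                    d = downstream(r, c)
                    if d is not None:
                        upstream.setdefault(d, []).append((r, c))

            def accumulate(cell):  # the recursion the paper parallelizes
                if acc[cell] < 0:
                    acc[cell] = 1 + sum(accumulate(u)
                                        for u in upstream.get(cell, []))
                return acc[cell]

            for r in range(rows):
                for c in range(cols):
                    accumulate((r, c))
            return acc

        dem = np.array([[3., 2., 3.],
                        [2., 1., 2.],
                        [3., 2., 0.]])
        print(d8_flow_accumulation(dem))  # the outlet cell accumulates all 9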

  3. Does Daylight Saving Save Energy? A Meta-Analysis

    OpenAIRE

    Havránek, Tomáš; Herman, Dominik; Irsova, Zuzana

    2016-01-01

    The original rationale for adopting daylight saving time (DST) was energy savings. Modern research studies, however, question the magnitude and even direction of the effect of DST on energy consumption. Representing the first meta-analysis in this literature, we collect 162 estimates from 44 studies and find that the mean reported estimate indicates modest energy savings: 0.34% during the days when DST applies. The literature is not affected by publication bias, but the results vary systemati...

  4. Can lean save lives?

    Science.gov (United States)

    Fillingham, David

    2007-01-01

    The purpose of this paper is to show how over the last 18 months Bolton Hospitals NHS Trust has been exploring whether or not lean methodologies, often known as the Toyota Production System, can indeed be applied to healthcare. This paper is a viewpoint. The authors' early experience is that lean really can save lives. The Toyota Production System is an amazingly successful way of manufacturing cars. It cannot simply be translated unthinkingly into a hospital, but lessons can be learned from it, and the method can be adapted and developed so that it becomes owned by healthcare staff and focused towards the goal of improved patient care. Working in healthcare is a stressful and difficult thing. Everyone needs a touch of inspiration and encouragement. Applying lean to healthcare in Bolton seems to be achieving just that for those who work there.

  5. A Real-Time Embedded System for Stereo Vision Preprocessing Using an FPGA

    DEFF Research Database (Denmark)

    Kjær-Nielsen, Anders; Jensen, Lars Baunegaard With; Sørensen, Anders Stengaard

    2008-01-01

    In this paper a low level vision processing node for use in existing IEEE 1394 camera setups is presented. The processing node is a small embedded system, that utilizes an FPGA to perform stereo vision preprocessing at rates limited by the bandwidth of IEEE 1394a (400Mbit). The system is used...

  6. Poisson pre-processing of nonstationary photonic signals: Signals with equality between mean and variance.

    Science.gov (United States)

    Poplová, Michaela; Sovka, Pavel; Cifra, Michal

    2017-01-01

    Photonic signals are broadly exploited in communication and sensing and they typically exhibit Poisson-like statistics. In a common scenario where the intensity of the photonic signals is low and one needs to remove a nonstationary trend of the signals for any further analysis, one faces an obstacle: due to the dependence between the mean and variance typical for a Poisson-like process, information about the trend remains in the variance even after the trend has been subtracted, possibly yielding artifactual results in further analyses. Commonly available detrending or normalizing methods cannot cope with this issue. To alleviate this issue we developed a suitable pre-processing method for the signals that originate from a Poisson-like process. In this paper, a Poisson pre-processing method for nonstationary time series with Poisson distribution is developed and tested on computer-generated model data and experimental data of chemiluminescence from human neutrophils and mung seeds. The presented method transforms a nonstationary Poisson signal into a stationary signal with a Poisson distribution while preserving the type of photocount distribution and phase-space structure of the signal. The importance of the suggested pre-processing method is shown in Fano factor and Hurst exponent analysis of both computer-generated model signals and experimental photonic signals. It is demonstrated that our pre-processing method is superior to standard detrending-based methods whenever further signal analysis is sensitive to variance of the signal.
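
    The paper develops its own Poisson pre-processing method. As a point of reference for the mean-variance coupling it targets, a standard alternative is variance stabilization with the Anscombe transform; the sketch below is not the authors' method, and the synthetic intensity trend and smoother are assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        # Nonstationary Poisson photocount series with a slow intensity trend
        t = np.arange(2000)
        lam = 5 + 3 * np.sin(2 * np.pi * t / 500)
        counts = rng.poisson(lam)

        # Anscombe transform x -> 2*sqrt(x + 3/8) makes the variance ~1
        # regardless of the (slowly varying) mean, so the trend can then be
        # removed without trend information remaining in the variance.
        stab = 2.0 * np.sqrt(counts + 3.0 / 8.0)
        trend = np.convolve(stab, np.ones(101) / 101, mode="same")  # crude smoother
        detrended = stab - trend

        # Variances of the two halves should now be approximately equal.
        print(np.var(detrended[:500]), np.var(detrended[-500:]))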

  7. Scene matching based on non-linear pre-processing on reference image and sensed image

    Institute of Scientific and Technical Information of China (English)

    Zhong Sheng; Zhang Tianxu; Sang Nong

    2005-01-01

    To solve the heterogeneous image scene matching problem, a non-linear pre-processing method applied to the original images before intensity-based correlation is proposed. The results show that the probability of a correct match is raised greatly; the effect is especially remarkable for low-S/N image pairs.

  8. Data pre-processing: a case study in predicting student's retention in ...

    African Journals Online (AJOL)

    dataset with features that are ready for the data mining task. The study also proposed a process model and suggestions that can be applied to build more comprehensible tools for end users in the educational domain. Subsequently, the data pre-processing becomes more efficient for predicting student's retention in ...

  9. Summary of ENDF/B Pre-Processing Codes June 1983

    International Nuclear Information System (INIS)

    Cullen, D.E.

    1983-06-01

    This is the summary documentation for the 1983 version of the ENDF/B Pre-Processing Codes LINEAR, RECENT, SIGMA1, GROUPIE, EVALPLOT, MERGER, DICTION, COMPLOT, CONVERT. This summary documentation is merely a copy of the comment cards that appear at the beginning of each programme; these comment cards always reflect the latest status of input options, etc

  10. Evaluation of Microarray Preprocessing Algorithms Based on Concordance with RT-PCR in Clinical Samples

    DEFF Research Database (Denmark)

    Hansen, Kasper Lage; Szallasi, Zoltan Imre; Eklund, Aron Charles

    2009-01-01

    evaluated consistency using the Pearson correlation between measurements obtained on the two platforms. Also, we introduce the log-ratio discrepancy as a more relevant measure of discordance between gene expression platforms. Of nine preprocessing algorithms tested, PLIER+16 produced expression values...

  11. Pre-processing data using wavelet transform and PCA based on ...

    Indian Academy of Sciences (India)

    Abazar Solgi

    2017-07-14

    Pre-processing data using wavelet transform and PCA based on support vector regression and gene expression programming for river flow simulation. Abazar Solgi, Amir Pourhaghi, Ramin Bahmani and Heidar Zarei, Department of Water Resources Engineering, Shahid Chamran University of ...

  12. Data preprocessing methods of FT-NIR spectral data for the classification cooking oil

    Science.gov (United States)

    Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli

    2014-12-01

    This recent work describes the data pre-processing of FT-NIR spectroscopy datasets of cooking oil and its quality parameters with chemometric methods. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometrics modelling. Hence, this work is dedicated to investigating the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling, and single scaling with Standard Normal Variate (SNV). The combinations of these scaling methods have an impact on exploratory analysis and classification via Principal Component Analysis (PCA) plots. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectra datasets in absorbance mode over the range 4000 cm-1 to 14000 cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method, with the number in each class kept equal to 2/3 of the class with the minimum number of samples. The t-statistic was then employed as a variable selection method to determine which variables are significant for the classification models. The evaluation of the data pre-processing considered the modified silhouette width (mSW), PCA, and the percentage correctly classified (%CC). The results show that different data pre-processing strategies lead to substantial differences in model performance; the effects of the pre-processing methods, i.e., row scaling, column standardisation, and single scaling with Standard Normal Variate, are indicated by mSW and %CC. With a two-PC model, all five classifiers gave high %CC except quadratic distance analysis.
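
    A minimal sketch of two of the pre-processing steps named above, Savitzky-Golay differentiation followed by row-wise SNV scaling; the synthetic spectra and parameter choices are assumptions, not the study's settings.

        import numpy as np
        from scipy.signal import savgol_filter

        def snv(spectra):
            """Standard Normal Variate: center and scale each spectrum (row-wise)."""
            mu = spectra.mean(axis=1, keepdims=True)
            sd = spectra.std(axis=1, keepdims=True)
            return (spectra - mu) / sd

        # Synthetic stand-in for FT-NIR absorbance spectra (samples x wavenumbers)
        rng = np.random.default_rng(1)
        spectra = rng.normal(size=(10, 500)).cumsum(axis=1)

        # Savitzky-Golay first derivative, then SNV row scaling
        deriv = savgol_filter(spectra, window_length=11, polyorder=2,
                              deriv=1, axis=1)
        pre = snv(deriv)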

  13. Value of Distributed Preprocessing of Biomass Feedstocks to a Bioenergy Industry

    Energy Technology Data Exchange (ETDEWEB)

    Christopher T Wright

    2006-07-01

    Biomass preprocessing is one of the primary operations in the feedstock assembly system and the front-end of a biorefinery. Its purpose is to chop, grind, or otherwise format the biomass into a suitable feedstock for conversion to ethanol and other bioproducts. Many variables such as equipment cost and efficiency, and feedstock moisture content, particle size, bulk density, compressibility, and flowability affect the location and implementation of this unit operation. Previous conceptual designs show this operation to be located at the front-end of the biorefinery. However, data are presented that show distributed preprocessing at the field-side or in a fixed preprocessing facility can provide significant cost benefits by producing a higher value feedstock with improved handling, transporting, and merchandising potential. In addition, data supporting the preferential deconstruction of feedstock materials due to their bio-composite structure identify the potential for significant improvements in equipment efficiencies and compositional quality upgrades. These data are collected from full-scale low- and high-capacity hammermill grinders with various screen sizes. Multiple feedstock varieties with a range of moisture values were used in the preprocessing tests. The comparative values of the different grinding configurations, feedstock varieties, and moisture levels are assessed through post-grinding analysis of the different particle fractions separated with a medium-scale forage particle separator and a Rototap separator. The results show that distributed preprocessing produces a material that has bulk flowable properties and fractionation benefits that can improve the ease of transporting, handling and conveying the material to the biorefinery and improve the biochemical and thermochemical conversion processes.

  14. Relative effects of statistical preprocessing and postprocessing on a regional hydrological ensemble prediction system

    Science.gov (United States)

    Sharma, Sanjib; Siddique, Ridwan; Reed, Seann; Ahnert, Peter; Mendoza, Pablo; Mejia, Alfonso

    2018-03-01

    The relative roles of statistical weather preprocessing and streamflow postprocessing in hydrological ensemble forecasting at short- to medium-range forecast lead times (day 1-7) are investigated. For this purpose, a regional hydrologic ensemble prediction system (RHEPS) is developed and implemented. The RHEPS is comprised of the following components: (i) hydrometeorological observations (multisensor precipitation estimates, gridded surface temperature, and gauged streamflow); (ii) weather ensemble forecasts (precipitation and near-surface temperature) from the National Centers for Environmental Prediction 11-member Global Ensemble Forecast System Reforecast version 2 (GEFSRv2); (iii) NOAA's Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM); (iv) heteroscedastic censored logistic regression (HCLR) as the statistical preprocessor; (v) two statistical postprocessors, an autoregressive model with a single exogenous variable (ARX(1,1)) and quantile regression (QR); and (vi) a comprehensive verification strategy. To implement the RHEPS, 1 to 7 days weather forecasts from the GEFSRv2 are used to force HL-RDHM and generate raw ensemble streamflow forecasts. Forecasting experiments are conducted in four nested basins in the US Middle Atlantic region, ranging in size from 381 to 12 362 km2. Results show that the HCLR preprocessed ensemble precipitation forecasts have greater skill than the raw forecasts. These improvements are more noticeable in the warm season at the longer lead times (> 3 days). Both postprocessors, ARX(1,1) and QR, show gains in skill relative to the raw ensemble streamflow forecasts, particularly in the cool season, but QR outperforms ARX(1,1). The scenarios that implement preprocessing and postprocessing separately tend to perform similarly, although the postprocessing-alone scenario is often more effective. The scenario involving both preprocessing and postprocessing consistently outperforms the other scenarios. In some cases
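
    A minimal sketch of a quantile-regression streamflow postprocessor in the spirit of the QR component described above; the synthetic forecast-observation pairs and the linear-in-the-raw-forecast model are assumptions, not the RHEPS implementation.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(2)

        # Synthetic stand-ins: raw ensemble-mean forecasts and observations
        raw_fcst = rng.gamma(2.0, 50.0, size=300)
        obs = 0.8 * raw_fcst + rng.normal(0, 10 + 0.1 * raw_fcst)

        # Fit conditional quantiles of the observation given the raw forecast;
        # the fitted quantiles form the postprocessed predictive distribution.
        X = sm.add_constant(raw_fcst)
        quantiles = [0.1, 0.5, 0.9]
        models = {q: sm.QuantReg(obs, X).fit(q=q) for q in quantiles}

        new_fcst = sm.add_constant(np.array([40.0, 120.0]))
        for q, res in models.items():
            print(q, res.predict(new_fcst))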

  15. Reproducible cancer biomarker discovery in SELDI-TOF MS using different pre-processing algorithms.

    Directory of Open Access Journals (Sweden)

    Jinfeng Zou

    Full Text Available BACKGROUND: There has been much interest in differentiating diseased and normal samples using biomarkers derived from mass spectrometry (MS) studies. However, biomarker identification for specific diseases has been hindered by irreproducibility. Specifically, a peak profile extracted from a dataset for biomarker identification depends on a data pre-processing algorithm. Until now, no widely accepted agreement has been reached. RESULTS: In this paper, we investigated the consistency of biomarker identification using differentially expressed (DE) peaks from peak profiles produced by three widely used average spectrum-dependent pre-processing algorithms based on SELDI-TOF MS data for prostate and breast cancers. Our results revealed two important factors that affect the consistency of DE peak identification using different algorithms. One factor is that some DE peaks selected from one peak profile were not detected as peaks in other profiles, and the second factor is that the statistical power of identifying DE peaks in large peak profiles with many peaks may be low due to the large scale of the tests and small number of samples. Furthermore, we demonstrated that the DE peak detection power in large profiles could be improved by the stratified false discovery rate (FDR) control approach and that the reproducibility of DE peak detection could thereby be increased. CONCLUSIONS: Comparing and evaluating pre-processing algorithms in terms of reproducibility can elucidate the relationship among different algorithms and also help in selecting a pre-processing algorithm. The DE peaks selected from small peak profiles with few peaks for a dataset tend to be reproducibly detected in large peak profiles, which suggests that a suitable pre-processing algorithm should be able to produce peaks sufficient for identifying useful and reproducible biomarkers.
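
    A minimal sketch of the stratified FDR idea mentioned in the results: Benjamini-Hochberg applied within strata rather than to the pooled tests. The strata definition and p-values below are illustrative assumptions.

        import numpy as np

        def benjamini_hochberg(pvals, alpha=0.05):
            """Return a boolean mask of rejected hypotheses at FDR level alpha."""
            p = np.asarray(pvals)
            order = np.argsort(p)
            m = len(p)
            thresh = alpha * (np.arange(1, m + 1) / m)
            passed = p[order] <= thresh
            k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
            rejected = np.zeros(m, dtype=bool)
            rejected[order[:k]] = True
            return rejected

        def stratified_fdr(pvals, strata, alpha=0.05):
            """Apply BH separately within each stratum."""
            pvals, strata = np.asarray(pvals), np.asarray(strata)
            rejected = np.zeros(len(pvals), dtype=bool)
            for s in np.unique(strata):
                idx = strata == s
                rejected[idx] = benjamini_hochberg(pvals[idx], alpha)
            return rejected

        rng = np.random.default_rng(3)
        p = np.concatenate([rng.uniform(0, 0.01, 20), rng.uniform(0, 1, 480)])
        strata = (np.arange(500) < 100).astype(int)  # e.g., high vs low intensity
        print(stratified_fdr(p, strata).sum(), "peaks declared DE")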

  16. 12 CFR 583.21 - Savings association.

    Science.gov (United States)

    2010-01-01

    ... AFFECTING SAVINGS AND LOAN HOLDING COMPANIES § 583.21 Savings association. The term savings association means a Federal savings and loan association or a Federal savings bank chartered under section 5 of the Home Owners' Loan Act, a building and loan, savings and loan or homestead association or a cooperative...

  17. Reinforcement Learning and Savings Behavior.

    Science.gov (United States)

    Choi, James J; Laibson, David; Madrian, Brigitte C; Metrick, Andrew

    2009-12-01

    We show that individual investors over-extrapolate from their personal experience when making savings decisions. Investors who experience particularly rewarding outcomes from saving in their 401(k)-a high average and/or low variance return-increase their 401(k) savings rate more than investors who have less rewarding experiences with saving. This finding is not driven by aggregate time-series shocks, income effects, rational learning about investing skill, investor fixed effects, or time-varying investor-level heterogeneity that is correlated with portfolio allocations to stock, bond, and cash asset classes. We discuss implications for the equity premium puzzle and interventions aimed at improving household financial outcomes.

  18. Saving gas project

    Energy Technology Data Exchange (ETDEWEB)

    Vasques, Maria Anunciacao S. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Garantizado, Maria Auxiliadora G. [CONCREMAT Engenharia, Rio de Janeiro, RJ (Brazil)

    2009-12-19

    The work presented was implemented in the municipalities surrounding the construction of the Urucu-Coari-Manaus pipeline project by Engineering/IETEG-IENOR, because of the constant release of workers as the work moved through its finishing stages toward completion. The Saving Gas project aims to guide the workforce, their families, and the communities around the enterprise toward small-business and solidarity cooperatives within the potential of the site. The project is developed through workshops on entrepreneurship; tourism; the use, reuse, and recycling of products; fruit and vegetable growing; agroecology; agribusiness (solidarity cooperativism); and forestry. Its execution took place in two phases. The first, a 'pilot' phase, ran from 12/12/2007 to 27/03/2008 in sections A and B1 in the municipality of Coari, and in section B2 in Caapiranga. The second phase ran from 30/06 to 27/09/08 in section B1, in the municipalities of Codajas and Anori, and in section B2 in Iranduba, Manacapuru, and Anama. The workshops were held in state and municipal schools and administered by the Institute of Social and Environmental Amazon (ISAM), whose team of coordinators, teachers, specialists, and masters worked from nineteen to twenty-two hours (7 p.m. to 10 p.m.) to implement the project. (author)

  19. Save energy, without entropy

    International Nuclear Information System (INIS)

    Steinmeyer, D.

    1992-01-01

    When we talk about saving energy what we usually mean is not wasting work. What we try to do when we design a process, is to use work as effectively as possible. It's hard to do that if we can't see it clearly. This paper illustrates how work can be seen (or calculated) without imposing entropy as a screen in front of it. We've all heard that the second law tells us that the entropy of the universe is increasing, and we are left with the feeling that the universe is ultimately headed for chaos, but receive little other information from this statement. A slightly more useful statement of the second law is the work potential of the universe is decreasing. However, this statement carries a needlessly negative ring. A simplified definition of the second law is: It takes work to change things. With these two corollaries: We can calculate the theoretical minimum work needed for a given change; and We can express the value of all changes in terms of work

  20. Saving-Based Asset Pricing

    DEFF Research Database (Denmark)

    Dreyer, Johannes Kabderian; Schneider, Johannes; T. Smith, William

    2013-01-01

    This paper explores the implications of a novel class of preferences for the behavior of asset prices. Following a suggestion by Marshall (1920), we entertain the possibility that people derive utility not only from consumption, but also from the very act of saving. These ‘‘saving-based’’ prefere...

  1. Geology of Woman Saving Concept

    Directory of Open Access Journals (Sweden)

    Shayesteh Madani

    2014-09-01

    Full Text Available Money is a lubricant and an instrument for economic transactions. The social dimension of money has increased over time, transforming it from a purely economic instrument into a device for various kinds of transactions. The economic value of money in society is expressed in different forms, one of which is saving, in the sense of accumulating money for use under specific future circumstances. Women, who form half of society, take specific approaches to money and savings. The current research aims to investigate the perspectives on, and strategies for changing attitudes toward, money and saving among married women. The participants of this study include 20- to 70-year-old employed and household married women, who were observed phenomenologically and interviewed qualitatively about saving. The findings of this study demonstrate women's perspectives on various types of saving, ways of saving, transfer methods, forms of saving consumption, and their mechanisms. They also reveal that while money is an economic instrument with economic substance, attitudes and acts related to money are influenced by social conditions, which have consequently turned saving into a social phenomenon.

  2. Prescription Program Provides Significant Savings

    Science.gov (United States)

    Rowan, James M.

    2010-01-01

    Most school districts today are looking for ways to save money without decreasing services to their staff. Retired pharmacist Tim Sylvester, a lifelong resident of Alpena Public Schools in Alpena, Michigan, presented the district with a pharmaceuticals plan that would save the district money without raising employee co-pays for prescriptions. The…

  3. Input data preprocessing method for exchange rate forecasting via neural network

    Directory of Open Access Journals (Sweden)

    Antić Dragan S.

    2014-01-01

    Full Text Available The aim of this paper is to present a method for the selection and preprocessing of neural network input parameters. The purpose of the network is to forecast foreign exchange rates using artificial intelligence. Two data sets are formed for two different economic systems. Each system is represented by six categories with 70 economic parameters, which are used in the analysis. Reduction of these parameters within each category was performed using the principal component analysis method. Component interdependencies are established and relations between them are formed. The newly formed relations were used to create the input vectors of a neural network. A multilayer feed-forward neural network is formed and trained using batch training. Finally, simulation results are presented and it is concluded that the input data preparation method is an effective way of preprocessing neural network data. [Project of the Ministry of Science of the Republic of Serbia, nos. TR 35005, III 43007, and III 44006]
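
    A minimal sketch of the dimensionality-reduction step described above; the synthetic parameter matrix and the 95% variance threshold are assumptions (the paper reduces parameters per category with principal component analysis).

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(4)

        # Stand-in for 70 economic parameters observed over 500 periods
        X = rng.normal(size=(500, 70))

        # Keep enough components to explain 95% of the variance, then feed
        # the scores (not the raw parameters) to the forecasting network.
        pca = PCA(n_components=0.95)
        scores = pca.fit_transform(X)
        print(X.shape, "->", scores.shape)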

  4. Parallel finite elements with domain decomposition and its pre-processing

    International Nuclear Information System (INIS)

    Yoshida, A.; Yagawa, G.; Hamada, S.

    1993-01-01

    This paper describes a parallel finite element analysis using a domain decomposition method, and the pre-processing for the parallel calculation. Computer simulations are about to replace experiments in various fields, and the scale of the models to be simulated tends to be extremely large. On the other hand, the computational environment has changed drastically in recent years; in particular, parallel processing on massively parallel computers or computer networks is considered a promising technique. In order to achieve high efficiency in such parallel computation environments, large task granularity and a well-balanced workload distribution are key issues. It is also important to reduce the cost of pre-processing in such parallel FEM. From this point of view, the authors developed a domain decomposition FEM with an automatic and dynamic task-allocation mechanism and an automatic mesh generation/domain subdivision system for it. (author)

  5. Application of preprocessing filtering on Decision Tree C4.5 and rough set theory

    Science.gov (United States)

    Chan, Joseph C. C.; Lin, Tsau Y.

    2001-03-01

    This paper compares two artificial intelligence methods, the Decision Tree C4.5 and Rough Set Theory, on stock market data. The Decision Tree C4.5 is reviewed alongside the Rough Set Theory. An enhanced window application is developed to facilitate the pre-processing filtering by introducing feature (attribute) transformations, which allow users to input formulas and create new attributes. The application also produces three varieties of data set, with delaying, averaging, and summation. The results demonstrate the improvement that pre-processing with feature (attribute) transformations brings to Decision Tree C4.5. Moreover, the comparison between Decision Tree C4.5 and Rough Set Theory is based on clarity, automation, accuracy, dimensionality, raw data, and speed, and is supported by the rule sets generated by both algorithms on three different sets of data.

  6. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines.

    Science.gov (United States)

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J; Raboso, Mariano

    2015-06-17

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on a Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation (based on a Gaussian Mixture Model (GMM), to separate the person from the background), masking (to reduce the dimensions of the images) and binarization (to reduce the size of each image). An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.
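
    A minimal sketch of the GMM-based person/background segmentation followed by masking, binarization, and a linear SVM; the synthetic acoustic image, thresholds, and labels are illustrative assumptions, not the system's actual pipeline.

        import numpy as np
        from sklearn.mixture import GaussianMixture
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(5)

        # Synthetic "acoustic image": background plus a brighter person region
        img = rng.normal(0.0, 1.0, size=(32, 32))
        img[8:24, 10:22] += 3.0

        # Two-component GMM on pixel intensities separates person from background
        gmm = GaussianMixture(n_components=2, random_state=0).fit(img.reshape(-1, 1))
        person = gmm.means_.argmax()
        mask = gmm.predict(img.reshape(-1, 1)).reshape(img.shape) == person

        # Mask, binarize, flatten -> compact feature vector for a linear SVM
        features = (img * mask > 1.5).astype(float).reshape(1, -1)

        # Illustrative classifier over many such vectors (labels are fake)
        X = rng.normal(size=(40, features.size))
        y = rng.integers(0, 4, 40)
        clf = LinearSVC().fit(X, y)
        print(clf.predict(features))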

  7. ENDF/B Pre-Processing Codes: Implementing and testing on a Personal Computer

    International Nuclear Information System (INIS)

    McLaughlin, P.K.

    1987-05-01

    This document describes the contents of the diskettes containing the ENDF/B Pre-Processing codes by D.E. Cullen, and example data for use in implementing and testing these codes on a Personal Computer of the type IBM-PC/AT. Upon request the codes are available from the IAEA Nuclear Data Section, free of charge, on a series of 7 diskettes. (author)

  8. Effect of pre-processing on the physico-chemical properties of ...

    African Journals Online (AJOL)

    The findings indicated that the pre-processing treatments produced significant differences (p < 0.05) in protein (1.50 ± 0.18g/100g) and carbohydrate (1.09 ± 0.94g/100g) composition of the baking soda blanched milk sample. The viscosity of the baking soda blanched milk (18.91 ± 3.38cps) was significantly higher than that ...

  9. A clinical evaluation of the RNCA study using Fourier filtering as a preprocessing method

    Energy Technology Data Exchange (ETDEWEB)

    Robeson, W.; Alcan, K.E.; Graham, M.C.; Palestro, C.; Oliver, F.H.; Benua, R.S.

    1984-06-01

    Forty-one patients (25 male, 16 female) were studied by Radionuclide Cardangiography (RNCA) in our institution. There were 42 rest studies and 24 stress studies (66 studies total). Sixteen patients were normal, 15 had ASHD, seven had a cardiomyopathy, and three had left-sided valvular regurgitation. Each study was preprocessed using both the standard nine-point smoothing method and Fourier filtering. Amplitude and phase images were also generated. Both preprocessing methods were compared with respect to image quality, border definition, reliability and reproducibility of the LVEF, and cine wall motion interpretation. Image quality and border definition were judged superior by the consensus of two independent observers in 65 of 66 studies (98%) using Fourier filtered data. The LVEF differed between the two processes by greater than .05 in 17 of 66 studies (26%) including five studies in which the LVEF could not be determined using nine-point smoothed data. LV wall motion was normal by both techniques in all control patients by cine analysis. However, cine wall motion analysis using Fourier filtered data demonstrated additional abnormalities in 17 of 25 studies (68%) in the ASHD group, including three uninterpretable studies using nine-point smoothed data. In the cardiomyopathy/valvular heart disease group, ten of 18 studies (56%) had additional wall motion abnormalities using Fourier filtered data (including four uninterpretable studies using nine-point smoothed data). We conclude that Fourier filtering is superior to the nine-point smooth preprocessing method now in general use in terms of image quality, border definition, generation of an LVEF, and cine wall motion analysis. The advent of the array processor makes routine preprocessing by Fourier filtering a feasible technologic advance in the development of the RNCA study.
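
    Fourier filtering of gated time-activity curves is commonly implemented by truncating the Fourier series to a few harmonics; the sketch below illustrates that idea. The harmonic cutoff and the synthetic left-ventricular count curve are assumptions, and this is not the study's implementation.

        import numpy as np

        def fourier_filter(curve, n_harmonics=4):
            """Low-pass filter a cyclic time-activity curve by keeping only
            the first few Fourier harmonics (plus the DC term)."""
            spec = np.fft.rfft(curve)
            spec[n_harmonics + 1:] = 0.0
            return np.fft.irfft(spec, n=len(curve))

        rng = np.random.default_rng(6)
        t = np.linspace(0, 2 * np.pi, 32, endpoint=False)
        curve = 100 + 30 * np.cos(t) + rng.normal(0, 5, t.size)  # noisy counts
        smooth = fourier_filter(curve)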

  10. Review of Data Preprocessing Methods for Sign Language Recognition Systems based on Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Zorins Aleksejs

    2016-12-01

    Full Text Available The article presents an introductory analysis of relevant research topic for Latvian deaf society, which is the development of the Latvian Sign Language Recognition System. More specifically the data preprocessing methods are discussed in the paper and several approaches are shown with a focus on systems based on artificial neural networks, which are one of the most successful solutions for sign language recognition task.

  11. Evaluation of a Stereo Music Preprocessing Scheme for Cochlear Implant Users.

    Science.gov (United States)

    Buyens, Wim; van Dijk, Bas; Moonen, Marc; Wouters, Jan

    2018-01-01

    Although for most cochlear implant (CI) users good speech understanding is reached (at least in quiet environments), the perception and the appraisal of music are generally unsatisfactory. The improvement in music appraisal was evaluated in CI participants by using a stereo music preprocessing scheme implemented on a take-home device, in a comfortable listening environment. The preprocessing allowed adjusting the balance among vocals/bass/drums and other instruments, and was evaluated for different genres of music. The correlation between the preferred settings and the participants' speech and pitch detection performance was investigated. During the initial visit preceding the take-home test, the participants' speech-in-noise perception and pitch detection performance were measured, and a questionnaire about their music involvement was completed. The take-home device was provided, including the stereo music preprocessing scheme and seven playlists with six songs each. The participants were asked to adjust the balance by means of a turning wheel to make the music sound most enjoyable, and to repeat this three times for all songs. Twelve postlingually deafened CI users participated in the study. The data were collected by means of a take-home device, which preserved all the preferred settings for the different songs. Statistical analysis was done with a Friedman test (with post hoc Wilcoxon signed-rank test) to check the effect of "Genre." The correlations were investigated with Pearson's and Spearman's correlation coefficients. All participants preferred a balance significantly different from the original balance. Differences across participants were observed which could not be explained by perceptual abilities. An effect of "Genre" was found, showing significantly smaller preferred deviation from the original balance for Golden Oldies compared to the other genres. The stereo music preprocessing scheme showed an improvement in music appraisal with complex music and

  12. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2015-11-01

    The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems occurring in Matlab when trying to analyse this type of images. Moreover, new methods are discussed which provide the source code in Matlab that can be used in practice without any licensing restrictions. The proposed application and sample results of hyperspectral image analysis are also presented. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. A clinical evaluation of the RNCA study using Fourier filtering as a preprocessing method

    International Nuclear Information System (INIS)

    Robeson, W.; Alcan, K.E.; Graham, M.C.; Palestro, C.; Oliver, F.H.; Benua, R.S.

    1984-01-01

    Forty-one patients (25 male, 16 female) were studied by Radionuclide Cardangiography (RNCA) in our institution. There were 42 rest studies and 24 stress studies (66 studies total). Sixteen patients were normal, 15 had ASHD, seven had a cardiomyopathy, and three had left-sided valvular regurgitation. Each study was preprocessed using both the standard nine-point smoothing method and Fourier filtering. Amplitude and phase images were also generated. Both preprocessing methods were compared with respect to image quality, border definition, reliability and reproducibility of the LVEF, and cine wall motion interpretation. Image quality and border definition were judged superior by the consensus of two independent observers in 65 of 66 studies (98%) using Fourier filtered data. The LVEF differed between the two processes by greater than .05 in 17 of 66 studies (26%) including five studies in which the LVEF could not be determined using nine-point smoothed data. LV wall motion was normal by both techniques in all control patients by cine analysis. However, cine wall motion analysis using Fourier filtered data demonstrated additional abnormalities in 17 of 25 studies (68%) in the ASHD group, including three uninterpretable studies using nine-point smoothed data. In the cardiomyopathy/valvular heart disease group, ten of 18 studies (56%) had additional wall motion abnormalities using Fourier filtered data (including four uninterpretable studies using nine-point smoothed data). We conclude that Fourier filtering is superior to the nine-point smooth preprocessing method now in general use in terms of image quality, border definition, generation of an LVEF, and cine wall motion analysis. The advent of the array processor makes routine preprocessing by Fourier filtering a feasible technologic advance in the development of the RNCA study

  14. REMINDER Saved Leave Scheme (SLS) : Simplified procedure for the transfer of leave to saved leave accounts

    CERN Multimedia

    HR Division

    2001-01-01

    As part of the process of streamlining procedures, the HR and AS Divisions have jointly developed a system whereby annual and compensatory leave will henceforth be automatically transferred1) to saved leave accounts. Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days'2)Previously, every person taking part in the scheme has been individually issued with a form for the purposes of requesting the transfer of leave to the leave account and the transfer has then had to be done manually by HR Division. To streamline the procedure, unused leave of all those taking part in the saved leave scheme at the closure of the leave-year accounts will henceforth be transferred automatically to the saved leave account on that date. This simplification is in the interest of all parties concerned. This automatic transfer procedure has a number of advantages for participants in the SLS scheme. First, staff members will no longer have to take any administrative steps. Secondly, the new proced...

  15. Preprocessing with Photoshop Software on Microscopic Images of A549 Cells in Epithelial-Mesenchymal Transition.

    Science.gov (United States)

    Ren, Zhou-Xin; Yu, Hai-Bin; Shen, Jun-Ling; Li, Ya; Li, Jian-Sheng

    2015-06-01

    To establish a preprocessing method for cell morphometry in microscopic images of A549 cells in epithelial-mesenchymal transition (EMT). Adobe Photoshop CS2 (Adobe Systems, Inc.) was used for preprocessing the images. First, all images were processed for size uniformity and high distinguishability between the cell and background areas. Then, a blank image of the same size with grids was created, and the cross points of the grids were marked in a distinct color. The blank image was merged into a processed image. In the merged images, the cells containing 1 or more cross points were chosen, and the cell areas were then enclosed and filled in a distinct color. Except for the chosen cellular areas, all areas were changed to a uniform hue. Three observers quantified the roundness of cells in images with the image preprocessing (IPP) or without the method (Controls), respectively. Furthermore, 1 observer measured the roundness 3 times with each of the 2 methods. The results of IPPs and Controls were compared for repeatability and reproducibility. Compared with the Control method, the IPP method yielded, among the 3 observers, a higher number and a higher percentage of same-chosen cells in an image. The relative average deviation values of roundness, whether for 3 observers or 1 observer, were significantly higher in Controls than in IPPs (p < 0.05). With preprocessing in Photoshop, a cell chosen from an image was more objective, regular, and accurate, increasing the reproducibility and repeatability of morphometry of A549 cells in epithelial-to-mesenchymal transition.

  16. Characterizing the continuously acquired cardiovascular time series during hemodialysis, using median hybrid filter preprocessing noise reduction

    Directory of Open Access Journals (Sweden)

    Wilson S

    2015-01-01

    Full Text Available Scott Wilson,1,2 Andrea Bowyer,3 Stephen B Harrap4 1Department of Renal Medicine, The Alfred Hospital, 2Baker IDI, Melbourne, 3Department of Anaesthesia, Royal Melbourne Hospital, 4University of Melbourne, Parkville, VIC, Australia Abstract: The clinical characterization of cardiovascular dynamics during hemodialysis (HD) has important pathophysiological implications in terms of diagnostic, cardiovascular risk assessment, and treatment efficacy perspectives. Currently the diagnosis of significant intradialytic systolic blood pressure (SBP) changes among HD patients is imprecise and opportunistic, reliant upon the presence of hypotensive symptoms in conjunction with coincident but isolated noninvasive brachial cuff blood pressure (NIBP) readings. Considering hemodynamic variables as a time series makes a continuous recording approach more desirable than intermittent measures; however, in the clinical environment, the data signal is susceptible to corruption due to both impulsive and Gaussian-type noise. Signal preprocessing is an attractive solution to this problem. Prospectively collected continuous noninvasive SBP data over the short-break intradialytic period in ten patients was preprocessed using a novel median hybrid filter (MHF) algorithm and compared with 50 time-coincident pairs of intradialytic NIBP measures from routine HD practice. The median hybrid preprocessing technique for continuously acquired cardiovascular data yielded a dynamic regression without significant noise and artifact, suitable for high-level profiling of time-dependent SBP behavior. Signal accuracy is highly comparable with standard NIBP measurement, with the added clinical benefit of dynamic real-time hemodynamic information. Keywords: continuous monitoring, blood pressure

  17. Characterizing the continuously acquired cardiovascular time series during hemodialysis, using median hybrid filter preprocessing noise reduction.

    Science.gov (United States)

    Wilson, Scott; Bowyer, Andrea; Harrap, Stephen B

    2015-01-01

    The clinical characterization of cardiovascular dynamics during hemodialysis (HD) has important pathophysiological implications in terms of diagnostic, cardiovascular risk assessment, and treatment efficacy perspectives. Currently the diagnosis of significant intradialytic systolic blood pressure (SBP) changes among HD patients is imprecise and opportunistic, reliant upon the presence of hypotensive symptoms in conjunction with coincident but isolated noninvasive brachial cuff blood pressure (NIBP) readings. Considering hemodynamic variables as a time series makes a continuous recording approach more desirable than intermittent measures; however, in the clinical environment, the data signal is susceptible to corruption due to both impulsive and Gaussian-type noise. Signal preprocessing is an attractive solution to this problem. Prospectively collected continuous noninvasive SBP data over the short-break intradialytic period in ten patients was preprocessed using a novel median hybrid filter (MHF) algorithm and compared with 50 time-coincident pairs of intradialytic NIBP measures from routine HD practice. The median hybrid preprocessing technique for continuously acquired cardiovascular data yielded a dynamic regression without significant noise and artifact, suitable for high-level profiling of time-dependent SBP behavior. Signal accuracy is highly comparable with standard NIBP measurement, with the added clinical benefit of dynamic real-time hemodynamic information.
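
    One common form of median hybrid filtering is the FIR-median-hybrid structure, which takes the median of the current sample and the means of the windows on either side; the sketch below illustrates it. This structure and the synthetic SBP trace are assumptions, not necessarily the authors' MHF algorithm.

        import numpy as np

        def median_hybrid_filter(x, k=5):
            """FIR-median-hybrid smoother: median of the left-window mean,
            the current sample, and the right-window mean. The median rejects
            impulsive artifacts while the means average Gaussian noise."""
            x = np.asarray(x, dtype=float)
            y = x.copy()
            for i in range(k, len(x) - k):
                left = x[i - k:i].mean()
                right = x[i + 1:i + k + 1].mean()
                y[i] = np.median([left, x[i], right])
            return y

        rng = np.random.default_rng(7)
        sbp = 120 + rng.normal(0, 2, 500)
        sbp[100] += 60  # impulsive artifact
        print(median_hybrid_filter(sbp)[100])  # artifact largely suppressed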

  18. [Study of near infrared spectral preprocessing and wavelength selection methods for endometrial cancer tissue].

    Science.gov (United States)

    Zhao, Li-Ting; Xiang, Yu-Hong; Dai, Yin-Mei; Zhang, Zhuo-Yong

    2010-04-01

    Near infrared spectroscopy was applied to measure tissue slices of endometrial tissues and collect their spectra. A total of 154 spectra were obtained from 154 samples; the numbers of normal, hyperplasia, and malignant samples were 36, 60, and 58, respectively. Original near infrared spectra are composed of many variables, including interference such as instrument errors and physical effects like particle size and light scattering. In order to reduce these influences, the original spectral data should be treated with different spectral preprocessing methods to compress the variables and extract useful information; the methods of spectral preprocessing and wavelength selection therefore play an important role in the near infrared spectroscopy technique. In the present paper the raw spectra were processed using various preprocessing methods including the first derivative, multiplicative scatter correction, the Savitzky-Golay first derivative algorithm, the standard normal variate, smoothing, and the moving-window median. The standard deviation was used to select the optimal spectral region of 4 000-6 000 cm(-1). Principal component analysis was then used for classification. The principal component analysis results showed that the three types of samples could be discriminated completely, with an accuracy of almost 100%. This study demonstrated that near infrared spectroscopy technology combined with chemometric methods could be a fast, efficient, and novel means of diagnosing cancer. The proposed methods would be a promising and significant diagnostic technique for early stage cancer.

  19. 12 CFR 561.43 - Savings association.

    Science.gov (United States)

    2010-01-01

    ..., chartered under section 5 of the Act, or a building and loan, savings and loan, or homestead association, or... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Savings association. 561.43 Section 561.43... AFFECTING ALL SAVINGS ASSOCIATIONS § 561.43 Savings association. The term savings association means a...

  20. Incremental Learning of Medical Data for Multi-Step Patient Health Classification

    DEFF Research Database (Denmark)

    Kranen, Philipp; Müller, Emmanuel; Assent, Ira

    2010-01-01

    of textile sensors, body sensors and preprocessing techniques as well as the integration and merging of sensor data in electronic health record systems. Emergency detection on multiple levels will show the benefits of multi-step classification and further enhance the scalability of emergency detection...

  1. Reinforcement Learning and Savings Behavior*

    Science.gov (United States)

    Choi, James J.; Laibson, David; Madrian, Brigitte C.; Metrick, Andrew

    2009-01-01

    We show that individual investors over-extrapolate from their personal experience when making savings decisions. Investors who experience particularly rewarding outcomes from saving in their 401(k)—a high average and/or low variance return—increase their 401(k) savings rate more than investors who have less rewarding experiences with saving. This finding is not driven by aggregate time-series shocks, income effects, rational learning about investing skill, investor fixed effects, or time-varying investor-level heterogeneity that is correlated with portfolio allocations to stock, bond, and cash asset classes. We discuss implications for the equity premium puzzle and interventions aimed at improving household financial outcomes. PMID:20352013

  2. Overview of contractual savings institutions

    OpenAIRE

    Vittas, Dimitri; Skully, Michael

    1991-01-01

    Contractual savings institutions include national provident funds, life insurance companies, private pension funds, and funded social pension insurance systems. They have long-term liabilities and stable cash flows and are therefore ideal providers of term finance, not only to government and industry, but also to municipal authorities and the housing sector. Except for Singapore, Malaysia, and a few other countries, most developing countries have small and insignificant contractual savings in...

  3. The rise of corporate savings

    OpenAIRE

    Roc Armenter

    2012-01-01

    Over the past few decades, several developed economies have experienced large changes in how much households and firms save. In fact, a sharp increase in firms’ savings behavior has changed the net position of the (nonfinancial) corporate sector vis-à-vis the rest of the economy. ; Why have firms in the business of producing goods or services become lenders? This is quite at odds with traditional models of corporate finance, which suggest that firms issue debt and equity to fund their operati...

  4. Energy savings: persuasion and persistence

    Energy Technology Data Exchange (ETDEWEB)

    Eijadi, David; McDougall, Tom; Leaf, Kris; Douglas, Jim; Steinbock, Jason; Reimer, Paul [The Weidt Group, Minnetonka, MN (United States); Gauthier, Julia [Xcel Energy, Minneapolis, MN (United States); Wild, Doug; Richards McDaniel, Stephanie [BWBR Architects, Inc., Saint Paul, MN (United States)

    2005-07-01

    In this study, the architects, sponsoring utility and energy simulation specialist joined together to investigate the persistence of energy savings in three completed projects: a college library; a municipal transportation facility; and a hospital. The primary question is: 'How well did the design decisions made with the help of simulation analysis translate into building operations over several years?' Design simulation and metered performance data are compared for specific energy-saving strategies. The paper provides a brief overview of the basis for selecting the three projects, the energy design assistance methods employed and the decisions made, along with their savings expectations. For each case, design characteristics, modelling assumptions, selected strategies and actual metered performance are outlined. We find evidence of appropriate levels of energy conservation, but they are not the absolute values predicted. In each case, the discrepancies between modelling assumptions and final construction or operating procedures are identified, examined and rectified. The paper illustrates that while owners are saving energy, they are not always getting the full savings potential from what they install. The paper concludes with a re-examination of the overall process. It evaluates the potential for additional savings of individual technologies and related larger utility incentives to design teams and building owners.

  5. Application of stepping motor

    International Nuclear Information System (INIS)

    1980-10-01

    This book is divided into three parts covering the practical use of stepping motors. The first part has six chapters, covering stepping motors in general, their classification, basic theory, characteristics and basic terminology, the types and characteristics of hybrid stepping motors, and basic stepping motor control. The second part deals with applications: stepping motor control hardware, stepping motor control by microcomputer, and stepping motor control software. The last part covers the choice of a stepping motor system, examples of stepping motors, measurement of stepping motors, and practical cases of stepping motor application.

  6. Thresholding: A Pixel-Level Image Processing Methodology Preprocessing Technique for an OCR System for the Brahmi Script

    Directory of Open Access Journals (Sweden)

    H. K. Anasuya Devi

    2006-12-01

    Full Text Available In this paper we study the methodology employed for preprocessing the archaeological images. We present the various algorithms used in the low-level processing stage of image analysis for Optical Character Recognition System for Brahmi Script. The image preprocessing technique covered in this paper is thresholding. We also try to analyze the results obtained by the pixel-level processing algorithms.
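
    A minimal sketch of global thresholding via Otsu's criterion, a standard choice for document binarization; the synthetic gray-level data are an assumption, and the paper's specific thresholding algorithm may differ.

        import numpy as np

        def otsu_threshold(gray):
            """Global Otsu threshold: pick the gray level that maximizes the
            between-class variance of the foreground/background split."""
            hist, _ = np.histogram(gray, bins=256, range=(0, 256))
            p = hist / hist.sum()
            best_t, best_var = 0, 0.0
            for t in range(1, 256):
                w0, w1 = p[:t].sum(), p[t:].sum()
                if w0 == 0 or w1 == 0:
                    continue
                mu0 = (np.arange(t) * p[:t]).sum() / w0
                mu1 = (np.arange(t, 256) * p[t:]).sum() / w1
                var_between = w0 * w1 * (mu0 - mu1) ** 2
                if var_between > best_var:
                    best_var, best_t = var_between, t
            return best_t

        rng = np.random.default_rng(8)
        # Bimodal stand-in for a scanned page: dark ink plus light background
        page = np.concatenate([rng.normal(60, 15, 5000),
                               rng.normal(190, 20, 5000)]).clip(0, 255)
        t = otsu_threshold(page)
        binary = page >= t  # ink/background split at threshold t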

  7. Social Security and Saving: A Time-Series Econometrics Pedagogical Example (With "R" Code)

    Science.gov (United States)

    Wassell, Charles S., Jr.

    2018-01-01

    In 1974, and then again in 1996, Martin Feldstein published studies of the impact of the Social Security system on private saving in the U.S. economy. He found that Social Security depressed personal saving by a substantial amount--up to 50 percent. The author uses the Feldstein data and empirical models in this article to illustrate the steps in…

  8. 76 FR 16477 - General Reporting and Recordkeeping by Savings Associations and Savings and Loan Holding Companies

    Science.gov (United States)

    2011-03-23

    ... Savings Associations and Savings and Loan Holding Companies AGENCY: Office of Thrift Supervision (OTS...), 12 CFR 562.4 (audit of savings association, savings and loan holding company, or affiliate), 12 CFR... the savings association), 12 CFR 584.1(f) (books and records of each savings and loan holding company...

  9. Optimal production scheduling for energy efficiency improvement in biofuel feedstock preprocessing considering work-in-process particle separation

    International Nuclear Information System (INIS)

    Li, Lin; Sun, Zeyi; Yao, Xufeng; Wang, Donghai

    2016-01-01

    Biofuel is considered a promising alternative to traditional liquid transportation fuels. The large-scale substitution of biofuel can greatly enhance global energy security and mitigate greenhouse gas emissions. One major concern with the broad adoption of biofuel is the intensive energy consumption in biofuel manufacturing. This paper focuses on the energy efficiency improvement of biofuel feedstock preprocessing, a major process of cellulosic biofuel manufacturing. An improved scheme of feedstock preprocessing considering work-in-process particle separation is introduced to reduce energy waste and improve energy efficiency. A scheduling model based on the improved scheme is also developed to identify an optimal production schedule that can minimize the energy consumption of the feedstock preprocessing under a production target constraint. A numerical case study is used to illustrate the effectiveness of the proposed method. The research outcome is expected to improve the energy efficiency and enhance the environmental sustainability of biomass feedstock preprocessing. - Highlights: • A novel method to schedule production in the biofuel feedstock preprocessing process. • A systems modeling approach is used. • Capable of optimizing preprocessing to reduce energy waste and improve energy efficiency. • A numerical case is used to illustrate the effectiveness of the method. • Energy consumption per unit production can be significantly reduced.
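
    A minimal sketch of the kind of energy-minimizing schedule described above, posed as a linear program with a production-target constraint; the machine settings, rates, and target are illustrative assumptions, not the paper's model.

        import numpy as np
        from scipy.optimize import linprog

        # Illustrative numbers only: three grinder settings with different
        # throughputs (tons/hour) and energy intensities (kWh/hour).
        throughput = np.array([8.0, 12.0, 20.0])
        energy = np.array([90.0, 160.0, 320.0])
        hours_avail = np.array([16.0, 16.0, 16.0])
        target = 300.0  # tons to preprocess over the horizon

        # Minimize total energy subject to the production target and hour limits.
        res = linprog(c=energy,
                      A_ub=[-throughput],  # -sum(r_i * x_i) <= -target
                      b_ub=[-target],
                      bounds=list(zip(np.zeros(3), hours_avail)))
        print(res.x, res.fun)  # hours per setting, total kWh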

  10. Automated procedures for sizing aerospace vehicle structures /SAVES/

    Science.gov (United States)

    Giles, G. L.; Blackburn, C. L.; Dixon, S. C.

    1972-01-01

    Results from a continuing effort to develop automated methods for structural design are described. A system of computer programs presently under development called SAVES is intended to automate the preliminary structural design of a complete aerospace vehicle. Each step in the automated design process of the SAVES system of programs is discussed, with emphasis placed on use of automated routines for generation of finite-element models. The versatility of these routines is demonstrated by structural models generated for a space shuttle orbiter, an advanced technology transport, and a hydrogen-fueled Mach 3 transport. Illustrative numerical results are presented for the Mach 3 transport wing.

  11. EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data

    Science.gov (United States)

    D'Amico, Giuseppe; Amodeo, Aldo; Mattis, Ina; Freudenthaler, Volker; Pappalardo, Gelsomina

    2016-02-01

    In this paper we describe an automatic tool for the pre-processing of aerosol lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of ELPP, particular attention has been paid to make the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of ELPP is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of ELPP. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. ELPP has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
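
    As a hedged, numpy-only illustration of two corrections of the kind ELPP automates (not ELPP's actual code, and with invented parameter values), the sketch below applies a non-paralyzable dead-time correction and a far-range background subtraction:

        import numpy as np

        def deadtime_correct(count_rate_mhz, tau_ns=3.7):
            """Non-paralyzable dead-time correction: N = M / (1 - M * tau)."""
            tau_us = tau_ns * 1e-3   # MHz * microseconds is dimensionless
            return count_rate_mhz / (1.0 - count_rate_mhz * tau_us)

        def subtract_background(profile, bg_bins=500):
            """Estimate the background from the far-range tail and subtract it."""
            return profile - profile[-bg_bins:].mean()

        profile = np.exp(-np.linspace(0, 8, 2000)) + 0.05  # fake range profile + bg
        print(deadtime_correct(np.array([10.0, 50.0])))    # MHz in, MHz out
        print(subtract_background(profile)[:3])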

  12. Cloudy Solar Software - Enhanced Capabilities for Finding, Pre-processing, and Visualizing Solar Data

    Science.gov (United States)

    Istvan Etesi, Laszlo; Tolbert, K.; Schwartz, R.; Zarro, D.; Dennis, B.; Csillaghy, A.

    2010-05-01

    In our project "Extending the Virtual Solar Observatory (VSO)" we have combined some of the features available in Solar Software (SSW) to produce an integrated environment for data analysis, supporting the complete workflow from data location, retrieval, preparation, and analysis to creating publication-quality figures. Our goal is an integrated analysis experience in IDL, easy-to-use but flexible enough to allow more sophisticated procedures such as multi-instrument analysis. To that end, we have made the transition from a locally oriented setting where all the analysis is done on the user's computer, to an extended analysis environment where IDL has access to services available on the Internet. We have implemented a form of Cloud Computing that uses the VSO search and a new data retrieval and pre-processing server (PrepServer) that provides remote execution of instrument-specific data preparation. We have incorporated the interfaces to the VSO search and the PrepServer into an IDL widget (SHOW_SYNOP) that provides user-friendly searching and downloading of raw solar data and optionally sends search results for pre-processing to the PrepServer prior to downloading the data. The raw and pre-processed data can be displayed with our plotting suite, PLOTMAN, which can handle different data types (light curves, images, and spectra) and perform basic data operations such as zooming, image overlays, solar rotation, etc. PLOTMAN is highly configurable and suited for visual data analysis and for creating publishable figures. PLOTMAN and SHOW_SYNOP work hand-in-hand for a convenient working environment. Our environment supports a growing number of solar instruments that currently includes RHESSI, SOHO/EIT, TRACE, SECCHI/EUVI, HINODE/XRT, and HINODE/EIS.

  13. Fast randomized point location without preprocessing in two- and three-dimensional Delaunay triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, E.P.; Saias, I.; Zhu, B.

    1996-05-01

    This paper studies the point location problem in Delaunay triangulations without preprocessing and additional storage. The proposed procedure finds the query point simply by walking through the triangulation, after selecting a good starting point by random sampling. The analysis generalizes and extends a recent result for d = 2 dimensions by proving that this procedure takes expected time close to O(n^(1/(d+1))) for point location in Delaunay triangulations of n random points in d = 3 dimensions. Empirical results in both two and three dimensions show that this procedure is efficient in practice.
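
    A minimal 2-D rendering of the jump-and-walk idea, using scipy's Delaunay triangulation; the ~n^(1/3) sample size follows the d = 2 analysis, and degenerate cases (collinear points, queries exactly on edges) are ignored for brevity:

        import numpy as np
        from scipy.spatial import Delaunay

        def cross2(u, v):
            # z-component of the 2-D cross product (orientation test)
            return u[0] * v[1] - u[1] * v[0]

        def jump_and_walk(tri, q, rng, sample=None):
            pts = tri.points
            n = len(pts)
            if sample is None:
                sample = max(1, round(n ** (1.0 / 3.0)))  # ~n^(1/3) for d = 2
            # Jump: start from the sampled vertex closest to the query point q.
            cand = rng.choice(n, size=sample, replace=False)
            start = cand[np.argmin(np.linalg.norm(pts[cand] - q, axis=1))]
            s = int(tri.vertex_to_simplex[start])
            # Walk: repeatedly cross an edge that separates q from the triangle.
            while True:
                a, b, c = pts[tri.simplices[s]]
                for k, (p1, p2, opp) in enumerate(((b, c, a), (c, a, b), (a, b, c))):
                    if cross2(p2 - p1, q - p1) * cross2(p2 - p1, opp - p1) < 0:
                        s = int(tri.neighbors[s][k])  # neighbor opposite vertex k
                        break
                else:
                    return s      # no separating edge: q lies in this triangle
                if s == -1:
                    return -1     # walked out of the convex hull

        rng = np.random.default_rng(0)
        tri = Delaunay(rng.random((1000, 2)))
        print(jump_and_walk(tri, np.array([0.5, 0.5]), rng))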

  14. Preprocessing Raw Data in Clinical Medicine for a Data Mining Purpose

    Directory of Open Access Journals (Sweden)

    Peterková Andrea

    2016-12-01

    Full Text Available Dealing with data from the field of medicine is both topical and difficult. On a global scale, a large amount of medical data is produced on an everyday basis. For the purpose of our research, we understand medical data as data about patients, such as results from laboratory analyses, results from screening examinations (CT, ECHO) and clinical parameters. These data are usually in a raw format, difficult to understand, non-standard and not suitable for further processing or analysis. This paper aims to describe a possible method of preparing and preprocessing such raw medical data into a form where further analysis algorithms can be applied.
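
    As a hedged example of the kind of preparation described (the column names, units, and cleaning rules below are hypothetical, not from the paper), raw string-valued clinical fields can be normalized with pandas:

        # Illustrative cleanup of messy clinical fields into numeric columns.
        import pandas as pd

        raw = pd.DataFrame({
            "glucose": ["5.4", "n/a", "6,1"],   # mixed decimal separators, missing
            "echo_ef": ["60%", "55 %", None],   # units embedded in strings
        })

        clean = pd.DataFrame({
            # normalize decimal commas, coerce junk strings to NaN
            "glucose_mmol_l": pd.to_numeric(
                raw["glucose"].str.replace(",", ".", regex=False), errors="coerce"),
            # strip the percent sign and whitespace before conversion
            "echo_ef_pct": pd.to_numeric(
                raw["echo_ef"].str.replace("%", "", regex=False).str.strip(),
                errors="coerce"),
        })
        print(clean)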

  15. Classification-based comparison of pre-processing methods for interpretation of mass spectrometry generated clinical datasets

    Directory of Open Access Journals (Sweden)

    Hoefsloot Huub CJ

    2009-05-01

    Full Text Available Abstract Background Mass spectrometry is increasingly being used to discover proteins or protein profiles associated with disease. Experimental design of mass-spectrometry studies has come under close scrutiny and the importance of strict protocols for sample collection is now understood. However, the question of how best to process the large quantities of data generated is still unanswered. Main challenges for the analysis are the choice of proper pre-processing and classification methods. While these two issues have been investigated in isolation, we propose to use the classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Results Two in-house generated clinical SELDI-TOF MS datasets are used in this study as an example of high throughput mass-spectrometry data. We perform a systematic comparison of two commonly used pre-processing methods as implemented in Ciphergen ProteinChip Software and in the Cromwell package. With respect to reproducibility, Ciphergen and Cromwell pre-processing are largely comparable. We find that the overlap between peaks detected by either Ciphergen ProteinChip Software or Cromwell is large. This is especially the case for the more stringent peak detection settings. Moreover, similarity of the estimated intensities between matched peaks is high. We evaluate the pre-processing methods using five different classification methods. Classification is done in a double cross-validation protocol using repeated random sampling to obtain an unbiased estimate of classification accuracy. No pre-processing method significantly outperforms the other for all peak detection settings evaluated. Conclusion We use classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Both pre-processing methods lead to similar classification results on an ovarian cancer and a Gaucher disease dataset. However, the settings for pre-processing
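
    A compact sketch of a double cross-validation protocol with repeated random sampling, in the spirit of the procedure described above; the synthetic data, SVM classifier, and parameter grid are placeholders:

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import (GridSearchCV, StratifiedShuffleSplit,
                                             cross_val_score)
        from sklearn.svm import SVC

        # Placeholder data standing in for peak intensities of patient samples.
        X, y = make_classification(n_samples=120, n_features=50, random_state=0)

        inner = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=5)      # inner loop tunes C
        outer = StratifiedShuffleSplit(n_splits=20, test_size=0.2,  # outer loop:
                                       random_state=0)              # repeated sampling
        scores = cross_val_score(inner, X, y, cv=outer)             # double-CV accuracy
        print(scores.mean(), scores.std())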

  16. Step out - Step in Sequencing Games

    NARCIS (Netherlands)

    Musegaas, M.; Borm, P.E.M.; Quant, M.

    2014-01-01

    In this paper a new class of relaxed sequencing games is introduced: the class of Step out - Step in sequencing games. In this relaxation any player within a coalition is allowed to step out from his position in the processing order and to step in at any position later in the processing order.

  17. Step out-step in sequencing games

    NARCIS (Netherlands)

    Musegaas, Marieke; Borm, Peter; Quant, Marieke

    2015-01-01

    In this paper a new class of relaxed sequencing games is introduced: the class of Step out–Step in sequencing games. In this relaxation any player within a coalition is allowed to step out from his position in the processing order and to step in at any position later in the processing order. First,

  18. Control Evaluation Information System Savings

    Directory of Open Access Journals (Sweden)

    Eddy Sutedjo

    2011-05-01

    Full Text Available The purpose of this research is to evaluate the control of information system savings in banking and to identify the weaknesses and problems found in those savings systems. The research methods used are literature studies, collecting the data and information needed, and field studies consisting of interviews, observation, questionnaires, and checklists, using the COBIT method as a standard to assess the information system control of the company. The expected result is an evaluation report that presents the problems found and the recommendations given, providing a view of the controls implemented by the company. The conclusion drawn from this research is that this banking company has met the standards, although some weaknesses still exist in the system. Index Terms - Control Information System, Savings

  19. Are Women Empowered to Save?

    Directory of Open Access Journals (Sweden)

    Frances Woolley

    2013-12-01

    Full Text Available Female economic empowerment – rising earnings, increased opportunities, greater labour force participation – has given many women the means to save. The shifting of responsibility for retirement security from employers and governments onto individuals has given women a reason to save. But are women actually saving? In this paper, we explore the relationship between the gender dynamics within a family and the accumulation of wealth. We find little evidence in support of the conventional wisdom that families with a female financial manager save more and repay their debts more often. We find some evidence that male financial management leads to greater savings, and other evidence suggesting that savings patterns have a complex relationship with intra-family gender dynamics.

  20. Energy supply and energy saving in Ukraine

    Directory of Open Access Journals (Sweden)

    V.M. Ilchenko

    2015-09-01

    Full Text Available The article examines the main problems and solutions of energy saving and energy supply in Ukraine. Low energy efficiency has become one of the main factors of the crisis in the Ukrainian economy. The most relevant scientific and methodical approaches to assessing the level of energy consumption and saving are indicated. A comparative analysis of annual energy use has been made. The potential to solve energy supply problems is strongly correlated with the ability to ensure the innovative development of the economy towards efficient and economical use of existing and imported energy resources. Ways of reducing energy resource consumption are suggested, and the creation of technological conditions for the use of alternative energy sources is also considered rational. The development of renewable energy sources (alternative and renewable energy sources) will provide a significant effect in reducing the use of traditional energy sources and the emission of harmful substances and greenhouse gases. Under these conditions, increasing the energy efficiency of the economy and its competitiveness becomes realistic. Improvement of the environmental and social conditions of the country's citizens will mark a positive step towards the EU and will also resolve some problems for future generations.

  1. Saving Face and Group Identity

    DEFF Research Database (Denmark)

    Eriksson, Tor; Mao, Lei; Villeval, Marie-Claire

    2015-01-01

    Are people willing to sacrifice resources to save one's and others' face? In a laboratory experiment, we study whether individuals forego resources to avoid the public exposure of the least performer in their group. We show that a majority of individuals are willing to pay to preserve not only their self- but also other group members' image. This behavior is frequent even in the absence of group identity. When group identity is more salient, individuals help regardless of whether the least performer is an in-group or an out-group. This suggests that saving others' face is a strong social norm.

  2. Improved methods to evaluate realised energy savings

    NARCIS (Netherlands)

    Boonekamp, P.G.M.

    2005-01-01

    This thesis regards the calculation of realised energy savings at national and sectoral level, and the policy contribution to total savings. It is observed that the results of monitoring and evaluation studies on realised energy savings are hardly applied in energy saving policy. Causes are the lack

  3. Effects of Preprocessing on Multi-Direction Properties of Aluminum Alloy Cold-Spray Deposits

    Science.gov (United States)

    Rokni, M. R.; Nardi, A. T.; Champagne, V. K.; Nutt, S. R.

    2018-05-01

    The effects of powder preprocessing (degassing at 400 °C for 6 h) on the microstructure and mechanical properties of 5056 aluminum deposits produced by high-pressure cold spray were investigated. To investigate the directionality of the mechanical properties, microtensile coupons were excised from different directions of the deposit, i.e., longitudinal, short transverse, long transverse, and diagonal, and then tested. The results were compared to the properties of wrought 5056 and of the coating deposited with as-received 5056 Al powder, and correlated with the observed microstructures. Preprocessing softened the particles and eliminated the pores within them, resulting in more extensive and uniform deformation upon impact with the substrate and with underlying deposited material. Microstructural characterization and finite element simulation indicated that upon particle impact, the peripheral regions experienced more extensive deformation and higher temperatures than the central contact zone. This led to more recrystallization and stronger bonding at peripheral regions relative to the contact zone area and yielded superior properties in the longitudinal direction compared with the short transverse direction. Fractography revealed that crack propagation takes place along the particle-particle interfaces in the transverse directions (caused by insufficient bonding and recrystallization), whereas fracture through the deposited particles is dominant in the longitudinal direction.

  4. A review of blood sample handling and pre-processing for metabolomics studies.

    Science.gov (United States)

    Hernandes, Vinicius Veri; Barbas, Coral; Dudzik, Danuta

    2017-09-01

    Metabolomics has been found to be applicable to a wide range of clinical studies, bringing a new era for improving clinical diagnostics, early disease detection, therapy prediction and treatment efficiency monitoring. A major challenge in metabolomics, particularly untargeted studies, is the extremely diverse and complex nature of biological specimens. Despite great advances in the field there still exist fundamental needs for considering pre-analytical variability that can introduce bias to the subsequent analytical process and decrease the reliability of the results and moreover confound final research outcomes. Many researchers are mainly focused on the instrumental aspects of the biomarker discovery process, and sample related variables sometimes seem to be overlooked. To bridge the gap, critical information and standardized protocols regarding experimental design and sample handling and pre-processing are highly desired. Characterization of the range of variation among sample collection methods is necessary to prevent results misinterpretation and to ensure that observed differences are not due to an experimental bias caused by inconsistencies in sample processing. Herein, a systematic discussion of pre-analytical variables affecting metabolomics studies based on blood derived samples is performed. Furthermore, we provide a set of recommendations concerning experimental design, collection, pre-processing procedures and storage conditions as a practical review that can guide and serve for the standardization of protocols and reduction of undesirable variation.

  5. Influence of Averaging Preprocessing on Image Analysis with a Markov Random Field Model

    Science.gov (United States)

    Sakamoto, Hirotaka; Nakanishi-Ohno, Yoshinori; Okada, Masato

    2018-02-01

    This paper describes our investigations into the influence of averaging preprocessing on the performance of image analysis. Averaging preprocessing involves a trade-off: image averaging is often undertaken to reduce noise while the number of image data available for image analysis is decreased. We formulated a process of generating image data by using a Markov random field (MRF) model to achieve image analysis tasks such as image restoration and hyper-parameter estimation by a Bayesian approach. According to the notions of Bayesian inference, posterior distributions were analyzed to evaluate the influence of averaging. There are three main results. First, we found that the performance of image restoration with a predetermined value for hyper-parameters is invariant regardless of whether averaging is conducted. We then found that the performance of hyper-parameter estimation deteriorates due to averaging. Our analysis of the negative logarithm of the posterior probability, which is called the free energy based on an analogy with statistical mechanics, indicated that the confidence of hyper-parameter estimation remains higher without averaging. Finally, we found that when the hyper-parameters are estimated from the data, the performance of image restoration worsens as averaging is undertaken. We conclude that averaging adversely influences the performance of image analysis through hyper-parameter estimation.
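
    A tiny numeric illustration of the trade-off studied above: averaging K noisy images cuts the noise variance by a factor of K, while leaving fewer independent images for subsequent estimation.

        import numpy as np

        rng = np.random.default_rng(0)
        truth = np.zeros(10_000)
        noisy = truth + rng.normal(0.0, 1.0, size=(8, 10_000))  # 8 noisy "images"

        single = noisy[0]
        averaged = noisy.mean(axis=0)
        print(single.var(), averaged.var())  # ~1.0 versus ~1/8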

  6. Statistical Downscaling Output GCM Modeling with Continuum Regression and Pre-Processing PCA Approach

    Directory of Open Access Journals (Sweden)

    Sutikno Sutikno

    2010-08-01

    Full Text Available One of the climate models used to predict climatic conditions is the Global Circulation Model (GCM). GCM is a computer-based model that consists of different equations. It uses numerical, deterministic equations that follow the rules of physics. GCM is a main tool for predicting climate and weather, and it is also used as a primary information source for reviewing climate change effects. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM with the small scale of the study area. GCM data are spatial and temporal data in which spatial correlation between different data on the grid in a single domain is likely to occur. Multicollinearity problems require pre-processing of the predictor data X. Continuum Regression (CR) combined with Principal Component Analysis (PCA) pre-processing is an alternative for SD modelling. CR is a method developed by Stone and Brooks (1990). This method is a generalization of the Ordinary Least Squares (OLS), Principal Component Regression (PCR) and Partial Least Squares (PLS) methods, used to overcome multicollinearity problems. Data processing for the stations in Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that the RMSEP and predictive R2 values in the 8x8 and 12x12 domains obtained with the CR method are better than those obtained with PCR and PLS.
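
    Continuum Regression itself has no scikit-learn implementation; as a hedged stand-in, the sketch below shows the PCA pre-processing plus regression (i.e., PCR, one of the methods the paper compares against) on synthetic, deliberately collinear grid data:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 64))            # 8x8 GCM grid, flattened
        X[:, 1:] += 0.95 * X[:, [0]]              # induce multicollinearity
        y = X[:, 0] * 2.0 + rng.normal(size=200)  # synthetic station response

        # PCA de-correlates the grid cells before the regression step.
        pcr = make_pipeline(PCA(n_components=5), LinearRegression())
        pcr.fit(X[:150], y[:150])
        print(pcr.score(X[150:], y[150:]))        # R^2 on held-out data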

  7. 3-D image pre-processing algorithms for improved automated tracing of neuronal arbors.

    Science.gov (United States)

    Narayanaswamy, Arunachalam; Wang, Yu; Roysam, Badrinath

    2011-09-01

    The accuracy and reliability of automated neurite tracing systems is ultimately limited by image quality as reflected in the signal-to-noise ratio, contrast, and image variability. This paper describes a novel combination of image processing methods that operate on images of neurites captured by confocal and widefield microscopy, and produce synthetic images that are better suited to automated tracing. The algorithms are based on the curvelet transform (for denoising curvilinear structures and local orientation estimation), perceptual grouping by scalar voting (for elimination of non-tubular structures and improvement of neurite continuity while preserving branch points), adaptive focus detection, and depth estimation (for handling widefield images without deconvolution). The proposed methods are fast, and capable of handling large images. Their ability to handle images of unlimited size derives from automated tiling of large images along the lateral dimension, and processing of 3-D images one optical slice at a time. Their speed derives in part from the fact that the core computations are formulated in terms of the Fast Fourier Transform (FFT), and in part from parallel computation on multi-core computers. The methods are simple to apply to new images since they require very few adjustable parameters, all of which are intuitive. Examples of pre-processing DIADEM Challenge images are used to illustrate improved automated tracing resulting from our pre-processing methods.

  8. Safe and sensible preprocessing and baseline correction of pupil-size data.

    Science.gov (United States)

    Mathôt, Sebastiaan; Fabius, Jasper; Van Heusden, Elle; Van der Stigchel, Stefan

    2018-02-01

    Measurement of pupil size (pupillometry) has recently gained renewed interest from psychologists, but there is little agreement on how pupil-size data is best analyzed. Here we focus on one aspect of pupillometric analyses: baseline correction, i.e., analyzing changes in pupil size relative to a baseline period. Baseline correction is useful in experiments that investigate the effect of some experimental manipulation on pupil size. In such experiments, baseline correction improves statistical power by taking into account random fluctuations in pupil size over time. However, we show that baseline correction can also distort data if unrealistically small pupil sizes are recorded during the baseline period, which can easily occur due to eye blinks, data loss, or other distortions. Divisive baseline correction (corrected pupil size = pupil size/baseline) is affected more strongly by such distortions than subtractive baseline correction (corrected pupil size = pupil size - baseline). We discuss the role of baseline correction as a part of preprocessing of pupillometric data, and make five recommendations: (1) before baseline correction, perform data preprocessing to mark missing and invalid data, but assume that some distortions will remain in the data; (2) use subtractive baseline correction; (3) visually compare your corrected and uncorrected data; (4) be wary of pupil-size effects that emerge faster than the latency of the pupillary response allows (within ±220 ms after the manipulation that induces the effect); and (5) remove trials on which baseline pupil size is unrealistically small (indicative of blinks and other distortions).
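
    The two corrections compared above are one-liners; the toy trace below has a blink-contaminated baseline window and shows how the distortion propagates additively under subtraction and multiplicatively under division:

        import numpy as np

        baseline_window = np.array([4200.0, 4150.0, 90.0])  # last value: blink artifact
        trial = np.array([4300.0, 4350.0, 4400.0])          # pupil size after stimulus

        baseline = baseline_window.mean()                   # dragged down by the blink
        subtractive = trial - baseline                      # corrected = size - baseline
        divisive = trial / baseline                         # corrected = size / baseline
        print(baseline, subtractive, divisive)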

  9. THE IMAGE REGISTRATION OF FOURIER-MELLIN BASED ON THE COMBINATION OF PROJECTION AND GRADIENT PREPROCESSING

    Directory of Open Access Journals (Sweden)

    D. Gao

    2017-09-01

    Full Text Available Image registration is one of the most important applications in the field of image processing. The Fourier-Mellin transform method, which has the advantages of high precision and good robustness to changes in light and shade, partial occlusion, noise influence and so on, is widely used. However, this method cannot obtain a unique cross-power-spectrum pulse function for non-parallel image pairs, and for some image pairs no pulse can be obtained at all. In this paper, an image registration method based on the Fourier-Mellin transform with projection and gradient preprocessing is proposed. According to the projection conformation equation, the method calculates the image projection transformation matrix to correct the tilted image; then, gradient preprocessing and the Fourier-Mellin transform are performed on the corrected image to obtain the registration parameters. The experimental results show that the method makes Fourier-Mellin-based image registration applicable not only to parallel image pairs but also to non-parallel image pairs, and that a better registration effect can be obtained.
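
    As a hedged illustration of one Fourier building block of such pipelines (not the authors' code), phase correlation normalizes the cross-power spectrum to a pure phase term whose inverse transform is a pulse at the translation between two images:

        import numpy as np

        def phase_correlation(img_a, img_b):
            FA, FB = np.fft.fft2(img_a), np.fft.fft2(img_b)
            cross_power = FA * np.conj(FB)
            cross_power /= np.abs(cross_power) + 1e-12   # keep phase only
            pulse = np.fft.ifft2(cross_power).real
            return np.unravel_index(np.argmax(pulse), pulse.shape)

        a = np.zeros((64, 64))
        a[10:20, 10:20] = 1.0
        b = np.roll(np.roll(a, 5, axis=0), 7, axis=1)    # shift a by (5, 7)
        print(phase_correlation(b, a))                   # -> (5, 7)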

  10. A Technical Review on Biomass Processing: Densification, Preprocessing, Modeling and Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shankar Tumuluru; Christopher T. Wright

    2010-06-01

    It is now a well-acclaimed fact that burning fossil fuels and deforestation are major contributors to climate change. Biomass from plants can serve as an alternative renewable and carbon-neutral raw material for the production of bioenergy. Low densities of 40–60 kg/m3 for lignocellulosic and 200–400 kg/m3 for woody biomass limit their application for energy purposes. Prior to use in energy applications these materials need to be densified. The densified biomass can have bulk densities over 10 times those of the raw material, helping to significantly reduce the technical limitations associated with storage, loading and transportation. Pelleting, briquetting, or extrusion processing are commonly used methods for densification. The aim of the present research is to develop a comprehensive review of biomass processing that includes densification, preprocessing, modeling and optimization. The specific objectives include carrying out a technical review on (a) mechanisms of particle bonding during densification; (b) methods of densification including extrusion, briquetting, pelleting, and agglomeration; (c) effects of process and feedstock variables and biomass biochemical composition on densification; (d) effects of preprocessing such as grinding, preheating, steam explosion, and torrefaction on biomass quality and binding characteristics; (e) models for understanding the compression characteristics; and (f) procedures for response surface modeling and optimization.

  11. The effects of pre-processing strategies in sentiment analysis of online movie reviews

    Science.gov (United States)

    Zin, Harnani Mat; Mustapha, Norwati; Murad, Masrah Azrifah Azmi; Sharef, Nurfadhlina Mohd

    2017-10-01

    With the ever-increasing number of internet applications and social networking sites, people nowadays can easily express their feelings towards any product or service. These online reviews act as an important source for further analysis and improved decision making. The reviews are mostly unstructured by nature and thus need processing, such as sentiment analysis and classification, to provide meaningful information for future use. In text analysis tasks, the appropriate selection of words/features has a huge impact on the effectiveness of the classifier. Thus, this paper explores the effect of pre-processing strategies on the sentiment analysis of online movie reviews. In this paper, a supervised machine learning method was used to classify the reviews. The support vector machine (SVM) with linear and non-linear kernels has been considered as the classifier for the classification of the reviews. The performance of the classifier is critically examined based on the results of precision, recall, f-measure, and accuracy. Two different feature representations were used: term frequency and term frequency-inverse document frequency. Results show that the pre-processing strategies have a significant impact on the classification process.
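
    A minimal sketch of the comparison described above: the same linear SVM fed by term-frequency versus tf-idf features; the two-review corpus is a toy placeholder:

        from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import LinearSVC

        reviews = ["a moving, brilliant film", "dull plot and wooden acting"]
        labels = [1, 0]                                    # 1 = positive, 0 = negative

        for vec in (CountVectorizer(stop_words="english"),   # term frequency
                    TfidfVectorizer(stop_words="english")):  # tf-idf weighting
            clf = make_pipeline(vec, LinearSVC())
            clf.fit(reviews, labels)
            print(type(vec).__name__, clf.predict(["brilliant acting"]))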

  12. Revolving fund for energy saving

    International Nuclear Information System (INIS)

    Prebensen, K.

    1993-01-01

    A key issue in Eastern Europe is the adjustment of prices from the former COMECON level to a level conforming with free market conditions. In the case of household heating, this issue involves the removal of government subsidies leading to sharply increasing prices, metering of individual consumption, and improving the efficiency of energy production, distribution and use - where savings of 30-50% in each link are technically feasible - thereby providing a potential for adapting consumption patterns to higher energy prices, provided that funds are available. Currently, investment in commercial heat production and distribution systems has received substantial international support - whereas investment in reduction of demand has been little exploited. The Revolving Fund for Energy Savings in Polish Households is a concept for efficient financing of small-scale projects. It aims at financing on the level of housing cooperatives, on the basis of a simplified lending and project evaluation procedure, well suited to current Polish conditions of an organizationally and financially weak banking system and little-developed technical knowledge in the field of energy saving. A general introduction to the issue is given and technical problems are elaborated. The implementation of energy savings in housing seen from the banking point of view, and a current pilot scheme for financing, are described. (AB)

  13. Social Capital and Savings Behaviour

    DEFF Research Database (Denmark)

    Newman, Carol; Tarp, Finn; Van Den Broeck, Katleen

    We explore the extent to which social capital can play a role in imparting information about the returns to saving where potential knowledge gaps and mistrust exists. Using data from Vietnam we find strong evidence to support the hypothesis that information transmitted via reputable social...

  14. Save the Boulders Beach Penguins

    Science.gov (United States)

    Sheerer, Katherine; Schnittka, Christine

    2012-01-01

    Maybe it's the peculiar way they walk or their cute little suits, but students of all ages are drawn to penguins. To meet younger students' curiosity, the authors adapted a middle-school level, penguin-themed curriculum unit called Save the Penguins (Schnittka, Bell, and Richards 2010) for third-grade students. The students loved learning about…

  15. Comparison of planar images and SPECT with bayesean preprocessing for the demonstration of facial anatomy and craniomandibular disorders

    International Nuclear Information System (INIS)

    Kircos, L.T.; Ortendahl, D.A.; Hattner, R.S.; Faulkner, D.; Taylor, R.L.

    1984-01-01

    Craniomandibular disorders involving the facial anatomy may be difficult to demonstrate in planar images. Although bone scanning is generally more sensitive than radiography, facial bone anatomy is complex and focal areas of increased or decreased radiotracer may become obscured by overlapping structures in planar images. Thus SPECT appears ideally suited to examination of the facial skeleton. A series of patients with craniomandibular disorders of unknown origin were imaged using 20 mCi Tc-99m MDP. Planar and SPECT (Siemens 7500 ZLC Orbiter) images were obtained four hours after injection. The SPECT images were reconstructed with a filtered back-projection algorithm. In order to improve image contrast and resolution in SPECT images, the rotation views were pre-processed with a Bayesian deblurring algorithm which has previously been shown to offer improved contrast and resolution in planar images. SPECT images using the pre-processed rotation views were obtained and compared to the SPECT images without pre-processing and the planar images. TMJ arthropathy involving either the glenoid fossa or the mandibular condyle, orthopedic changes involving the mandible or maxilla, localized dental pathosis, as well as changes in structures peripheral to the facial skeleton were identified. Bayesian pre-processed SPECT depicted the facial skeleton more clearly, as well as providing a more obvious demonstration of the bony changes associated with craniomandibular disorders, than either planar images or SPECT without pre-processing

  16. 76 FR 31680 - General Reporting and Recordkeeping by Savings Associations and Savings and Loan Holding Companies

    Science.gov (United States)

    2011-06-01

    ... Savings Associations and Savings and Loan Holding Companies AGENCY: Office of Thrift Supervision (OTS... Savings and Loan Holding Companies. OMB Number: 1550-0011. Form Number: N/A. Description: This information...), 12 CFR 562.4 (audit of savings association, savings and loan holding company, or affiliate), 12 CFR...

  17. Clinical data miner: an electronic case report form system with integrated data preprocessing and machine-learning libraries supporting clinical diagnostic model research.

    Science.gov (United States)

    Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk

    2014-10-20

    Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the study of Clinical Data Miner (CDM) software framework is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries

  18. The Python Spectral Analysis Tool (PySAT): A Powerful, Flexible, Preprocessing and Machine Learning Library and Interface

    Science.gov (United States)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.

    2017-12-01

    Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. The
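
    As a hedged sketch of the kind of pipeline the tool wraps (not PySAT's actual API), the snippet below normalizes synthetic spectra and fits a PLS regression from spectra to a composition value:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        spectra = rng.random((40, 300))                        # 40 spectra, 300 channels
        comp = spectra[:, 50] * 3.0 + rng.normal(0, 0.05, 40)  # synthetic composition

        # Total-intensity normalization, a common preprocessing step.
        norm = spectra / spectra.sum(axis=1, keepdims=True)

        pls = PLSRegression(n_components=5)
        pls.fit(norm[:30], comp[:30])
        print(pls.predict(norm[30:]).ravel()[:3])              # held-out predictions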

  19. Large Hospital 50% Energy Savings: Technical Support Document

    Energy Technology Data Exchange (ETDEWEB)

    Bonnema, E.; Studer, D.; Parker, A.; Pless, S.; Torcellini, P.

    2010-09-01

    This Technical Support Document documents the technical analysis and design guidance for large hospitals to achieve whole-building energy savings of at least 50% over ANSI/ASHRAE/IESNA Standard 90.1-2004 and represents a step toward determining how to provide design guidance for aggressive energy savings targets. This report documents the modeling methods used to demonstrate that the design recommendations meet or exceed the 50% goal. EnergyPlus was used to model the predicted energy performance of the baseline and low-energy buildings to verify that 50% energy savings are achievable. Percent energy savings are based on a nominal minimally code-compliant building and whole-building, net site energy use intensity. The report defines architectural-program characteristics for typical large hospitals, thereby defining a prototype model; creates baseline energy models for each climate zone that are elaborations of the prototype models and are minimally compliant with Standard 90.1-2004; creates a list of energy design measures that can be applied to the prototype model to create low-energy models; uses industry feedback to strengthen inputs for baseline energy models and energy design measures; and simulates low-energy models for each climate zone to show that when the energy design measures are applied to the prototype model, 50% energy savings (or more) are achieved.

  20. Nuclear data for fusion: Validation of typical pre-processing methods for radiation transport calculations

    International Nuclear Information System (INIS)

    Hutton, T.; Sublet, J.C.; Morgan, L.; Leadbeater, T.W.

    2015-01-01

    Highlights: • We quantify the effect of processing nuclear data from ENDF to ACE format. • We consider the differences between fission and fusion angular distributions. • C-nat(n,el) at 2.0 MeV has a 0.6% deviation between original and processed data. • Fe-56(n,el) at 14.1 MeV has a 11.0% deviation between original and processed data. • Processed data do not accurately depict ENDF distributions for fusion energies. - Abstract: Nuclear data form the basis of the radiation transport codes used to design and simulate the behaviour of nuclear facilities, such as the ITER and DEMO fusion reactors. Typically these data and codes are biased towards fission and high-energy physics applications yet are still applied to fusion problems. With increasing interest in fusion applications, the lack of fusion specific codes and relevant data libraries is becoming increasingly apparent. Industry standard radiation transport codes require pre-processing of the evaluated data libraries prior to use in simulation. Historically these methods focus on speed of simulation at the cost of accurate data representation. For legacy applications this has not been a major concern, but current fusion needs differ significantly. Pre-processing reconstructs the differential and double differential interaction cross sections with a coarse binned structure, or more recently as a tabulated cumulative distribution function. This work looks at the validity of applying these processing methods to data used in fusion specific calculations in comparison to fission. The relative effects of applying this pre-processing mechanism to both fission and fusion relevant reaction channels are demonstrated, and as such the poor representation of these distributions for the fusion energy regime. For the C-nat(n,el) reaction at 2.0 MeV, the binned differential cross section deviates from the original data by 0.6% on average. For the Fe-56(n,el) reaction at 14.1 MeV, the deviation increases to 11.0%. We

  1. Bayesian Optimization for Neuroimaging Pre-processing in Brain Age Classification and Prediction

    Directory of Open Access Journals (Sweden)

    Jenessa Lancaster

    2018-02-01

    Full Text Available Neuroimaging-based age prediction using machine learning is proposed as a biomarker of brain aging, relating to cognitive performance, health outcomes and progression of neurodegenerative disease. However, even leading age-prediction algorithms contain measurement error, motivating efforts to improve experimental pipelines. T1-weighted MRI is commonly used for age prediction, and the pre-processing of these scans involves normalization to a common template and resampling to a common voxel size, followed by spatial smoothing. Resampling parameters are often selected arbitrarily. Here, we sought to improve brain-age prediction accuracy by optimizing resampling parameters using Bayesian optimization. Using data on N = 2003 healthy individuals (aged 16–90 years) we trained support vector machines to (i) distinguish between young (<22 years) and old (>50 years) brains (classification) and (ii) predict chronological age (regression). We also evaluated generalisability of the age-regression model to an independent dataset (CamCAN, N = 648, aged 18–88 years). Bayesian optimization was used to identify optimal voxel size and smoothing kernel size for each task. This procedure adaptively samples the parameter space to evaluate accuracy across a range of possible parameters, using independent sub-samples to iteratively assess different parameter combinations to arrive at optimal values. When distinguishing between young and old brains a classification accuracy of 88.1% was achieved (optimal voxel size = 11.5 mm3, smoothing kernel = 2.3 mm). For predicting chronological age, a mean absolute error (MAE) of 5.08 years was achieved (optimal voxel size = 3.73 mm3, smoothing kernel = 3.68 mm). This was compared to performance using default values of 1.5 mm3 and 4 mm respectively, resulting in MAE = 5.48 years, though this 7.3% improvement was not statistically significant. When assessing generalisability, best performance was achieved when applying the entire Bayesian
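
    A minimal sketch of the optimization idea using scikit-optimize's Gaussian-process minimizer; evaluate_pipeline is a hypothetical stand-in for the full resample-smooth-train loop, replaced here by a toy surrogate with a known optimum:

        from skopt import gp_minimize

        def evaluate_pipeline(params):
            voxel_mm, fwhm_mm = params
            # ... resample to voxel_mm, smooth with fwhm_mm, train the
            # age-prediction model, return its cross-validated MAE ...
            return (voxel_mm - 3.7) ** 2 + (fwhm_mm - 3.7) ** 2  # toy surrogate

        result = gp_minimize(evaluate_pipeline,
                             dimensions=[(1.0, 12.0),   # voxel size, mm
                                         (0.0, 12.0)],  # smoothing FWHM, mm
                             n_calls=25, random_state=0)
        print(result.x, result.fun)  # best (voxel, FWHM) and its error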

  2. Data Pre-Processing Method to Remove Interference of Gas Bubbles and Cell Clusters During Anaerobic and Aerobic Yeast Fermentations in a Stirred Tank Bioreactor

    Science.gov (United States)

    Princz, S.; Wenzel, U.; Miller, R.; Hessling, M.

    2014-11-01

    One aerobic and four anaerobic batch fermentations of the yeast Saccharomyces cerevisiae were conducted in a stirred bioreactor and monitored inline by NIR spectroscopy and a transflectance dip probe. From the acquired NIR spectra, chemometric partial least squares regression (PLSR) models for predicting biomass, glucose and ethanol were constructed. The spectra were directly measured in the fermentation broth and successfully inspected for adulteration using our novel data pre-processing method. These adulterations manifested as strong fluctuations in the shape and offset of the absorption spectra. They resulted from cells, cell clusters, or gas bubbles intercepting the optical path of the dip probe. In the proposed data pre-processing method, adulterated signals are removed by passing the time-scanned non-averaged spectra through two filter algorithms with a 5% quantile cutoff. The spectra containing meaningful data are then averaged. A second step checks whether the whole time scan is analyzable. If true, the average is calculated and used to prepare the PLSR models. This new method distinctly improved the prediction results. To dissociate possible correlations between analyte concentrations, such as glucose and ethanol, the feeding analytes were alternately supplied at different concentrations (spiking) at the end of the four anaerobic fermentations. This procedure yielded low-error (anaerobic) PLSR models, with prediction errors of 0.31 g/l for biomass, 3.41 g/l for glucose, and 2.17 g/l for ethanol. The maximum concentrations were 14 g/l biomass, 167 g/l glucose, and 80 g/l ethanol. Data from the aerobic fermentation, carried out under high agitation and high aeration, were incorporated to realize combined PLSR models, which have not been previously reported to our knowledge.
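
    A hedged reconstruction of the filtering idea (the paper's two filter algorithms and exact criteria are not detailed in the abstract): drop spectra within a time scan whose offset falls outside a 5% quantile band, average the survivors, and discard scans with too few valid spectra:

        import numpy as np

        def filter_and_average(scans, q=0.05):
            """scans: (n_spectra, n_wavelengths) acquired in one time scan."""
            offsets = scans.mean(axis=1)              # bubble hits shift the offset
            lo, hi = np.quantile(offsets, [q, 1.0 - q])
            keep = (offsets >= lo) & (offsets <= hi)
            if keep.sum() < 0.5 * len(scans):         # too few sane spectra:
                return None                           # reject the whole scan
            return scans[keep].mean(axis=0)

        scans = np.random.default_rng(0).random((50, 256))
        print(filter_and_average(scans).shape)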

  3. Effects of different correlation metrics and preprocessing factors on small-world brain functional networks: a resting-state functional MRI study.

    Science.gov (United States)

    Liang, Xia; Wang, Jinhui; Yan, Chaogan; Shu, Ni; Xu, Ke; Gong, Gaolang; He, Yong

    2012-01-01

    Graph theoretical analysis of brain networks based on resting-state functional MRI (R-fMRI) has attracted a great deal of attention in recent years. These analyses often involve the selection of correlation metrics and specific preprocessing steps. However, the influence of these factors on the topological properties of functional brain networks has not been systematically examined. Here, we investigated the influences of correlation metric choice (Pearson's correlation versus partial correlation), global signal presence (regressed or not) and frequency band selection [slow-5 (0.01-0.027 Hz) versus slow-4 (0.027-0.073 Hz)] on the topological properties of both binary and weighted brain networks derived from them, and we employed test-retest (TRT) analyses for further guidance on how to choose the "best" network modeling strategy from the reliability perspective. Our results show significant differences in global network metrics associated with both correlation metrics and global signals. Analysis of nodal degree revealed differing hub distributions for brain networks derived from Pearson's correlation versus partial correlation. TRT analysis revealed that the reliability of both global and local topological properties are modulated by correlation metrics and the global signal, with the highest reliability observed for Pearson's-correlation-based brain networks without global signal removal (WOGR-PEAR). The nodal reliability exhibited a spatially heterogeneous distribution wherein regions in association and limbic/paralimbic cortices showed moderate TRT reliability in Pearson's-correlation-based brain networks. Moreover, we found that there were significant frequency-related differences in topological properties of WOGR-PEAR networks, and brain networks derived in the 0.027-0.073 Hz band exhibited greater reliability than those in the 0.01-0.027 Hz band. Taken together, our results provide direct evidence regarding the influences of correlation metrics and specific
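
    The two correlation metrics compared above differ in how they treat indirect connections; the sketch below builds a three-region chain and shows that Pearson correlation links the chain's endpoints while partial correlation (derived from the precision matrix) suppresses the indirect link:

        import numpy as np

        rng = np.random.default_rng(0)
        ts = rng.normal(size=(200, 5))        # 200 time points, 5 regions
        ts[:, 1] += 0.7 * ts[:, 0]            # region 1 driven by region 0
        ts[:, 2] += 0.7 * ts[:, 1]            # region 2 driven by region 1

        pearson = np.corrcoef(ts.T)

        precision = np.linalg.inv(np.cov(ts.T))
        d = np.sqrt(np.diag(precision))
        partial = -precision / np.outer(d, d)  # standard partial-correlation formula
        np.fill_diagonal(partial, 1.0)

        # Pearson links regions 0 and 2 through the chain; partial suppresses it.
        print(round(pearson[0, 2], 2), round(partial[0, 2], 2))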

  4. Faith community nursing: real care, real cost savings.

    Science.gov (United States)

    Yeaworth, Rosalee C; Sailors, Ronnette

    2014-01-01

    At a time when healthcare costs are increasing more than other aspects of the economy, churches are stepping up to help fill needs through congregational health ministries. Faith Community Nursing (FCN) is a rapidly growing health service in the churches of many denominations. This article documents healthcare services and financial savings provided by FCNs and health ministries, showing the critical role faith community nursing can play in containing healthcare costs.

  5. A simpler method of preprocessing MALDI-TOF MS data for differential biomarker analysis: stem cell and melanoma cancer studies

    Directory of Open Access Journals (Sweden)

    Tong Dong L

    2011-09-01

    Full Text Available Abstract Introduction Raw spectral data from matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) with MS profiling techniques usually contains complex information not readily providing biological insight into disease. The association of identified features within raw data to a known peptide is extremely difficult. Data preprocessing to remove uncertainty characteristics in the data is normally required before performing any further analysis. This study proposes an alternative yet simple solution to preprocess raw MALDI-TOF-MS data for identification of candidate marker ions. Two in-house MALDI-TOF-MS data sets from two different sample sources (melanoma serum and cord blood plasma) are used in our study. Method Raw MS spectral profiles were preprocessed using the proposed approach to identify peak regions in the spectra. The preprocessed data was then analysed using bespoke machine learning algorithms for data reduction and ion selection. Using the selected ions, an ANN-based predictive model was constructed to examine the predictive power of these ions for classification. Results Our model identified 10 candidate marker ions for both data sets. These ion panels achieved over 90% classification accuracy on blind validation data. Receiver operating characteristics analysis was performed and the area under the curve for melanoma and cord blood classifiers was 0.991 and 0.986, respectively. Conclusion The results suggest that our data preprocessing technique removes unwanted characteristics of the raw data, while preserving the predictive components of the data. Ion identification analysis can be carried out using MALDI-TOF-MS data with the proposed data preprocessing technique coupled with bespoke algorithms for data reduction and ion selection.

  6. Evaluation of the robustness of the preprocessing technique improving reversible compressibility of CT images: Tested on various CT examinations

    Energy Technology Data Exchange (ETDEWEB)

    Jeon, Chang Ho; Kim, Bohyoung; Gu, Bon Seung; Lee, Jong Min [Department of Radiology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, 300 Gumi-ro, Bundang-gu, Seongnam-si, Gyeonggi-do 463-707 (Korea, Republic of); Kim, Kil Joong [Department of Radiology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, 300 Gumi-ro, Bundang-gu, Seongnam-si, Gyeonggi-do 463-707, South Korea and Department of Radiation Applied Life Science, Seoul National University College of Medicine, 103 Daehak-ro, Jongno-gu, Seoul 110-799 (Korea, Republic of); Lee, Kyoung Ho [Department of Radiology, Seoul National University Bundang Hospital, Seoul National University College of Medicine, 300 Gumi-ro, Bundang-gu, Seongnam-si, Gyeonggi-do 463-707, South Korea and Institute of Radiation Medicine, Seoul National University Medical Research Center, and Clinical Research Institute, Seoul National University Hospital, 101 Daehak-ro, Jongno-gu, Seoul 110-744 (Korea, Republic of); Kim, Tae Ki [Medical Information Center, Seoul National University Bundang Hospital, Seoul National University College of Medicine, 300 Gumi-ro, Bundang-gu, Seongnam-si, Gyeonggi-do 463-707 (Korea, Republic of)

    2013-10-15

    Purpose: To modify the previously proposed preprocessing technique, which improves the compressibility of computed tomography (CT) images, to cover the diversity of three-dimensional configurations of different body parts, and to evaluate the robustness of the technique in terms of segmentation correctness and increase in reversible compression ratio (CR) for various CT examinations. Methods: This study had institutional review board approval with waiver of informed patient consent. A preprocessing technique was previously proposed to improve the compressibility of CT images by replacing pixel values outside the body region with a constant value, thereby maximizing data redundancy. Since the technique was developed aiming at only chest CT images, the authors modified the segmentation method to cover the diversity of three-dimensional configurations of different body parts. The modified version was evaluated as follows. In randomly selected 368 CT examinations (352 787 images), each image was preprocessed by using the modified preprocessing technique. Radiologists visually confirmed whether the segmented region covers the body region or not. The images with and without the preprocessing were reversibly compressed using Joint Photographic Experts Group (JPEG), JPEG2000 two-dimensional (2D), and JPEG2000 three-dimensional (3D) compressions. The percentage increase in CR per examination (CR_I) was measured. Results: The rate of correct segmentation was 100.0% (95% CI: 99.9%, 100.0%) for all the examinations. The medians of CR_I were 26.1% (95% CI: 24.9%, 27.1%), 40.2% (38.5%, 41.1%), and 34.5% (32.7%, 36.2%) in JPEG, JPEG2000 2D, and JPEG2000 3D, respectively. Conclusions: In various CT examinations, the modified preprocessing technique can increase the CR by 25% or more without concern about degradation of diagnostic information
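
    As a hedged sketch of the pre-processing idea (real body-region segmentation is more careful than this), a CT slice can be masked so that every pixel outside the largest connected tissue region is replaced by one constant value, maximizing redundancy for the compressor:

        import numpy as np
        from scipy import ndimage

        def mask_outside_body(slice_hu, air_threshold=-400, fill_value=-1000):
            body = slice_hu > air_threshold              # crude air/tissue split
            body = ndimage.binary_fill_holes(body)       # keep lungs inside body
            labels, n = ndimage.label(body)
            if n > 1:                                    # keep largest component
                sizes = ndimage.sum(body, labels, range(1, n + 1))
                body = labels == (1 + int(np.argmax(sizes)))
            out = slice_hu.copy()
            out[~body] = fill_value                      # constant outside body
            return out

        slice_hu = np.full((128, 128), -1000.0)          # air everywhere
        slice_hu[30:100, 40:90] = 40.0                   # a crude "body"
        print((mask_outside_body(slice_hu) == -1000).mean())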

  7. Joint preprocessor-based detector for cooperative networks with limited hardware processing capability

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2015-02-01

    In this letter, a joint detector for cooperative communication networks is proposed for the case when the destination has limited hardware processing capability. The transmitter sends its symbols with the help of L relays. As the destination has limited hardware, only U out of L signals are processed and the energy of the remaining relays is lost. To solve this problem, a joint preprocessing based detector is proposed. This joint preprocessor-based detector operates on the principle of minimizing the symbol error rate (SER). For a realistic assessment, pilot-symbol-aided channel estimation is incorporated for the proposed detector. From our simulations, it can be observed that our proposed detector achieves the same SER performance as that of the maximum likelihood (ML) detector with all participating relays. Additionally, our detector outperforms selection combining (SC), the channel shortening (CS) scheme and reduced-rank techniques when using the same U. Our proposed scheme has low computational complexity.

  8. Flexible high-speed FASTBUS master for data read-out and preprocessing

    International Nuclear Information System (INIS)

    Wurz, A.; Manner, R.

    1990-01-01

    This paper describes a single-slot FASTBUS master module. It can be used for read-out and preprocessing of data that are read out from FASTBUS modules, e.g., an ADC system. The module consists of a 25 MHz, 32-bit MC68030 processor with cache memory and memory management, an MC68882 floating point coprocessor, 4 MBytes of main memory, and FASTBUS master and slave interfaces. In addition, a DMA controller for read-out of FASTBUS data is provided. The processor allows I/O via serial ports, a 16-bit parallel port, and a transputer link. Additional interfaces are planned. The main memory is multi-ported and can be accessed directly by the CPU, the FASTBUS, and external masters via the high-speed local bus that is accessible by way of a connector. The FASTBUS interface supports most of the standard operations in master and slave mode.

  9. Combined principal component preprocessing and n-tuple neural networks for improved classification

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Linneberg, Christian

    2000-01-01

    We present a combined principal component analysis/neural network scheme for classification. The data used to illustrate the method consist of spectral fluorescence recordings from seven different production facilities, and the task is to relate an unknown sample to one of these seven factories....... The data are first preprocessed by performing an individual principal component analysis on each of the seven groups of data. The components found are then used for classifying the data, but instead of making a single multiclass classifier, we follow the ideas of turning a multiclass problem into a number...... of two-class problems. For each possible pair of classes we further apply a transformation to the calculated principal components in order to increase the separation between the classes. Finally we apply the so-called n-tuple neural network to the transformed data in order to give the classification...
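
    The structure of the scheme (per-class PCA, then one two-class classifier per pair of classes) can be mimicked with scikit-learn. In this rough sketch, logistic regression stands in for both the pairwise transformation and the n-tuple network, and the digits dataset stands in for the fluorescence spectra, so only the overall pattern is illustrated.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One PCA per class, fitted only on that class's training samples.
classes = np.unique(y_tr)
pcas = {c: PCA(n_components=10).fit(X_tr[y_tr == c]) for c in classes}

def pair_features(a, b, X):
    # Concatenate projections onto the two class-specific subspaces.
    return np.hstack([pcas[a].transform(X), pcas[b].transform(X)])

# The multiclass problem becomes one vote per two-class problem.
votes = np.zeros((len(X_te), len(classes)), dtype=int)
for i, a in enumerate(classes):
    for b in classes[i + 1:]:
        mask = (y_tr == a) | (y_tr == b)
        clf = LogisticRegression(max_iter=2000)
        clf.fit(pair_features(a, b, X_tr[mask]), y_tr[mask])
        pred = clf.predict(pair_features(a, b, X_te))
        votes[np.arange(len(X_te)), np.searchsorted(classes, pred)] += 1

print("accuracy:", np.mean(classes[votes.argmax(axis=1)] == y_te))
```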

  10. FY 1997 cost savings report

    International Nuclear Information System (INIS)

    Sellards, J.B.

    1998-01-01

    With the end of the cold war, funding for the Environmental Management program increased rapidly as nuclear weapons production facilities were shut down, cleanup responsibilities increased, and facilities were transferred to the cleanup program. As funding for the Environmental Management (EM) program began to level off in response to Administration and Congressional efforts to balance the Federal budget, the program redoubled its efforts to increase efficiency and get more productivity out of every dollar. Cost savings and enhanced performance are an integral part of Hanford Site operations. FY 1997 was the third year of a cost savings program that was initially defined in FY 1995. The definitions and process remained virtually the same as those used in FY 1996

  11. Energy savings in Polish buildings

    Energy Technology Data Exchange (ETDEWEB)

    Markel, L.C.; Gula, A.; Reeves, G.

    1995-12-31

    A demonstration of low-cost insulation and weatherization techniques was a part of phase 1 of the Krakow Clean Fossil Fuels and Energy Efficient Project. The objectives were to identify a cost-effective set of measures to reduce energy used for space heating, determine how much energy could be saved, and foster widespread implementation of those measures. The demonstration project focused on four 11-story buildings in a Krakow housing cooperative. Energy savings of over 20% were obtained. Most important, the procedures and materials implemented in the demonstration project have been adapted to Polish conditions and applied to other housing cooperatives, schools, and hospitals. Additional projects are being planned, in Krakow and other cities, under the direction of FEWE-Krakow, the Polish Energie Cities Network, and Biuro Rozwoju Krakowa.

  12. Water Saving Strategies & Ecological Modernisation

    DEFF Research Database (Denmark)

    Hoffmann, Birgitte; Jensen, Jesper Ole; Elle, Morten

    2005-01-01

    Drawing on case studies of water saving campaigns and new collaborations, the paper will serve, on the one hand, as an interpretation of the water saving strategy in Copenhagen in the light of Ecological Modernisation, and on the other hand, as a critical discussion of Ecological Modernisation...... as a frame for understanding resource management. The water management in Copenhagen has in recent years undergone a rather radical transition. Along with strong drivers for resource management in the region the municipal water supplier has tested and implemented a number of initiatives to promote sus...... to 125 l/capita/day in 2002. A series of different strategies, targets and tools have been implemented: Emphasizing demand side instead of supply side, using and communicating indicators, formulating goals for reducing water consumption and developing learning processes in water management. A main...

  13. Cost savings through innovative technologies

    International Nuclear Information System (INIS)

    Lankford, J.M.; Jackson, J.P.

    1995-01-01

    Newly developed technologies can and already are saving money. Other technologies under development will provide solutions to problems which are currently impossible or too expensive to address, and still others will offer alternative strategies where baseline approaches are not acceptable to the public. All of these options will be considered as the nation decides what it wishes to accomplish, and fund, to clean up the nuclear weapons complex

  14. chipPCR: an R package to pre-process raw data of amplification curves.

    Science.gov (United States)

    Rödiger, Stefan; Burdukiewicz, Michał; Schierack, Peter

    2015-09-01

    Both the quantitative real-time polymerase chain reaction (qPCR) and quantitative isothermal amplification (qIA) are standard methods for nucleic acid quantification. Numerous real-time read-out technologies have been developed. Despite the continuous interest in amplification-based techniques, there are only a few tools for pre-processing of amplification data. However, a transparent tool for precise control of raw data is indispensable in several scenarios, for example, during the development of new instruments. chipPCR is an R package for the pre-processing and quality analysis of raw data of amplification curves. The package takes advantage of R's S4 object model and offers an extensible environment. chipPCR contains tools for raw data exploration: normalization, baselining, imputation of missing values, a powerful wrapper for amplification curve smoothing and a function to detect the start and end of an amplification curve. The capabilities of the software are enhanced by the implementation of algorithms unavailable in R, such as a 5-point stencil for derivative interpolation. Simulation tools, statistical tests, plots for data quality management, amplification efficiency/quantification cycle calculation, and datasets from qPCR and qIA experiments are part of the package. Core functionalities are integrated in GUIs (web-based and standalone shiny applications), thus streamlining analysis and report generation. http://cran.r-project.org/web/packages/chipPCR. Source code: https://github.com/michbur/chipPCR. stefan.roediger@b-tu.de Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
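
    chipPCR itself is an R package; as a language-neutral illustration of the same pre-processing steps (smoothing, baselining and locating the start of amplification), here is a small Python sketch on a synthetic curve. All constants are arbitrary.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic qPCR-like curve: sigmoid plus a drifting baseline and noise.
rng = np.random.default_rng(2)
cycles = np.arange(1, 41)
raw = (1.0 / (1.0 + np.exp(-(cycles - 22) / 2.0))
       + 0.004 * cycles + 0.02 * rng.normal(size=cycles.size))

smoothed = savgol_filter(raw, window_length=7, polyorder=2)

# Baseline: fit a line to the early ground-phase cycles and subtract it.
coef = np.polyfit(cycles[:10], smoothed[:10], 1)
corrected = smoothed - np.polyval(coef, cycles)

# Crude start-of-amplification estimate: first cycle above the noise band.
threshold = 3 * np.std(corrected[:10])
print("estimated amplification start cycle:",
      int(cycles[np.argmax(corrected > threshold)]))
```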

  15. Measure Guideline: Replacing Single-Speed Pool Pumps with Variable Speed Pumps for Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, A.; Easley, S.

    2012-05-01

    The report evaluates potential energy savings from replacing traditional single-speed pool pumps with variable speed pool pumps, and provides a basic cost comparison between continued use of traditional pumps versus new pumps. A simple step-by-step process for inspecting the pool area and installing a new pool pump follows.

  16. Measure Guideline. Replacing Single-Speed Pool Pumps with Variable Speed Pumps for Energy Savings

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, A. [Building Media and the Building America Retrofit Alliance (BARA), Wilmington, DE (United States); Easley, S. [Building Media and the Building America Retrofit Alliance (BARA), Wilmington, DE (United States)

    2012-05-01

    This measure guideline evaluates potential energy savings from replacing traditional single-speed pool pumps with variable speed pool pumps, and provides a basic cost comparison between continued use of traditional pumps versus new pumps. A simple step-by-step process for inspecting the pool area and installing a new pool pump follows.
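
    The physics behind the savings is worth a one-line derivation: by the pump affinity laws, flow scales roughly linearly with speed while shaft power scales roughly with the cube of speed, so running longer at lower speed moves the same water for much less energy. A back-of-the-envelope sketch (all numbers illustrative, not from the guideline):

```python
full_speed_kw = 1.5     # assumed draw of a single-speed pump
hours_full = 4.0        # hours/day needed to turn the pool over at full speed

speed_ratio = 0.5                         # run the new pump at half speed...
hours_low = hours_full / speed_ratio      # ...for twice as long (same volume)
low_speed_kw = full_speed_kw * speed_ratio ** 3   # affinity law: P ~ N^3

daily_before = full_speed_kw * hours_full         # 6.0 kWh/day
daily_after = low_speed_kw * hours_low            # 1.5 kWh/day
print(f"{daily_before:.1f} kWh -> {daily_after:.1f} kWh "
      f"({100 * (1 - daily_after / daily_before):.0f}% saving)")
```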

  17. Energy Savings in a Market Economy

    DEFF Research Database (Denmark)

    Nørgaard, Jørgen

    1998-01-01

    The paper outlines the concept of energy savings as opposed to energy efficiency. It then briefly describes the fluctuating role of energy savings in recent Danish energy policy. It discusses the failure of leaving electricity savings and Integrated Resource Planning to the electricity......

  18. 24 CFR 221.1 - Savings clause.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 2 2010-04-01 Savings clause. 221.1 Section 221.1... MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER AUTHORITIES LOW COST AND MODERATE INCOME MORTGAGE INSURANCE-SAVINGS CLAUSE Eligibility Requirements-Low Cost Homes-Savings Clause § 221.1...

  19. Consumer behaviours: Teaching children to save energy

    Science.gov (United States)

    Grønhøj, Alice

    2016-08-01

    Energy-saving programmes are increasingly targeted at children to encourage household energy conservation. A study involving the assignment of energy-saving interventions to Girl Scouts shows that a child-focused intervention can improve energy-saving behaviours among children and their parents.

  20. The recursive combination filter approach of pre-processing for the estimation of standard deviation of RR series.

    Science.gov (United States)

    Mishra, Alok; Swati, D

    2015-09-01

    Variation in the interval between the R-R peaks of the electrocardiogram represents the modulation of the cardiac oscillations by the autonomic nervous system. This variation is contaminated by anomalous signals called ectopic beats, artefacts or noise which mask the true behaviour of heart rate variability. In this paper, we have proposed a combination filter of a recursive impulse rejection filter and a recursive 20% filter, with recursive application and a preference for replacement over removal of abnormal beats, to improve the pre-processing of the inter-beat intervals. We have tested this novel recursive combinational method with median replacement to estimate the standard deviation of normal-to-normal (SDNN) beat intervals of congestive heart failure (CHF) and normal sinus rhythm subjects. This work discusses in detail the improvement in pre-processing over single use of the impulse rejection filter and removal of abnormal beats, for the estimation of SDNN and the Poincaré plot descriptors (SD1, SD2, and SD1/SD2). We have found the 22 ms value of SDNN and the 36 ms value of the SD2 descriptor of the Poincaré plot to be clinical indicators in discriminating the normal cases from the CHF cases. The pre-processing is also useful in the calculation of the Lyapunov exponent, a nonlinear index: Lyapunov exponents calculated after the proposed pre-processing change in a way that starts to follow the notion of less complex behaviour in diseased states.
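
    A much-simplified toy version of the idea, not the authors' filter: beats are flagged either as impulses against a robust (MAD-based) band or as jumps of more than 20% relative to the previous beat, and flagged beats are replaced by the median rather than removed; SDNN is then the standard deviation of the cleaned series. Thresholds and data are invented.

```python
import numpy as np

def clean_rr(rr_ms, pct=0.2, tau=3.0, max_iter=10):
    """Recursively flag impulses (MAD criterion) and >pct relative jumps,
    replacing flagged beats with the median instead of removing them."""
    rr = np.asarray(rr_ms, dtype=float).copy()
    for _ in range(max_iter):
        med = np.median(rr)
        mad = np.median(np.abs(rr - med)) + 1e-9
        impulse = np.abs(rr - med) / (1.483 * mad) > tau
        jump = np.abs(np.diff(rr, prepend=rr[0])) / rr > pct
        bad = impulse | jump
        if not bad.any():
            break
        rr[bad] = med                      # replacement, not removal
    return rr

rng = np.random.default_rng(3)
rr = rng.normal(800, 30, size=500)
rr[[50, 200, 350]] = [400, 1600, 390]      # injected ectopic-like beats
print("SDNN raw / cleaned: %.1f / %.1f ms"
      % (np.std(rr, ddof=1), np.std(clean_rr(rr), ddof=1)))
```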

  1. On image pre-processing for PIV of single- and two-phase flows over reflecting objects

    NARCIS (Netherlands)

    Deen, N.G.; Willems, P.; van Sint Annaland, M.; Kuipers, J.A.M.; Lammertink, Rob G.H.; Kemperman, Antonius J.B.; Wessling, Matthias; van der Meer, Walterus Gijsbertus Joseph

    2010-01-01

    A novel image pre-processing scheme for PIV of single- and two-phase flows over reflecting objects which does not require the use of additional hardware is discussed. The approach for single-phase flow consists of image normalization and intensity stretching followed by background subtraction. For

  2. CudaPre3D: An Alternative Preprocessing Algorithm for Accelerating 3D Convex Hull Computation on the GPU

    Directory of Open Access Journals (Sweden)

    MEI, G.

    2015-05-01

    Full Text Available In computing convex hulls of point sets, a preprocessing procedure that filters the input points by discarding non-extreme points is commonly used to improve computational efficiency. We previously proposed a quite straightforward preprocessing approach for accelerating 2D convex hull computation on the GPU. In this paper, we extend that algorithm to 3D cases. The basic ideas behind the two preprocessing algorithms are similar: first, several groups of extreme points are found according to the original set of input points and several rotated versions of the input set; then, a convex polyhedron is created using the found extreme points; and finally the interior points lying inside the formed convex polyhedron are discarded. Experimental results show that the proposed preprocessing algorithm achieves speedups of about 4x on average and 5x to 6x in the best cases over the cases where it is not used. In addition, more than 95 percent of the input points can be discarded in most experimental tests.
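
    The filtering step is easy to prototype on the CPU with SciPy; the paper's contribution is the GPU implementation, so this only illustrates the geometry. As an assumption of the sketch, candidate extremes are taken along a few random directions instead of the paper's rotated copies of the input.

```python
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

rng = np.random.default_rng(4)
pts = rng.normal(size=(100_000, 3))

# Candidate extreme points: extremes along the axes and random directions.
dirs = rng.normal(size=(16, 3))
proj = pts @ dirs.T
extreme = np.unique(np.concatenate(
    [pts.argmax(0), pts.argmin(0), proj.argmax(0), proj.argmin(0)]))

# Points inside the polyhedron spanned by the candidates cannot be hull
# vertices, so they are discarded before the real hull computation.
inner = Delaunay(pts[extreme]).find_simplex(pts) >= 0
keep = ~inner
keep[extreme] = True          # the candidates themselves must survive
print(f"discarded {(~keep).mean():.1%} of the input")
hull = ConvexHull(pts[keep])
```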

  3. Energy saving consulting in Hamburg

    Energy Technology Data Exchange (ETDEWEB)

    1982-10-01

    For anyone who wants to realise the dream of owning a house, the topics of thermal insulation and an energy-saving heating plant should be central to the planning. Expert advice is needed for this. A survey of the many consultants' offices available in Hamburg is provided. The list was compiled with the assistance of the Hamburg Chamber of Commerce, the Hamburg Trades Council and professional associations. The information on the special fields of activity of the named consultants is based on their own statements.

  4. Medical savings accounts make waves.

    Science.gov (United States)

    Gardner, J

    1995-02-27

    MSAs: the theory. Medical savings account legislation would allow consumers to set aside pre-tax dollars to pay for day-to-day healthcare costs. The accounts are to be backed up by a catastrophic policy with a deductible roughly equal to the maximum amount allowed in the MSA. The aim is to reduce healthcare cost inflation by making consumers more aware of the costs of healthcare than they are under comprehensive policies and by enabling them to shop for the lowest-cost, highest-quality care.

  5. Student saving, does it exist? : A study of students' saving behavior, attitude towards saving and motivation to save.

    OpenAIRE

    Tuvesson, Joakim; Yu, Shiyu

    2011-01-01

    Swedish households are getting deeper in debt and house prices keep on rising. This is what happened in the USA, and it was one of the major causes of the recent financial crisis. To avoid a similar crisis in Sweden, we think one part of the solution is to make sure that those who are students today and soon will get jobs, buy houses, take loans etcetera have the necessary knowledge to do so. Students' saving is an area that has almost completely lacked researchers' attention, and one goal with this thesi...

  6. Value of travel time savings

    DEFF Research Database (Denmark)

    Le Masurier, P.; Polak, J.; Pawlak, Janet

    2015-01-01

    A team of specialist market researchers and Value of Time experts comprising members from SYSTRA, Imperial College London and the Technical University of Denmark has conducted a formal audit and peer review of research undertaken by Arup/ITS Leeds/Accent to derive Value of Travel Time Savings...... Preference (RP) models that were used to derive final Values of Travel Time (VTT). This report contains the findings of our audit and peer review of the procedures adopted by the research team during data collection of the three surveys (SP, RP and Employers Surveys); a peer review of the reported approach...

  7. Internship guide : Work placements step by step

    NARCIS (Netherlands)

    Haag, Esther

    2013-01-01

    Internship Guide: Work Placements Step by Step has been written from the practical perspective of a placement coordinator. This book addresses the following questions: what problems do students encounter when they start thinking about the jobs their degree programme prepares them for? How do you

  8. The way to collisions, step by step

    CERN Multimedia

    2009-01-01

    While the LHC sectors cool down and reach the cryogenic operating temperature, spirits are warming up as we all eagerly await the first collisions. No reason to hurry, though. Making particles collide involves the complex manoeuvring of thousands of delicate components. The experts will make it happen using a step-by-step approach.

  9. Connecting possibilistic prudence and optimal saving

    Directory of Open Access Journals (Sweden)

    Ana María Lucia Casademunt

    2013-12-01

    Full Text Available In this paper we study the optimal saving problem in the framework of possibility theory. The notion of possibilistic precautionary saving is introduced as a measure of the way the presence of possibilistic risk (represented by a fuzzy number) influences a consumer in establishing the level of optimal saving. The notion of prudence of an agent in the face of possibilistic risk is defined and the equivalence between the prudence condition and a positive possibilistic precautionary saving is proved. Some relations between possibilistic risk aversion, prudence and possibilistic precautionary saving were established.

  10. Energy Savings Measure Packages. Existing Homes

    Energy Technology Data Exchange (ETDEWEB)

    Casey, Sean [National Renewable Energy Lab. (NREL), Golden, CO (United States); Booten, Chuck [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2011-11-01

    This document presents the most cost-effective Energy Savings Measure Packages (ESMP) for existing mixed-fuel and all-electric homes to achieve 15% and 30% savings for each BetterBuildings grantee location across the United States. These packages are optimized for minimum cost to homeowners for source energy savings given the local climate and prevalent building characteristics (i.e., foundation types). Maximum cost savings are typically found between 30% and 50% energy savings over the reference home; this typically amounts to $300 - $700/year.

  11. Acquiring and preprocessing leaf images for automated plant identification: understanding the tradeoff between effort and information gain

    Directory of Open Access Journals (Sweden)

    Michael Rzanny

    2017-11-01

    Full Text Available Abstract Background Automated species identification is a long term research subject. Contrary to flowers and fruits, leaves are available throughout most of the year. Offering margin and texture to characterize a species, they are the most studied organ for automated identification. Substantially matured machine learning techniques generate the need for more training data (aka leaf images. Researchers as well as enthusiasts miss guidance on how to acquire suitable training images in an efficient way. Methods In this paper, we systematically study nine image types and three preprocessing strategies. Image types vary in terms of in-situ image recording conditions: perspective, illumination, and background, while the preprocessing strategies compare non-preprocessed, cropped, and segmented images to each other. Per image type-preprocessing combination, we also quantify the manual effort required for their implementation. We extract image features using a convolutional neural network, classify species using the resulting feature vectors and discuss classification accuracy in relation to the required effort per combination. Results The most effective, non-destructive way to record herbaceous leaves is to take an image of the leaf’s top side. We yield the highest classification accuracy using destructive back light images, i.e., holding the plucked leaf against the sky for image acquisition. Cropping the image to the leaf’s boundary substantially improves accuracy, while precise segmentation yields similar accuracy at a substantially higher effort. The permanent use or disuse of a flash light has negligible effects. Imaging the typically stronger textured backside of a leaf does not result in higher accuracy, but notably increases the acquisition cost. Conclusions In conclusion, the way in which leaf images are acquired and preprocessed does have a substantial effect on the accuracy of the classifier trained on them. For the first time, this

  12. Scientific data products and the data pre-processing subsystem of the Chang'e-3 mission

    International Nuclear Information System (INIS)

    Tan Xu; Liu Jian-Jun; Li Chun-Lai; Feng Jian-Qing; Ren Xin; Wang Fen-Fei; Yan Wei; Zuo Wei; Wang Xiao-Qian; Zhang Zhou-Bin

    2014-01-01

    The Chang'e-3 (CE-3) mission is China's first exploration mission on the surface of the Moon that uses a lander and a rover. The eight instruments that form the scientific payloads have the following objectives: (1) investigate the morphological features and geological structures at the landing site; (2) perform integrated in-situ analysis of minerals and chemical compositions; (3) carry out integrated exploration of the structure of the lunar interior; (4) explore the lunar-terrestrial space environment and the lunar surface environment, and acquire Moon-based ultraviolet astronomical observations. The Ground Research and Application System (GRAS) is in charge of data acquisition and pre-processing, management of the payload in orbit, and managing the data products and their applications. The Data Pre-processing Subsystem (DPS) is a part of GRAS. The task of the DPS is the pre-processing of raw data from the eight instruments of CE-3, including channel processing, unpacking, package sorting, calibration and correction, identification of geographical location, calculation of the probe azimuth and zenith angles and the solar azimuth and zenith angles, and quality checks. These processes produce Level 0, Level 1 and Level 2 data. The computing platform of this subsystem is comprised of a high-performance computing cluster, including a real-time subsystem used for processing Level 0 data and a post-time subsystem for generating Level 1 and Level 2 data. This paper describes the CE-3 data pre-processing method, the data pre-processing subsystem, data classification, data validity and the data products that are used for scientific studies

  13. Risk transfer via energy-savings insurance

    International Nuclear Information System (INIS)

    Mills, Evan

    2003-01-01

    Among the key barriers to investment in energy efficiency are uncertainties about attaining projected energy savings and potential disputes over stipulated savings. The fields of energy management and risk management are thus intertwined. While many technical methods have emerged to manage performance risks (e.g. building diagnostics and commissioning), financial methods are less developed in the energy management arena than in other segments of the economy. Energy-savings insurance (ESI) - formal insurance of predicted energy savings - transfers and spreads both types of risk over a larger pool of energy efficiency projects and reduces barriers to market entry of smaller energy service firms who lack sufficiently strong balance sheets to self-insure the savings. ESI encourages those implementing energy-saving projects to go beyond standard measures and thereby achieve more significant levels of energy savings. Insurance providers are proponents of improved savings measurement and verification techniques, as well as maintenance, thereby contributing to national energy-saving objectives. If properly applied, ESI can potentially reduce the net cost of energy-saving projects by reducing the interest rates charged by lenders, and by increasing the level of savings through quality control. Governmental agencies have been pioneers in the use of ESI and could continue to play a role

  14. Automatic pre-processing for an object-oriented distributed hydrological model using GRASS-GIS

    Science.gov (United States)

    Sanzana, P.; Jankowfsky, S.; Branger, F.; Braud, I.; Vargas, X.; Hitschfeld, N.

    2012-04-01

    Landscapes are very heterogeneous, which impacts the hydrological processes occurring in the catchments, especially in the modeling of peri-urban catchments. The Hydrological Response Units (HRUs), resulting from the intersection of different maps, such as land use, soil types and geology, and flow networks, allow the representation of these elements in an explicit way, preserving natural and artificial contours of the different layers. These HRUs are used as the model mesh in some distributed object-oriented hydrological models, allowing the application of a topologically oriented approach. The connectivity between polygons and polylines provides a detailed representation of the water balance and overland flow in these distributed hydrological models, based on irregular hydro-landscape units. When computing fluxes between these HRUs, geometrical parameters, such as the distance between the centroid of gravity of the HRUs and the river network, and the length of the perimeter, can impact the realism of the calculated overland, sub-surface and groundwater fluxes. Therefore, it is necessary to process the original model mesh in order to avoid these numerical problems. We present an automatic pre-processing implemented in the open source GRASS-GIS software, for which several Python scripts or some algorithms already available were used, such as the Triangle software. First, some scripts were developed to improve the topology of the various elements, such as snapping of the river network to the closest contours. When data are derived from remote sensing, such as vegetation areas, their perimeter has lots of right angles that were smoothed. Second, the algorithms more specifically address badly shaped elements of the model mesh such as polygons with narrow shapes, markedly irregular contours and/or a centroid outside of the polygon. To identify these elements we used shape descriptors. The convexity index was considered the best descriptor to identify them with a threshold
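
    The convexity index referred to here is commonly computed as the ratio of a polygon's area to the area of its convex hull, so well-shaped units score near 1 and narrow or strongly irregular ones score much lower. A hedged sketch with Shapely (coordinates invented):

```python
from shapely.geometry import Polygon

def convexity_index(poly: Polygon) -> float:
    """Area of the polygon divided by the area of its convex hull."""
    return poly.area / poly.convex_hull.area

square = Polygon([(0, 0), (1, 0), (1, 1), (0, 1)])
lshape = Polygon([(0, 0), (2, 0), (2, 0.2), (0.2, 0.2), (0.2, 2), (0, 2)])
print(round(convexity_index(square), 2))   # 1.0  -> well shaped
print(round(convexity_index(lshape), 2))   # ~0.32 -> candidate for reshaping
```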

  15. Pre-processing, registration and selection of adaptive optics corrected retinal images.

    Science.gov (United States)

    Ramaswamy, Gomathy; Devaney, Nicholas

    2013-07-01

    In this paper, the aim is to demonstrate enhanced processing of sequences of fundus images obtained using a commercial AO flood illumination system. The purpose of the work is to (1) correct for uneven illumination at the retina, (2) automatically select the best quality images and (3) precisely register the best images. Adaptive optics corrected retinal images are pre-processed to correct uneven illumination using different methods: subtracting or dividing by the average filtered image, homomorphic filtering and a wavelet based approach. These images are evaluated to measure the image quality using various parameters, including sharpness, variance, power spectrum kurtosis and contrast. We have carried out the registration in two stages: a coarse stage using cross-correlation, followed by fine registration using two approaches: parabolic interpolation on the peak of the cross-correlation and maximum-likelihood estimation. The angle of rotation of the images is measured using a combination of peak tracking and Procrustes transformation. We have found that a wavelet approach (Daubechies 4 wavelet at 6th level decomposition) provides good illumination correction with clear improvement in image sharpness and contrast. The assessment of image quality using a 'Designer metric' works well when compared to visual evaluation, although it is highly correlated with other metrics. In image registration, sub-pixel translations measured using parabolic interpolation on the peak of the cross-correlation function and maximum-likelihood estimation are found to give very similar results (RMS difference 0.047 pixels). We have confirmed that correcting rotation of the images provides a significant improvement, especially at the edges of the image. We observed that selecting the better quality frames (e.g. best 75% images) for image registration gives improved resolution, at the expense of poorer signal-to-noise. The sharpness map of the registered and de-rotated images shows increased
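
    Coarse cross-correlation followed by a parabolic fit around the correlation peak is a standard recipe and can be sketched with NumPy alone; the three-point parabola formula is the usual one, while the circular-shift test data and FFT convention are assumptions of this sketch.

```python
import numpy as np

def parabolic_offset(cm1, c0, cp1):
    """Sub-pixel offset of a peak from the three samples around it."""
    return 0.5 * (cm1 - cp1) / (cm1 - 2.0 * c0 + cp1)

def register(ref, img):
    """Integer shift from the FFT cross-correlation peak, refined with a
    1D parabolic fit along each axis."""
    c = np.real(np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)))
    ny, nx = c.shape
    py, px = np.unravel_index(np.argmax(c), c.shape)
    dy = parabolic_offset(c[(py - 1) % ny, px], c[py, px], c[(py + 1) % ny, px])
    dx = parabolic_offset(c[py, (px - 1) % nx], c[py, px], c[py, (px + 1) % nx])
    # Map wrap-around peak positions to signed shifts.
    sy = (py + dy + ny / 2) % ny - ny / 2
    sx = (px + dx + nx / 2) % nx - nx / 2
    return sy, sx

rng = np.random.default_rng(5)
ref = rng.normal(size=(128, 128))
img = np.roll(ref, (3, -5), axis=(0, 1))   # known shift for the demo
print(register(ref, img))                  # close to (3.0, -5.0)
```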

  16. Energy saving synergies in national energy systems

    DEFF Research Database (Denmark)

    Thellufsen, Jakob Zinck; Lund, Henrik

    2015-01-01

    In the transition towards a 100% renewable energy system, energy savings are essential. The possibility of energy savings through conservation or efficiency increases can be identified in, for instance, the heating and electricity sectors, in industry, and in transport. Several studies point...... to various optimal levels of savings in the different sectors of the energy system. However, these studies do not investigate the idea of energy savings being system dependent. This paper argues that such system dependency is critical to understand, as it does not make sense to analyse an energy saving...... without taking into account the actual benefit of the saving in relation to the energy system. The study therefore identifies a need to understand how saving methods may interact with each other and the system in which they are conducted. By using energy system analysis to do hourly simulation...

  17. Saving in cycles: how to get people to save more money.

    Science.gov (United States)

    Tam, Leona; Dholakia, Utpal

    2014-02-01

    Low personal savings rates are an important social issue in the United States. We propose and test one particular method to get people to save more money that is based on the cyclical time orientation. In contrast to conventional, popular methods that encourage individuals to ignore past mistakes, focus on the future, and set goals to save money, our proposed method frames the savings task in cyclical terms, emphasizing the present. Across the studies, individuals who used our proposed cyclical savings method, compared with individuals who used a linear savings method, provided an average of 74% higher savings estimates and saved an average of 78% more money. We also found that the cyclical savings method was more efficacious because it increased implementation planning and lowered future optimism regarding saving money.

  18. SAVE ENERGY IN TEXTILE SMES

    Directory of Open Access Journals (Sweden)

    SCALIA Mauro

    2016-05-01

    Full Text Available Efficiency and competitiveness in the textile and clothing manufacturing sector must take into account the current and future energy challenges. Energy efficiency is a subject of critical importance for the Textile & Clothing industry, for other sectors and for society in general. EURATEX has initiated Energy Made-to-Measure, an information campaign running until 2016 to empower over 300 textile & clothing companies, notably SMEs, to become more energy efficient. SET (Save Energy in Textile SMEs), a collaborative project co-funded within the European Programme Intelligent Energy Europe II, helps companies to understand their energy consumption and allows them to compare themselves against the sector benchmarks in different production processes. SET has developed the SET tool, the Energy Saving and Efficiency Tool, a free-of-charge tool customized for textile manufacturers. The SET tool is made up of four elements: stand-alone software (the SET Tool) for self-assessment, based on an Excel application; an on-line part (the SET Tool Web) for advanced benchmarking and comparison of performance across years; a guiding document for the companies; and an overview of financial incentives and legal obligations regarding energy efficiency. Designed specifically for small and medium enterprises (SMEs), the SET tool enables the evaluation of energy consumption and recommends measures to reduce that consumption. Prior to modifying the company's production processes and making investments to increase energy efficiency, textile SMEs need different types of information, including the legal context and economic and technical peculiarities.

  19. Microsoft Office professional 2010 step by step

    CERN Document Server

    Cox, Joyce; Frye, Curtis

    2011-01-01

    Teach yourself exactly what you need to know about using Office Professional 2010-one step at a time! With STEP BY STEP, you build and practice new skills hands-on, at your own pace. Covering Microsoft Word, PowerPoint, Outlook, Excel, Access, Publisher, and OneNote, this book will help you learn the core features and capabilities needed to: Create attractive documents, publications, and spreadsheetsManage your e-mail, calendar, meetings, and communicationsPut your business data to workDevelop and deliver great presentationsOrganize your ideas and notes in one placeConnect, share, and accom

  20. Tools and Databases of the KOMICS Web Portal for Preprocessing, Mining, and Dissemination of Metabolomics Data

    Directory of Open Access Journals (Sweden)

    Nozomu Sakurai

    2014-01-01

    Full Text Available A metabolome—the collection of comprehensive quantitative data on metabolites in an organism—has been increasingly utilized for applications such as data-intensive systems biology, disease diagnostics, biomarker discovery, and assessment of food quality. A considerable number of tools and databases have been developed to date for the analysis of data generated by various combinations of chromatography and mass spectrometry. We report here a web portal named KOMICS (The Kazusa Metabolomics Portal), where the tools and databases that we developed are available for free to academic users. KOMICS includes the tools and databases for preprocessing, mining, visualization, and publication of metabolomics data. Improvements in the annotation of unknown metabolites and dissemination of comprehensive metabolomic data are the primary aims behind the development of this portal. For this purpose, PowerGet and FragmentAlign include a manual curation function for the results of metabolite feature alignments. A metadata-specific wiki-based database, Metabolonote, functions as a hub of web resources related to the submitters' work. This feature is expected to increase citation of the submitters' work, thereby promoting data publication. As an example of the practical use of KOMICS, a workflow for a study on Jatropha curcas is presented. The tools and databases available at KOMICS should contribute to enhanced production, interpretation, and utilization of metabolomic Big Data.

  1. An Advanced Pre-Processing Pipeline to Improve Automated Photogrammetric Reconstructions of Architectural Scenes

    Directory of Open Access Journals (Sweden)

    Marco Gaiani

    2016-02-01

    Full Text Available Automated image-based 3D reconstruction methods are increasingly pervasive in 3D modeling applications. Fully automated solutions give the impression that from a sample of randomly acquired images we can derive quite impressive visual 3D models. Although the level of automation is reaching very high standards, image quality is a fundamental pre-requisite for producing successful and photo-realistic 3D products, in particular when dealing with large datasets of images. This article presents an efficient pipeline based on color enhancement, image denoising, color-to-gray conversion and image content enrichment. The pipeline stems from an analysis of various state-of-the-art algorithms and aims to adjust the most promising methods, giving solutions to typical failure causes. The assessment proves how effective image pre-processing, which considers the entire image dataset, can improve the automated orientation procedure and dense 3D point cloud reconstruction, even in the case of poor texture scenarios.

  2. A comparative analysis of pre-processing techniques in colour retinal images

    International Nuclear Information System (INIS)

    Salvatelli, A; Bizai, G; Barbosa, G; Drozdowicz, B; Delrieux, C

    2007-01-01

    Diabetic retinopathy (DR) is a chronic disease of the ocular retina, which most of the time is only discovered when the disease is at an advanced stage and most of the damage is irreversible. For that reason, early diagnosis is paramount for avoiding the most severe consequences of DR, of which complete blindness is not uncommon. Unsupervised or supervised image processing of retinal images emerges as a feasible tool for this diagnosis. The preprocessing stages are the key for any further assessment, since these images exhibit several defects, including non-uniform illumination, sampling noise, uneven contrast due to pigmentation loss during sampling, and many others. Any feasible diagnosis system should work with images where these defects have been compensated. In this work we analyze and test several correction techniques. Non-uniform illumination is compensated using morphology and homomorphic filtering; uneven contrast is compensated using morphology and local enhancement. We tested our processing stages using Fuzzy C-Means, and the local Hurst (self correlation) coefficient for unsupervised segmentation of the abnormal blood vessels. The results over a standard set of DR images are more than promising
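
    Homomorphic filtering exploits the fact that a slowly varying illumination field multiplies the reflectance; taking logarithms turns the product into a sum, so a high-pass filter in the Fourier domain suppresses the illumination component. A minimal NumPy sketch (filter shape and gains are invented, not the authors' settings):

```python
import numpy as np

def homomorphic(img, sigma=30.0, gain_low=0.5, gain_high=1.5):
    """Log-domain Gaussian high-pass: attenuate low frequencies
    (illumination) and mildly boost the rest (reflectance detail)."""
    rows, cols = img.shape
    log_img = np.log1p(img.astype(float))
    fy = np.fft.fftfreq(rows)[:, None] * rows
    fx = np.fft.fftfreq(cols)[None, :] * cols
    d2 = (fy / sigma) ** 2 + (fx / sigma) ** 2
    h = gain_low + (gain_high - gain_low) * (1.0 - np.exp(-d2))
    return np.expm1(np.real(np.fft.ifft2(h * np.fft.fft2(log_img))))

rng = np.random.default_rng(6)
yy, xx = np.mgrid[:128, :128]
img = (100 + 80 * xx / 127.0                  # left-to-right illumination
       + 20 * (np.sin(yy / 4.0) > 0)          # fine structure
       + rng.normal(0, 2, size=(128, 128)))
flat = homomorphic(img)
print(np.ptp(img.mean(axis=0)), np.ptp(flat.mean(axis=0)))  # spread shrinks
```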

  3. Robust preprocessing for stimulus-based functional MRI of the moving fetus.

    Science.gov (United States)

    You, Wonsang; Evangelou, Iordanis E; Zun, Zungho; Andescavage, Nickie; Limperopoulos, Catherine

    2016-04-01

    Fetal motion manifests as signal degradation and image artifact in the acquired time series of blood oxygen level dependent (BOLD) functional magnetic resonance imaging (fMRI) studies. We present a robust preprocessing pipeline to specifically address fetal and placental motion-induced artifacts in stimulus-based fMRI with slowly cycled block design in the living fetus. In the proposed pipeline, motion correction is optimized to the experimental paradigm, and it is performed separately in each phase as well as in each region of interest (ROI), recognizing that each phase and organ experiences different types of motion. To obtain the averaged BOLD signals for each ROI, both misaligned volumes and noisy voxels are automatically detected and excluded, and the missing data are then imputed by statistical estimation based on local polynomial smoothing. Our experimental results demonstrate that the proposed pipeline was effective in mitigating the motion-induced artifacts in stimulus-based fMRI data of the fetal brain and placenta.

  4. The PREP Pipeline: Standardized preprocessing for large-scale EEG analysis

    Directory of Open Access Journals (Sweden)

    Nima Bigdely-Shamlo

    2015-06-01

    Full Text Available The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode/.
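
    The contamination problem the abstract describes can be illustrated with a much-simplified, single-pass stand-in for PREP's multi-stage robust referencing (PREP itself iterates, interpolates bad channels, and does considerably more; channel count, threshold and data below are invented):

```python
import numpy as np

def robust_average_reference(eeg, z_thresh=5.0):
    """Reference to the mean of channels whose robust amplitude is not an
    outlier, so noisy channels do not contaminate the reference."""
    dev = np.median(np.abs(eeg - np.median(eeg, axis=1, keepdims=True)), axis=1)
    med = np.median(dev)
    mad = np.median(np.abs(dev - med)) + 1e-12
    good = np.abs(dev - med) / (1.4826 * mad) < z_thresh
    return eeg - eeg[good].mean(axis=0), np.where(~good)[0]

rng = np.random.default_rng(7)
eeg = rng.normal(0, 10, size=(32, 5000))   # 32 channels, arbitrary units
eeg[7] += rng.normal(0, 200, size=5000)    # one very noisy channel
referenced, bad = robust_average_reference(eeg)
print("channels excluded from the reference:", bad)   # -> [7]
```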

  5. Improving the performance of streamflow forecasting model using data-preprocessing technique in Dungun River Basin

    Science.gov (United States)

    Khai Tiu, Ervin Shan; Huang, Yuk Feng; Ling, Lloyd

    2018-03-01

    An accurate streamflow forecasting model is important for the development of a flood mitigation plan, to ensure sustainable development of a river basin. This study adopted the Variational Mode Decomposition (VMD) data-preprocessing technique to process and denoise the rainfall data before feeding it into the Support Vector Machine (SVM) streamflow forecasting model, in order to improve the performance of the selected model. Rainfall data and river water level data for the period 1996-2016 were used for this purpose. Homogeneity tests (the Standard Normal Homogeneity Test, the Buishand Range Test, the Pettitt Test and the Von Neumann Ratio Test) and normality tests (the Shapiro-Wilk Test, the Anderson-Darling Test, the Lilliefors Test and the Jarque-Bera Test) were carried out on the rainfall series. Homogeneous and non-normally distributed data, respectively, were found in all the stations. From the recorded rainfall data, it was observed that the Dungun River Basin receives higher monthly rainfall from November to February, during the Northeast Monsoon. Thus, the monthly and seasonal rainfall series of this monsoon are the main focus of this research, as floods usually happen during the Northeast Monsoon period. The predicted water levels from the SVM model were assessed against the observed water levels using non-parametric statistical tests (the Biased Method, Kendall's Tau B Test and Spearman's Rho Test).
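
    A rough sketch of the decompose-then-forecast pattern follows, with a Savitzky-Golay smooth standing in for VMD, synthetic data standing in for the Dungun series, and untuned SVR hyperparameters; it only illustrates the shape of the workflow, not the study's method.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.svm import SVR

# Synthetic monthly rainfall and water level series.
rng = np.random.default_rng(9)
m = np.arange(240)
rain = 50 + 40 * np.sin(2 * np.pi * m / 12) ** 2 + rng.gamma(2, 5, size=240)
level = 0.02 * rain + 0.3 + rng.normal(0, 0.05, size=240)

# "Denoise" the predictor before it enters the forecasting model.
rain_dn = savgol_filter(rain, window_length=7, polyorder=2)

# Three lagged rainfall values -> next month's water level.
lags = 3
X = np.column_stack([rain_dn[i:len(rain_dn) - lags + i] for i in range(lags)])
y = level[lags:]
split = 200
model = SVR(C=10.0, epsilon=0.01).fit(X[:split], y[:split])
rmse = np.sqrt(np.mean((model.predict(X[split:]) - y[split:]) ** 2))
print(f"test RMSE: {rmse:.3f}")
```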

  6. An Application for Data Preprocessing and Models Extractions in Web Usage Mining

    Directory of Open Access Journals (Sweden)

    Claudia Elena DINUCA

    2011-11-01

    Full Text Available Web servers worldwide generate a vast amount of information on web users’ browsing activities. Several researchers have studied these so-called clickstream or web access log data to better understand and characterize web users. The goal of this application is to analyze user behaviour by mining enriched web access log data. With the continued growth and proliferation of e-commerce, Web services, and Web-based information systems, the volumes of click stream and user data collected by Web-based organizations in their daily operations has reached astronomical proportions. This information can be exploited in various ways, such as enhancing the effectiveness of websites or developing directed web marketing campaigns. The discovered patterns are usually represented as collections of pages, objects, or re-sources that are frequently accessed by groups of users with common needs or interests. In this paper we will focus on displaying the way how it was implemented the application for data preprocessing and extracting different data models from web logs data, finding association as a data mining technique to extract potentially useful knowledge from web usage data. We find different data models navigation patterns by analysing the log files of the web-site. I implemented the application in Java using NetBeans IDE. For exemplification, I used the log files data from a commercial web site www.nice-layouts.com.

  7. A comparative analysis of pre-processing techniques in colour retinal images

    Energy Technology Data Exchange (ETDEWEB)

    Salvatelli, A [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Bizai, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Barbosa, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Drozdowicz, B [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Delrieux, C [Electric and Computing Engineering Department, Universidad Nacional del Sur, Alem 1253, Bahía Blanca, (Partially funded by SECyT-UNS) (Argentina)], E-mail: claudio@acm.org

    2007-11-15

    Diabetic retinopathy (DR) is a chronic disease of the ocular retina, which most of the time is only discovered when the disease is at an advanced stage and most of the damage is irreversible. For that reason, early diagnosis is paramount for avoiding the most severe consequences of DR, of which complete blindness is not uncommon. Unsupervised or supervised image processing of retinal images emerges as a feasible tool for this diagnosis. The preprocessing stages are the key for any further assessment, since these images exhibit several defects, including non-uniform illumination, sampling noise, uneven contrast due to pigmentation loss during sampling, and many others. Any feasible diagnosis system should work with images where these defects have been compensated. In this work we analyze and test several correction techniques. Non-uniform illumination is compensated using morphology and homomorphic filtering; uneven contrast is compensated using morphology and local enhancement. We tested our processing stages using Fuzzy C-Means, and the local Hurst (self correlation) coefficient for unsupervised segmentation of the abnormal blood vessels. The results over a standard set of DR images are more than promising.

  8. Tools and databases of the KOMICS web portal for preprocessing, mining, and dissemination of metabolomics data.

    Science.gov (United States)

    Sakurai, Nozomu; Ara, Takeshi; Enomoto, Mitsuo; Motegi, Takeshi; Morishita, Yoshihiko; Kurabayashi, Atsushi; Iijima, Yoko; Ogata, Yoshiyuki; Nakajima, Daisuke; Suzuki, Hideyuki; Shibata, Daisuke

    2014-01-01

    A metabolome--the collection of comprehensive quantitative data on metabolites in an organism--has been increasingly utilized for applications such as data-intensive systems biology, disease diagnostics, biomarker discovery, and assessment of food quality. A considerable number of tools and databases have been developed to date for the analysis of data generated by various combinations of chromatography and mass spectrometry. We report here a web portal named KOMICS (The Kazusa Metabolomics Portal), where the tools and databases that we developed are available for free to academic users. KOMICS includes the tools and databases for preprocessing, mining, visualization, and publication of metabolomics data. Improvements in the annotation of unknown metabolites and dissemination of comprehensive metabolomic data are the primary aims behind the development of this portal. For this purpose, PowerGet and FragmentAlign include a manual curation function for the results of metabolite feature alignments. A metadata-specific wiki-based database, Metabolonote, functions as a hub of web resources related to the submitters' work. This feature is expected to increase citation of the submitters' work, thereby promoting data publication. As an example of the practical use of KOMICS, a workflow for a study on Jatropha curcas is presented. The tools and databases available at KOMICS should contribute to enhanced production, interpretation, and utilization of metabolomic Big Data.

  9. The PREP pipeline: standardized preprocessing for large-scale EEG analysis.

    Science.gov (United States)

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode.

  10. Constant time distance queries in planar unweighted graphs with subquadratic preprocessing time

    DEFF Research Database (Denmark)

    Wulff-Nilsen, C.

    2013-01-01

    Let G be an n-vertex planar, undirected, and unweighted graph. It was stated as open problems whether the Wiener index, defined as the sum of all-pairs shortest path distances, and the diameter of G can be computed in o(n^2) time. We show that both problems can be solved in O(n^2 log log n / log n......) time with O(n) space. The techniques that we apply allow us to build, within the same time bound, an oracle for exact distance queries in G. More generally, for any parameter S in [(log n / log log n)^2, n^(2/5)], distance queries can be answered in O(sqrt(S) log S / log n) time per query...... with O(n^2 / sqrt(S)) preprocessing time and space requirement. With respect to running time, this is better than previous algorithms when log S = o(log n). All algorithms have a linear space requirement. Our results generalize to a larger class of graphs including those with a fixed excluded minor. (C) 2012...

  11. Fast data preprocessing for chromatographic fingerprints of tomato cell wall polysaccharides using chemometric methods.

    Science.gov (United States)

    Quéméner, Bernard; Bertrand, Dominique; Marty, Isabelle; Causse, Mathilde; Lahaye, Marc

    2007-02-02

    The variability in the chemistry of cell wall polysaccharides in pericarp tissue of red-ripe tomato fruit (Solanum lycopersicon Mill.) was characterized by chemical methods and enzymatic degradations coupled to high performance anion exchange chromatography (HPAEC) and mass spectrometry analysis. The large-fruited line Levovil (LEV), carrying introgressed chromosome fragments from the cherry tomato line Cervil (CER) on chromosome 4 (LC4), chromosome 9 (LC9), or chromosomes 1, 2, 4 and 9 (LCX) and containing quantitative trait loci (QTLs) for texture traits, was studied. In order to differentiate cell wall polysaccharide modifications in the tomato fruit collection by multivariate analysis, chromatograms were corrected for baseline drift and shift of the component elution time using an approach derived from image analysis and mathematical morphology. The baseline was first corrected by using a "moving window" approach, while the peak-matching method developed was based upon the location of peaks as local maxima within a window of a definite size. The fast chromatographic data preprocessing proposed was a prerequisite for the different chemometric treatments applied herein, such as the analysis of variance and principal component analysis. Applied to the tomato collection, the combined enzymatic degradations and HPAEC analyses revealed that the firm LCX and CER genotypes showed a higher proportion of glucuronoxylans and pectic arabinan side chains while the mealy LC9 genotype demonstrated the highest content of pectic galactan side chains. QTLs on tomato chromosomes 1, 2, 4 and 9 contain important genes controlling glucuronoxylan and pectic neutral side chain biosynthesis and/or metabolism.
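
    The two pre-processing operations named here are easy to sketch: a moving-window baseline (a running minimum stands in for whatever estimator was actually used) and peak location as local maxima within a window, via scipy.signal.find_peaks. Peak positions, drift and noise below are synthetic.

```python
import numpy as np
from scipy.ndimage import minimum_filter1d
from scipy.signal import find_peaks

rng = np.random.default_rng(8)
t = np.linspace(0, 60, 3000)                   # retention time, minutes
peaks_true = [(40, 12), (25, 23), (60, 41)]    # (height, center)
signal = sum(h * np.exp(-0.5 * ((t - c) / 0.3) ** 2) for h, c in peaks_true)
chrom = signal + (5 + 0.2 * t) + rng.normal(0, 0.5, t.size)   # drift + noise

# Moving-window baseline: a running minimum tracks the drift under the
# peaks and is subtracted from the trace.
corrected = chrom - minimum_filter1d(chrom, size=301)

# Peaks as local maxima above a noise threshold, separated by a window.
idx, _ = find_peaks(corrected, height=10, distance=50)
print("detected retention times:", np.round(t[idx], 1))
```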

  12. Min st-cut oracle for planar graphs with near-linear preprocessing time

    DEFF Research Database (Denmark)

    Borradaile, Glencora; Sankowski, Piotr; Wulff-Nilsen, Christian

    2010-01-01

    For an undirected n-vertex planar graph G with non-negative edge-weights, we consider the following type of query: given two vertices s and t in G, what is the weight of a min st-cut in G? We show how to answer such queries in constant time with O(n log^5 n) preprocessing time and O(n log n) space...... We use a Gomory-Hu tree to represent all the pairwise min st-cuts implicitly. Previously, no subquadratic time algorithm was known for this problem. Our oracle can be extended to report the min st-cuts in time proportional to their size. Since all-pairs min st-cut and the minimum cycle basis are dual...... problems in planar graphs, we also obtain an implicit representation of a minimum cycle basis in O(n log^5 n) time and O(n log n) space and an explicit representation with additional O(C) time and space where C is the size of the basis. To obtain our results, we require that shortest paths be unique......
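
    The query side of the Gomory-Hu representation is easy to demonstrate with networkx (whose gomory_hu_tree builds the tree by repeated max-flow computations, not by the paper's near-linear planar-graph algorithm): the min st-cut value is the smallest edge weight on the unique s-t path in the tree. Graph and capacities below are invented.

```python
import networkx as nx

G = nx.Graph()
G.add_weighted_edges_from(
    [(0, 1, 3), (1, 2, 2), (2, 3, 4), (3, 0, 1), (1, 3, 5)], weight="capacity")

T = nx.gomory_hu_tree(G, capacity="capacity")

def min_st_cut(tree, s, t):
    """Min s-t cut value = smallest edge weight on the s-t tree path."""
    path = nx.shortest_path(tree, s, t)
    return min(tree[u][v]["weight"] for u, v in zip(path, path[1:]))

print(min_st_cut(T, 0, 2))   # 4: cutting edges 0-1 and 0-3 separates 0 from 2
```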

  13. Savings and Debts in Agriculture

    Directory of Open Access Journals (Sweden)

    Marina Luminita Sarbovan

    2012-05-01

    Full Text Available The savings and debts problematic brings us back to the Keynesian principles of supporting global demand, so spectacularly immortalized in his "General Theory". The architects of the European Union consider that production in agriculture and other economic branches is "ab initio" grounded in the credit mechanism administrated by banks: the present-day approach to the agricultural process configures it as costly, owing to a relatively medium- to long-term duration, and risky, which makes the banking institution important for mitigating such constraints. Romania fights for the ambitious goal of entering the euro zone, and this target became even more challenging after the new EU Regulation No 1176/2011 on the prevention and correction of macroeconomic imbalances, which stipulates a safer surveillance of the member states. In fact, our country has to meet the exigencies of the nominal and real convergence criteria, measured by the European scoreboard and the relevant indexes.

  14. Moonlight project promotes energy-saving technology

    Science.gov (United States)

    Ishihara, A.

    1986-01-01

    In promoting energy saving, the development of energy conservation technologies aimed at raising energy efficiency in the fields of energy conversion, transportation, storage, and consumption is considered to play a crucial role, along with the enactment of legal measures urging the rational use of energy and the implementation of an awareness campaign for energy conservation. Under the Moonlight Project, technical development is at present centered on the following six pillars: (1) large scale energy saving technology; (2) pioneering and fundamental energy saving technology; (3) international cooperative research projects; (4) research and surveys of energy saving technology; (5) energy saving technology development by private industry; and (6) promotion of energy saving through standardization. Heat pumps, magnetohydrodynamic generators and fuel cells are discussed.

  15. Energy savings in Danish residential building stock

    DEFF Research Database (Denmark)

    Tommerup, Henrik M.; Svendsen, Svend

    2006-01-01

    The paper gives a short account of the technical energy-saving possibilities present in existing dwellings and presents a financial methodology used for assessing energy-saving measures. In order to estimate the total savings potential, detailed calculations have been performed in a case with two typical...... buildings representing the residential building stock, and based on these calculations an assessment of the energy-saving potential is performed. A profitable savings potential of energy used for space heating of about 80% is identified over 45 years (until 2050) within the residential building stock...... A large potential for energy savings exists in the Danish residential building stock due to the fact that 75% of the buildings were constructed before 1979, when the first important demands for the energy performance of buildings were introduced. It is also a fact that many buildings in Denmark face

  16. Saving in Sub-Saharan Africa

    OpenAIRE

    Ernest Aryeetey; Christopher Udry

    2000-01-01

    Gross domestic savings in Africa averaged only 8 percent of GDP in the 1980s, compared to 23 percent for Southeast Asia and 35 percent in the Newly Industrialized Economies. Aside from being generally low, saving rates in most of Africa have shown consistent decline over the last thirty years. These savings figures must be considered tentative, because they are derived as a residual in the national accounts from expenditure and production data that are themselves quite unreliable. Notwithstan...

  17. Image-preprocessing method for near-wall particle image velocimetry (PIV) image interrogation with very large in-plane displacement

    International Nuclear Information System (INIS)

    Zhu, Yiding; Yuan, Huijing; Zhang, Chuanhong; Lee, Cunbiao

    2013-01-01

    Accurate particle image velocimetry (PIV) measurements very near the wall are still a great challenge. The problem is compounded by the very large in-plane displacements commonly encountered on PIV images in measurements of hypersonic boundary layers. An improved image-preprocessing method is presented in this paper which extends the traditional window-deformation iterative multigrid scheme to PIV images with very large displacement. Before the interrogation, stationary artificial particles of uniform size are added homogeneously in the wall region. The mean squares of the signal intensities in the flow and in the wall region are postulated to be equal when half the initial interrogation window overlaps the wall region. The initial estimate near the wall is then smoothed with data from both sides of the shear layer to reduce the large random uncertainties. Interrogations in the following iterative steps then converge to the correct results, providing accurate predictions for particle tracking velocimetry. Significant improvement is seen in Monte Carlo simulations and experimental tests. The algorithm successfully extracted the small flow structures of the second-mode wave in the hypersonic boundary layer from PIV images with low signal-to-noise ratios where the traditional method was not successful. (paper)
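
    A hedged sketch of the padding idea described above: fill the static wall region with synthetic particles whose mean-square intensity matches the flow region, so that interrogation windows overlapping the wall stay balanced. The Gaussian-blob particle model and all parameter values are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def pad_wall_with_particles(img, wall_mask, n_particles=2000, sigma=1.2, seed=0):
    """Fill the wall region of a PIV frame with stationary artificial particles.

    img       : 2-D float array, raw PIV frame
    wall_mask : boolean array, True where the wall is
    """
    rng = np.random.default_rng(seed)
    out = img.copy()
    ys, xs = np.nonzero(wall_mask)
    pick = rng.choice(len(ys), size=min(n_particles, len(ys)), replace=False)
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    synth = np.zeros_like(out)
    for y0, x0 in zip(ys[pick], xs[pick]):
        # Uniform-size Gaussian blobs stand in for particle images.
        synth += np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / (2 * sigma ** 2))
    # Match the mean-square intensity of the flow region, as postulated above.
    target = np.mean(img[~wall_mask] ** 2)
    current = np.mean(synth[wall_mask] ** 2)
    if current > 0:
        synth *= np.sqrt(target / current)
    out[wall_mask] = synth[wall_mask]
    return out
```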

  18. Energy savings in CSFR - building sector

    International Nuclear Information System (INIS)

    Jacobsen, F.R.

    1993-01-01

    The Czechoslovak/Danish project on energy savings in buildings shows that it is possible to save up to 30% of the energy used in buildings. 10% can be saved at an investment of 27 bill KCS; the total investment needed to save 30% is 140 bill KCS. Further energy savings can be obtained through more energy-efficient supply systems. Information dissemination is important for the energy-saving programme, as are economic incentives. Investments in energy savings should be profitable for the investor, but this is not the case in the Czech and Slovak republics today. Changes are needed. Energy prices are still too low compared to investment costs. Financing options are not satisfactory for private investors. Price systems do not favour investment in energy savings. Training is needed for boiler operators and energy consultants. Legislation is essential to support the full range of activities in the energy sector. Research and development activities must back up the development of the sector. Pilot projects can illuminate the savings potential. The production of technical equipment for control and metering and the production of insulation materials must be promoted. (AB)

  19. Cogeneration an opportunity for industrial energy saving

    International Nuclear Information System (INIS)

    Pasha, R.A.; Butt, Z.S.

    2011-01-01

    This paper examines cogeneration from the perspective of industrial energy-saving opportunities. The current energy crisis forces industry to find ways to cope with a critical situation. There are several energy-saving options which, if properly planned and implemented, would benefit both industry and the community. One such option is cogeneration, i.e. combined heat and power. The paper reviews the basic methods and types of cogeneration and then discusses the suitability of these options for specific industries. It is found that the process industry in particular stands to benefit from such energy savings. (author)

  20. lop-DWI: A Novel Scheme for Pre-Processing of Diffusion-Weighted Images in the Gradient Direction Domain.

    Science.gov (United States)

    Sepehrband, Farshid; Choupan, Jeiran; Caruyer, Emmanuel; Kurniawan, Nyoman D; Gal, Yaniv; Tieng, Quang M; McMahon, Katie L; Vegh, Viktor; Reutens, David C; Yang, Zhengyi

    2014-01-01

    We describe and evaluate a pre-processing method based on a periodic spiral sampling of diffusion-gradient directions for high angular resolution diffusion magnetic resonance imaging. Our pre-processing method incorporates prior knowledge about the acquired diffusion-weighted signal, facilitating noise reduction. Periodic spiral sampling of gradient direction encodings results in an acquired signal in each voxel that is pseudo-periodic, with characteristics that allow separation of the low-frequency signal from high-frequency noise. Consequently, it enhances local reconstruction of the orientation distribution function used to define fiber tracks in the brain. Denoising with periodic spiral sampling was tested using synthetic data and in vivo human brain images. Both the signal-to-noise ratio and the accuracy of local fiber-track reconstruction improved significantly with our method.
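
    The separation the method exploits can be illustrated with a simple spectral low-pass along the spiral-ordered gradient dimension. The hard cutoff and keep_fraction below are assumptions for illustration, not the paper's exact filter:

```python
import numpy as np

def denoise_voxel_signal(s, keep_fraction=0.25):
    """Low-pass filter one voxel's diffusion-weighted signal.

    s : 1-D array of intensities ordered by the periodic spiral sampling
        of gradient directions, so the underlying signal is pseudo-periodic
        and concentrated at low frequencies.
    """
    S = np.fft.rfft(s)
    cutoff = max(1, int(keep_fraction * len(S)))
    S[cutoff:] = 0.0  # discard the high-frequency part, assumed to be noise
    return np.fft.irfft(S, n=len(s))
```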

  1. PRACTICAL RECOMMENDATIONS OF DATA PREPROCESSING AND GEOSPATIAL MEASURES FOR OPTIMIZING THE NEUROLOGICAL AND OTHER PEDIATRIC EMERGENCIES MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Ionela MANIU

    2017-08-01

    Full Text Available Time management, optimal and timely determination of emergency severity, and optimal use of available human and material resources are crucial concerns of emergency services. A starting point for achieving these optimizations is the analysis and preprocessing of real data from the emergency services. The benefit of this step is that it exposes more useful structure to data modelling algorithms, which consequently reduces overfitting and improves accuracy. This paper offers practical recommendations for data preprocessing measures, including feature selection and discretization of numeric attributes regarding age, duration of the case, season, period, week period (workday, weekend) and geospatial location of neurological and other pediatric emergencies. An analytical, retrospective study was conducted on a sample of 933 pediatric cases from UPU-SMURD Sibiu over the period 01.01.2014 - 27.02.2017.
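
    A minimal pandas sketch of the kind of discretization recommended here; the bin edges, labels and column names are assumptions, not the paper's exact scheme:

```python
import pandas as pd

# Toy records standing in for the pediatric emergency data.
cases = pd.DataFrame({
    "age_years": [0.5, 3, 7, 12, 16],
    "duration_min": [12, 45, 95, 20, 240],
    "timestamp": pd.to_datetime([
        "2014-03-01 08:15", "2015-07-12 23:40", "2016-01-05 14:00",
        "2016-11-20 03:30", "2017-02-10 19:05",
    ]),
})
cases["age_group"] = pd.cut(
    cases["age_years"], bins=[0, 1, 5, 10, 14, 18],
    labels=["infant", "toddler", "child", "preteen", "teen"])
cases["duration_band"] = pd.cut(
    cases["duration_min"], bins=[0, 30, 120, float("inf")],
    labels=["short", "medium", "long"])
cases["weekend"] = cases["timestamp"].dt.dayofweek >= 5
cases["season"] = cases["timestamp"].dt.month % 12 // 3  # 0=winter ... 3=autumn
```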

  2. Novel low-power ultrasound digital preprocessing architecture for wireless display.

    Science.gov (United States)

    Levesque, Philippe; Sawan, Mohamad

    2010-03-01

    A complete hardware-based ultrasound preprocessing unit (PPU) is presented as an alternative to available power-hungry devices. Intended to broaden ultrasonic applications, the proposed unit allows the cable of the ultrasonic probe to be replaced by a wireless link that transfers data from the probe to a remote monitor. The digital back-end architecture of this PPU is fully pipelined, which permits sampling of ultrasonic signals at a frequency equal to the field-programmable gate array-based system clock, up to 100 MHz. Experimental results show that the proposed processing unit has excellent performance, an equivalent 53.15 Dhrystone 2.1 MIPS/MHz (DMIPS/MHz), compared with other software-based architectures that allow a maximum of 1.6 DMIPS/MHz. In addition, an adaptive subsampling method is proposed to operate the pixel compressor, which allows real-time image zooming and, by removing high-frequency noise, enhances the lateral and axial resolutions by 25% and 33%, respectively. Real-time images acquired from a reference phantom validated the feasibility of the proposed architecture. For a display rate of 15 frames per second and a 5-MHz single-element piezoelectric transducer, the proposed digital PPU requires a dynamic power of only 242 mW, around 20% of that of the best available software-based system. Furthermore, the digital processing core of the PPU, composed of the ultrasound processor and the image interpolation unit, achieves good power-performance ratios of 26 DMIPS/mW and 43.9 DMIPS/mW at 20-MHz and 100-MHz sample frequencies, respectively.

  3. Hardware Design and Implementation of a Wavelet De-Noising Procedure for Medical Signal Preprocessing

    Directory of Open Access Journals (Sweden)

    Szi-Wen Chen

    2015-10-01

    Full Text Available In this paper, a discrete wavelet transform (DWT) based de-noising method and its application to noise reduction in medical signal preprocessing are introduced. This work focuses on the hardware realization of a real-time wavelet de-noising procedure. The proposed de-noising circuit mainly consists of three modules: a DWT, a thresholding, and an inverse DWT (IDWT) modular circuit. We also propose a novel adaptive thresholding scheme and incorporate it into our wavelet de-noising procedure. Performance was then evaluated on both the software and hardware architectural designs. In addition, the de-noising circuit was implemented by downloading the Verilog codes to a field programmable gate array (FPGA) based platform so that its noise-reduction ability could be further validated in actual practice. Simulation experiments, in which a set of simulated noise-contaminated electrocardiogram (ECG) signals was applied to the de-noising circuit, showed that the circuit not only meets the requirement of real-time processing but also achieves satisfactory noise reduction while well preserving the sharp features of the ECG signals. The proposed de-noising circuit was further synthesized using the Synopsys Design Compiler with an Artisan Taiwan Semiconductor Manufacturing Company (TSMC, Hsinchu, Taiwan) 40 nm standard cell library. The integrated circuit (IC) synthesis simulation results showed that the proposed design can achieve a clock frequency of 200 MHz with a power consumption of only 17.4 mW when operated at 200 MHz.
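
    A software counterpart of the three-module DWT/thresholding/IDWT pipeline can be sketched with PyWavelets. The universal threshold used here is a standard stand-in; the paper's own adaptive thresholding scheme is not reproduced:

```python
import numpy as np
import pywt

def wavelet_denoise(ecg, wavelet="db4", level=4):
    # Module 1: forward DWT.
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    # Module 2: thresholding; noise scale estimated from the finest
    # detail band via the median absolute deviation.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    # Module 3: inverse DWT.
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]
```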

  4. Hardware design and implementation of a wavelet de-noising procedure for medical signal preprocessing.

    Science.gov (United States)

    Chen, Szi-Wen; Chen, Yuan-Ho

    2015-10-16

    In this paper, a discrete wavelet transform (DWT) based de-noising method and its application to noise reduction in medical signal preprocessing are introduced. This work focuses on the hardware realization of a real-time wavelet de-noising procedure. The proposed de-noising circuit mainly consists of three modules: a DWT, a thresholding, and an inverse DWT (IDWT) modular circuit. We also propose a novel adaptive thresholding scheme and incorporate it into our wavelet de-noising procedure. Performance was then evaluated on both the software and hardware architectural designs. In addition, the de-noising circuit was implemented by downloading the Verilog codes to a field programmable gate array (FPGA) based platform so that its noise-reduction ability could be further validated in actual practice. Simulation experiments, in which a set of simulated noise-contaminated electrocardiogram (ECG) signals was applied to the de-noising circuit, showed that the circuit not only meets the requirement of real-time processing but also achieves satisfactory noise reduction while well preserving the sharp features of the ECG signals. The proposed de-noising circuit was further synthesized using the Synopsys Design Compiler with an Artisan Taiwan Semiconductor Manufacturing Company (TSMC, Hsinchu, Taiwan) 40 nm standard cell library. The integrated circuit (IC) synthesis simulation results showed that the proposed design can achieve a clock frequency of 200 MHz with a power consumption of only 17.4 mW when operated at 200 MHz.

  5. Preprocessing of gravity gradients at the GOCE high-level processing facility

    Science.gov (United States)

    Bouman, Johannes; Rispens, Sietse; Gruber, Thomas; Koop, Radboud; Schrama, Ernst; Visser, Pieter; Tscherning, Carl Christian; Veicherts, Martin

    2009-07-01

    One of the products derived from the gravity field and steady-state ocean circulation explorer (GOCE) observations are the gravity gradients. These gravity gradients are provided in the gradiometer reference frame (GRF) and are calibrated in-flight using satellite shaking and star sensor data. To use these gravity gradients for applications in Earth sciences and gravity field analysis, additional preprocessing is needed, including corrections for temporal gravity field signals to isolate the static gravity field part, screening for outliers, calibration by comparison with existing external gravity field information, and error assessment. The temporal gravity gradient corrections consist of tidal and nontidal corrections. These are all generally below the gravity gradient error level, which is predicted to show a 1/f behaviour at low frequencies. In the outlier detection, the 1/f error is compensated for by subtracting a local median from the data, while the data error is assessed using the median absolute deviation. The local median acts as a high-pass filter, and both it and the median absolute deviation are robust. Three different methods have been implemented for the calibration of the gravity gradients. All three methods use a high-pass filter to compensate for the 1/f gravity gradient error. The baseline method uses state-of-the-art global gravity field models, and the most accurate results are obtained if star sensor misalignments are estimated along with the calibration parameters. A second calibration method uses GOCE GPS data to estimate a low-degree gravity field model as well as gravity gradient scale factors. Both methods allow gravity gradient scale factors to be estimated down to the 10^-3 level. The third calibration method uses highly accurate terrestrial gravity data in selected regions to validate the gravity gradient scale factors, focussing on the measurement band. Gravity gradient scale factors may be estimated down to the 10^-2 level with this
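
    The outlier-screening step lends itself to a short sketch: subtract a rolling median (a robust high-pass against the 1/f error) and scale the residuals by the median absolute deviation. The window length and rejection factor are illustrative settings, not mission values:

```python
import numpy as np
import pandas as pd

def screen_gradients(g, window=101, k=4.0):
    """Flag outliers in a gravity-gradient time series with 1/f noise."""
    s = pd.Series(g)
    # Local median as robust high-pass filter.
    resid = s - s.rolling(window, center=True, min_periods=1).median()
    # Median absolute deviation as robust error scale.
    mad = np.median(np.abs(resid - resid.median()))
    robust_sigma = 1.4826 * mad
    return np.abs(resid.to_numpy()) > k * robust_sigma  # True = outlier
```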

  6. Integrated fMRI Preprocessing Framework Using Extended Kalman Filter for Estimation of Slice-Wise Motion

    OpenAIRE

    Basile Pinsard; Arnaud Boutin; Julien Doyon; Habib Benali

    2018-01-01

    Functional MRI acquisition is sensitive to subjects' motion, which cannot be fully constrained. Therefore, signal corrections have to be applied a posteriori in order to mitigate the complex interactions between changing tissue localization and magnetic fields, gradients and readouts. To circumvent the limitations of current preprocessing strategies, we developed an integrated method that corrects motion and spatial low-frequency intensity fluctuations at the level of each slice in order to better fit ...
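
    The record is truncated, but the slice-wise estimation it describes rests on the extended Kalman filter. A generic predict/update skeleton for six rigid-body motion parameters, with a random-walk state model and placeholder measurement functions standing in for the authors' registration-based model:

```python
import numpy as np

def ekf_step(x, P, z, h, H_jac, Q, R):
    """One EKF iteration for a state x of 6 motion parameters
    (3 translations, 3 rotations), updated once per acquired slice."""
    # Predict: random-walk model, state unchanged, uncertainty grows.
    x_pred, P_pred = x, P + Q
    # Update with the slice-wise measurement z.
    H = H_jac(x_pred)              # Jacobian of h at the prediction
    y = z - h(x_pred)              # innovation
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```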

  7. TargetSearch - a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data

    Science.gov (United States)

    2009-01-01

    Background Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. Results We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. Conclusions TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data. PMID:20015393
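
    TargetSearch itself is an R/Bioconductor package; the iterative retention-index strategy can nonetheless be sketched generically in Python. The linear RT-to-RI calibration, tolerance window and iteration count are assumptions, not the package's exact algorithm:

```python
import numpy as np

def iterative_ri_search(peak_rt, lib_ri, rt_markers, ri_markers,
                        tol=5.0, n_iter=3):
    """Calibrate RT -> RI from marker compounds, match peaks to library
    retention indices within a tolerance, then refit on the matches."""
    peak_rt = np.asarray(peak_rt, float)
    rt_m = np.asarray(rt_markers, float)
    ri_m = np.asarray(ri_markers, float)
    for _ in range(n_iter):
        a, b = np.polyfit(rt_m, ri_m, 1)      # linear calibration
        ri_obs = a * peak_rt + b
        matches = [(i, int(np.argmin(np.abs(ri_obs - ri))))
                   for i, ri in enumerate(lib_ri)
                   if np.min(np.abs(ri_obs - ri)) < tol]
        if len(matches) < 2:
            break
        # Refit the calibration on the matched peak/library pairs.
        rt_m = np.array([peak_rt[j] for _, j in matches])
        ri_m = np.array([lib_ri[i] for i, _ in matches])
    return matches
```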

  8. TargetSearch - a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data

    Directory of Open Access Journals (Sweden)

    Lisec Jan

    2009-12-01

    Full Text Available Abstract Background Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. Results We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. Conclusions TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.

  9. TargetSearch--a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data.

    Science.gov (United States)

    Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A

    2009-12-16

    Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.

  10. Effects of different correlation metrics and preprocessing factors on small-world brain functional networks: a resting-state functional MRI study.

    Directory of Open Access Journals (Sweden)

    Xia Liang

    Full Text Available Graph theoretical analysis of brain networks based on resting-state functional MRI (R-fMRI) has attracted a great deal of attention in recent years. These analyses often involve the selection of correlation metrics and specific preprocessing steps. However, the influence of these factors on the topological properties of functional brain networks has not been systematically examined. Here, we investigated the influences of correlation metric choice (Pearson's correlation versus partial correlation), global signal presence (regressed or not) and frequency band selection [slow-5 (0.01-0.027 Hz) versus slow-4 (0.027-0.073 Hz)] on the topological properties of both binary and weighted brain networks derived from them, and we employed test-retest (TRT) analyses for further guidance on how to choose the "best" network modeling strategy from the reliability perspective. Our results show significant differences in global network metrics associated with both correlation metrics and global signals. Analysis of nodal degree revealed differing hub distributions for brain networks derived from Pearson's correlation versus partial correlation. TRT analysis revealed that the reliability of both global and local topological properties is modulated by the correlation metric and the global signal, with the highest reliability observed for Pearson's-correlation-based brain networks without global signal removal (WOGR-PEAR). The nodal reliability exhibited a spatially heterogeneous distribution wherein regions in association and limbic/paralimbic cortices showed moderate TRT reliability in Pearson's-correlation-based brain networks. Moreover, we found significant frequency-related differences in the topological properties of WOGR-PEAR networks, and brain networks derived in the 0.027-0.073 Hz band exhibited greater reliability than those in the 0.01-0.027 Hz band. Taken together, our results provide direct evidence regarding the influences of correlation metrics...
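
    One of the compared pipelines (Pearson correlation, binary thresholding) feeding the graph metrics can be sketched with networkx; the threshold value is an illustrative choice:

```python
import numpy as np
import networkx as nx

def efficiency_from_correlation(ts, threshold=0.3):
    """ts: array of shape (n_rois, n_timepoints).
    Returns (global efficiency, local efficiency) of the binary network."""
    r = np.corrcoef(ts)               # Pearson correlation matrix
    np.fill_diagonal(r, 0.0)          # no self-connections
    G = nx.from_numpy_array((r > threshold).astype(int))
    return nx.global_efficiency(G), nx.local_efficiency(G)
```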

  11. Step by Step Microsoft Office Visio 2003

    CERN Document Server

    Lemke, Judy

    2004-01-01

    Experience learning made easy, and quickly teach yourself how to use Visio 2003, the Microsoft Office business and technical diagramming program. With STEP BY STEP, you can take just the lessons you need, or work from cover to cover. Either way, you drive the instruction, building and practicing the skills you need, just when you need them! Produce computer network diagrams, organization charts, floor plans, and more; use templates to create new diagrams and drawings quickly; add text, color, and 1-D and 2-D shapes; insert graphics and pictures, such as company logos; connect shapes to create a basic f...

  12. THE EFFECT OF DECOMPOSITION METHOD AS DATA PREPROCESSING ON NEURAL NETWORKS MODEL FOR FORECASTING TREND AND SEASONAL TIME SERIES

    Directory of Open Access Journals (Sweden)

    Subanar Subanar

    2006-01-01

    Full Text Available Recently, one of the central topics for the neural networks (NN) community is the issue of data preprocessing in the use of NNs. In this paper, we investigate this topic, particularly the effect of the decomposition method as data preprocessing, and the use of NNs for effectively modeling time series with both trend and seasonal patterns. The limited empirical studies on seasonal time series forecasting with neural networks are divided: some find that neural networks are able to model seasonality directly, so that prior deseasonalization is not necessary, while others conclude just the opposite. In this research, we study the effectiveness of data preprocessing, including detrending and deseasonalization by the decomposition method, on NN modeling and forecasting performance. We use two kinds of data, simulated and real; the simulated data combine trend and seasonality multiplicatively. The results are compared to those obtained from a classical time series model. Our results show that a combination of detrending and deseasonalization via the decomposition method is an effective data preprocessing step for NN forecasting of trend and seasonal time series.
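
    A hedged sketch of this preprocessing step, using statsmodels' classical decomposition on a synthetic multiplicative trend-plus-seasonality series; the series, the multiplicative model choice and the monthly period are illustrative:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Synthetic monthly series with multiplicative trend and seasonality.
t = np.arange(96)
y = pd.Series((t + 1) * (1.0 + 0.3 * np.sin(t * 2 * np.pi / 12)),
              index=pd.date_range("2000-01", periods=96, freq="MS"))

dec = seasonal_decompose(y, model="multiplicative", period=12)
# Detrended and deseasonalized residual, the input handed to the NN;
# trend and seasonal components are multiplied back in after forecasting.
nn_input = (y / (dec.trend * dec.seasonal)).dropna()
```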

  13. Performance Comparison of Several Pre-Processing Methods in a Hand Gesture Recognition System based on Nearest Neighbor for Different Background Conditions

    Directory of Open Access Journals (Sweden)

    Iwan Setyawan

    2012-12-01

    Full Text Available This paper presents a performance analysis and comparison of several pre-processing methods used in a hand gesture recognition system. The pre-processing methods are based on combinations of several image processing operations, namely edge detection, low-pass filtering, histogram equalization, thresholding and desaturation. The hand gesture recognition system is designed to classify an input image into one of six possible classes. The input images are taken with various background conditions. Our experiments showed that the best result is achieved when the pre-processing consists of a desaturation operation alone, with a classification accuracy of up to 83.15%.
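
    The winning pipeline (desaturation only, followed by nearest-neighbor classification) can be sketched with OpenCV and scikit-learn; the image size, the flattening into feature vectors and the training-data names are assumptions:

```python
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def preprocess(img_bgr, size=(64, 64)):
    """Desaturation-only pre-processing: convert to grayscale, resize,
    and flatten into a feature vector."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.resize(gray, size).astype(np.float32).ravel()

# Hypothetical usage: train_imgs / train_labels hold BGR images and the
# six gesture classes; test_img is a new image to classify.
# clf = KNeighborsClassifier(n_neighbors=1)
# clf.fit([preprocess(im) for im in train_imgs], train_labels)
# pred = clf.predict([preprocess(test_img)])
```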

  14. Saving oil in a hurry

    Energy Technology Data Exchange (ETDEWEB)

    none

    2005-07-01

    During 2004, oil prices reached levels unprecedented in recent years. Though world oil markets remain adequately supplied, high oil prices do reflect increasingly uncertain conditions. Many IEA member countries and non-member countries alike are looking for ways to improve their capability to handle market volatility and possible supply disruptions in the future. This book aims to provide assistance. It provides a new, quantitative assessment of the potential oil savings and costs of rapid oil demand restraint measures for transport. Some measures may make sense under any circumstances; others are primarily useful in emergency situations. All can be implemented on short notice, if governments are prepared. The book examines potential approaches for rapid uptake of telecommuting, "ecodriving", and car-pooling, among other measures. It also provides methodologies and data that policymakers can use to decide which measures would be best adapted to their national circumstances. This "tool box" may help countries to complement other measures for coping with supply disruptions, such as use of strategic oil stocks.

  15. Energy-saving motor; Energiesparmotor

    Energy Technology Data Exchange (ETDEWEB)

    Lindegger, M.

    2002-07-01

    This report for the Swiss Federal Office of Energy (SFOE) describes the development and testing of an advanced electrical motor using a permanent-magnet rotor. The aims of the project, to study the technical feasibility and market potential of the Eco-Motor, are discussed and the three phases of the project described. These include the calculation and realisation of a 250-watt prototype operating at 230 V, the measurement of the motor's characteristics as well as those of a comparable asynchronous motor on the test bed at the University of Applied Sciences in Lucerne, Switzerland, and a market study to establish whether the Eco-Motor and its controller can compete against normal asynchronous motors. An analysis of the energy-savings potential, should such Eco-Motors come into use, is also presented. Detailed results of the three phases of the project are given, and the prospects of producing such motors in Switzerland for home use as well as for export are examined.

  16. Enershield : energy saving air barriers

    Energy Technology Data Exchange (ETDEWEB)

    Hallihan, D. [Enershield Industries Ltd., Edmonton, AB (Canada)

    2008-07-01

    Enershield Industries is a leader in air barrier technology and provides solutions for the Canadian climate. This presentation described the advantages of air barriers and the impact of rising energy costs. An air barrier is used to separate areas with differing environments and makes existing building systems more efficient. This presentation discussed how an air barrier works. It also described how Enershield Industries calculates energy savings. It covered air barrier applications and the users of barrier technology, which include the commercial and industrial sectors as well as the personnel and retail sectors. Barrier technology can be used for cold storage; vehicle and equipment washes; food processing; and environmental separation. Features and benefits such as the ability to create a seal, acoustic insulation, and long-term durability were also discussed. Last, the presentation addressed model selection and design criteria. Design criteria presented included acoustic installation, articulating nozzles, scroll-cased fans, and structural frames, as well as galvanized frames, telescopic sliders, and off-the-shelf parts. It was concluded that the ability to reduce energy consumption and enhance employee/client comfort benefits employer and employee alike. figs.

  17. Free Modal Algebras Revisited: The Step-by-Step Method

    NARCIS (Netherlands)

    Bezhanishvili, N.; Ghilardi, Silvio; Jibladze, Mamuka

    2012-01-01

    We review the step-by-step method of constructing finitely generated free modal algebras. First we discuss the global step-by-step method, which works well for rank one modal logics. Next we refine the global step-by-step method to obtain the local step-by-step method, which is applicable beyond

  18. Diabetes PSA (:30) Step By Step

    Centers for Disease Control (CDC) Podcasts

    2009-10-24

    First steps to preventing diabetes. For Hispanic and Latino American audiences.  Created: 10/24/2009 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 10/24/2009.

  19. Diabetes PSA (:60) Step By Step

    Centers for Disease Control (CDC) Podcasts

    2009-10-24

    First steps to preventing diabetes. For Hispanic and Latino American audiences.  Created: 10/24/2009 by National Diabetes Education Program (NDEP), a joint program of the Centers for Disease Control and Prevention and the National Institutes of Health.   Date Released: 10/24/2009.

  20. Concepts. Environmental care through energy saving

    Energy Technology Data Exchange (ETDEWEB)

    Wagner, G.

    1987-04-01

    Energy saving is an important ingredient of a preventive energy policy. It helps to reduce pollutants which are one essential source of damage done to air, water and soil. But even the environmentally damaging side effects of energy production, storage and distribution can be cut down through energy saving.

  1. Housing-related lifestyle and energy saving

    DEFF Research Database (Denmark)

    Thøgersen, John

    2017-01-01

    ... of relevant background characteristics. A multivariate GLM analysis reveals that, when differences in housing-related lifestyles are controlled for, neither country of residence nor the interaction between lifestyle and country of residence influences energy-saving innovativeness or everyday energy-saving efforts...

  2. Mozambican Aggregate Consumption and Domestic Saving ...

    African Journals Online (AJOL)

    It was an unprecedented decade for its break with the previous trend; but so far, the new trend does not correspond to a substantial change in growth strategy to ensure that foreign savings become complementary rather than a substitute for domestic savings. Keywords: consumption, economic growth strategy, domestic ...

  3. 37 CFR 11.61 - Savings clause.

    Science.gov (United States)

    2010-07-01

    37 CFR 11.61 (2010-07-01), Patents, Trademarks, and Copyrights: Disciplinary Proceedings; Jurisdiction, Sanctions, Investigations, and Proceedings. § 11.61 Savings clause. (a) ... subsequent to such effective date, if such conduct would continue to justify suspension or exclusion under ...

  4. Saving Energy. Managing School Facilities, Guide 3.

    Science.gov (United States)

    Department for Education and Employment, London (England). Architects and Building Branch.

    This guide offers information on how schools can implement an energy saving action plan to reduce their energy costs. Various low-cost energy-saving measures are recommended covering heating levels and heating systems, electricity demand reduction and lighting, ventilation, hot water usage, and swimming pool energy management. Additional…

  5. The High Cost of Saving Energy Dollars.

    Science.gov (United States)

    Rose, Patricia

    1985-01-01

    In alternative financing a private company provides the capital and expertise for improving school energy efficiency. Savings are split between the school system and the company. Options for municipal leasing, cost sharing, and shared savings are explained along with financial, procedural, and legal considerations. (MLF)

  6. 10 CFR 436.20 - Net savings.

    Science.gov (United States)

    2010-01-01

    10 CFR 436.20 (2010-01-01), Energy: Energy Conservation, Federal Energy Management and Planning Programs, Methodology and Procedures for Life Cycle Cost Analyses. § 436.20 Net savings. For a retrofit project, net savings may be found by subtracting life cycle costs based on the proposed project from life cycle costs based on not having it. For a...

  7. Savings Behavior and Satisfaction with Savings: A Comparison of Low- and High-Income Groups.

    Science.gov (United States)

    Davis, Elizabeth P.; Schumm, Walter R.

    1987-01-01

    Data on 1,739 married couples from 13 states were analyzed. Associations between satisfaction with savings and level of savings with measures of motivation to save, motivations to spend, and family resources were found to differ substantially between low- and high-income couples. (Author/CH)

  8. Risk transfer via energy savings insurance

    Energy Technology Data Exchange (ETDEWEB)

    Mills, Evan

    2001-10-01

    Among the key barriers to investment in energy efficiency improvements are uncertainties about attaining projected energy savings and apprehension about potential disputes over these savings. The fields of energy management and risk management are thus intertwined. While many technical methods have emerged to manage performance risks (e.g. building commissioning), financial risk transfer techniques are less developed in the energy management arena than in other more mature segments of the economy. Energy Savings Insurance (ESI) - formal insurance of predicted energy savings - is one method of transferring financial risks away from the facility owner or energy services contractor. ESI offers a number of significant advantages over other forms of financial risk transfer, e.g. savings guarantees or performance bonds. ESI providers manage risk via pre-construction design review as well as post-construction commissioning and measurement and verification of savings. We found that the two most common criticisms of ESI - excessive pricing and onerous exclusions - are not borne out in practice. In fact, if properly applied, ESI can potentially reduce the net cost of energy savings projects by reducing the interest rates charged by lenders, and by increasing the level of savings through quality control. Debt service can also be ensured by matching loan payments to projected energy savings while designing the insurance mechanism so that payments are made by the insurer in the event of a savings shortfall. We estimate a U.S. ESI market potential of $875 million/year in premium income. From an energy-policy perspective, ESI offers a number of potential benefits: ESI transfers performance risk from the balance sheet of the entity implementing the energy savings project, thereby freeing up capital otherwise needed to "self-insure" the savings. ESI reduces barriers to market entry of smaller energy services firms who do not have sufficiently strong balance...

  9. Risk transfer via energy savings insurance; TOPICAL

    International Nuclear Information System (INIS)

    Mills, Evan

    2001-01-01

    Among the key barriers to investment in energy efficiency improvements are uncertainties about attaining projected energy savings and apprehension about potential disputes over these savings. The fields of energy management and risk management are thus intertwined. While many technical methods have emerged to manage performance risks (e.g. building commissioning), financial risk transfer techniques are less developed in the energy management arena than in other more mature segments of the economy. Energy Savings Insurance (ESI) - formal insurance of predicted energy savings - is one method of transferring financial risks away from the facility owner or energy services contractor. ESI offers a number of significant advantages over other forms of financial risk transfer, e.g. savings guarantees or performance bonds. ESI providers manage risk via pre-construction design review as well as post-construction commissioning and measurement and verification of savings. We found that the two most common criticisms of ESI - excessive pricing and onerous exclusions - are not borne out in practice. In fact, if properly applied, ESI can potentially reduce the net cost of energy savings projects by reducing the interest rates charged by lenders, and by increasing the level of savings through quality control. Debt service can also be ensured by matching loan payments to projected energy savings while designing the insurance mechanism so that payments are made by the insurer in the event of a savings shortfall. We estimate a U.S. ESI market potential of $875 million/year in premium income. From an energy-policy perspective, ESI offers a number of potential benefits: ESI transfers performance risk from the balance sheet of the entity implementing the energy savings project, thereby freeing up capital otherwise needed to "self-insure" the savings. ESI reduces barriers to market entry of smaller energy services firms who do not have sufficiently strong balance sheets to self...

  10. Saving electricity in a hurry - update 2011

    Energy Technology Data Exchange (ETDEWEB)

    Pasquier, Sara Bryan

    2011-06-15

    As demonstrated by the March 2011 earthquake and tsunami-triggered blackouts in Japan, electricity shortfalls can happen anytime and anywhere. Countries can minimise the negative economic, social and environmental impacts of such electricity shortfalls by developing emergency energy-saving strategies before a crisis occurs. This new IEA report highlights preliminary findings and conclusions from electricity shortfalls in Japan, the United States, New Zealand, South Africa and Chile. It draws on recent analysis to: reinforce well-established guidelines on diagnosing electricity shortfalls, identifying energy-saving opportunities and selecting a package of energy-saving measures; and highlight proven practice for implementing emergency energy-saving programmes. This paper will be valuable to government, academic, private-sector and civil-society stakeholders who inform, develop and implement electricity policy in general, and emergency energy-saving programmes in particular.

  11. ONU Power Saving Scheme for EPON System

    Science.gov (United States)

    Mukai, Hiroaki; Tano, Fumihiko; Tanaka, Masaki; Kozaki, Seiji; Yamanaka, Hideaki

    PON (Passive Optical Network) achieves FTTH (Fiber To The Home) economically by sharing an optical fiber among multiple subscribers. Recently, global climate change has been recognized as a serious near-term problem, making power-saving techniques for electronic devices important. In PON systems, an ONU (Optical Network Unit) power saving scheme has been studied and defined for XG-PON. In this paper, we propose an ONU power saving scheme for EPON. We then present an analysis of the power reduction effect and the data transmission delay caused by the ONU power saving scheme. Based on this analysis, we propose an efficient provisioning method for the ONU power saving scheme which is applicable to both XG-PON and EPON.

  12. Values and Technologies in Energy Savings

    DEFF Research Database (Denmark)

    Nørgård, Jørgen Stig

    2000-01-01

    The chapter is based on the assumption that technology improvement is not sufficient to achieve a sustainable world community; changes in people's values are also necessary. A simple model suggests how values, together with basic needs and the environmental and societal frames, determine people's behavioural patterns and lifestyles. Deliberate changes in social values are illustrated by a historical example. On the technology side, the basic principles of the economics of energy savings are briefly described. A marginally profitable energy saving provides an economic saving. The application of this saving can cause what is called the rebound effect, which reduces the savings obtained from the technology. Ways to avoid this effect are suggested; they require value changes, primarily around frugality, consumption, and hard work. There are indications that some of the necessary changes are well...

  13. Breakthrough Energy Savings with Waterjet Technology

    Energy Technology Data Exchange (ETDEWEB)

    Lee W. Saperstein; R. Larry Grayson; David A. Summers; Jorge Garcia-Joo; Greg Sutton; Mike Woodward; T.P. McNulty

    2007-05-15

    Experiments performed at the University of Missouri-Rolla's Waterjet Laboratory have clearly demonstrated the ability of waterjets to disaggregate, in a single step, four different mineral ores, including ores containing iron, lead and copper products. The study focused mainly on galena-bearing dolomite, a lead ore, and compared the new technology with traditional mining and milling for liberating the valuable constituent from the more voluminous host rock. The technical term for the disintegration of the ore to achieve this liberation is comminution. The potential for energy savings, if this process can be improved, is immense. Further, if this separation can be made at the mining face, then the potential energy savings include avoidance of transportation (haulage and hoisting) costs to move, process and store this waste at the surface. The waste can, instead, be disposed of into the available cavities within the mine. The savings also include the elimination of the comminution (crushing and grinding) stages in the processing plant. Future prototype developments are intended to determine whether high-pressure waterjet mining and processing can be optimized to become cheaper than traditional fragmentation by drilling and blasting, and to optimize the separation process. The basic new mining process was illustrated in tests on two local rock types: a low-strength sandstone with hematite inclusions, and a medium- to high-strength dolomite commonly used for construction materials. Illustrative testing of mineral liberation utilized a lead-bearing dolomite and included a parametric study of the optimal conditions needed to create a size distribution considered best for separation. The target goal was to have 50 percent of the mined material finer than 100 mesh (149 microns). Of the 21 tests that were run, five clearly achieved the target. The samples were obtained as run-of-mine lumps of ore, which exhibited a great deal of heterogeneity. This...

  14. Can this merger be saved?

    Science.gov (United States)

    Cliffe, S

    1999-01-01

    In this fictional case study, a merger that looked like a marriage made in heaven to those at corporate headquarters is feeling like an infernal union to those on the ground. The merger is between Synergon Capital, a U.S. financial-services behemoth, and Beauchamp, Becker & Company, a venerable British financial-services company with strong profits and an extraordinarily loyal client base of wealthy individuals. Beauchamp also boasts a strong group of senior managers led by Julian Mansfield, a highly cultured and beloved patriarch who personifies all that's good about the company. Synergon isn't accustomed to acquiring such companies. It usually encircles a poorly managed turnaround candidate and then, once the deal is done, drops a neutron bomb on it, leaving file cabinets and contracts but no people. Before acquiring Beauchamp, Synergon's macho men offered loud assurances that they would leave the tradition-bound company alone-provided, of course, that Beauchamp met the ambitious target numbers and showed sufficient enthusiasm for cross-selling Synergon's products to its wealthy clients. In charge of making the acquisition work is Nick Cunningham, one of Synergon's more thoughtful executives. Nick, who was against the deal from the start, is the face and voice of Synergon for Julian Mansfield. And Mansfield, in his restrained way, is angry at the constant flow of bureaucratic forms, at the rude demands for instant information, at the peremptory changes. He's even dropping broad hints at retirement. Nick has already been warned: if Mansfield goes, you go. Six commentators advise Nick on how to save his job by bringing peace and prosperity to the feuding couple.

  15. A Probabilistic Approach to Network Event Formation from Pre-Processed Waveform Data

    Science.gov (United States)

    Kohl, B. C.; Given, J.

    2017-12-01

    The current state of the art for seismic event detection still largely depends on signal detection at individual sensor stations, including picking accurate arrival times and correctly identifying phases, and relies on fusion algorithms to associate individual signal detections into event hypotheses. But increasing computational capability has enabled progress toward the objective of fully utilizing body-wave recordings in an integrated manner to detect events without the necessity of previously recorded ground-truth events. In 2011-2012 Leidos (then SAIC) operated a seismic network to monitor activity associated with geothermal field operations in western Nevada. We developed a new association approach for detecting and quantifying events by probabilistically combining pre-processed waveform data to deal with noisy data and clutter at local distance ranges. The ProbDet algorithm maps continuous waveform data into continuous conditional probability traces, using a source model (e.g. Brune earthquake or Mueller-Murphy explosion) to map frequency content and an attenuation model to map amplitudes. Event detection and classification are accomplished by combining the conditional probabilities from the entire network in a Bayesian formulation. This approach was successful in producing a high-Pd, low-Pfa automated bulletin for a local network, and preliminary tests with regional and teleseismic data show that it has promise for global seismic and nuclear monitoring applications. The approach highlights several features that we believe are essential to achieving low-threshold automated event detection: it minimizes the use of individual seismic phase detections (in traditional techniques, errors in signal detection, timing, feature measurement and initial phase identification compound and propagate into errors in event formation); it has a formalized framework that utilizes information from non-detecting stations; and it has a formalized framework that utilizes source information, in...
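
    A toy version of the fusion idea, assuming conditional independence across stations (a simplification of the paper's Bayesian formulation): per-station probability traces are combined in the log-odds domain, so stations reporting low probabilities, i.e. non-detections, pull the network trace down instead of being ignored:

```python
import numpy as np

def combine_station_probabilities(p, eps=1e-12):
    """p : array of shape (n_stations, n_times); each row is a station's
    conditional probability trace, already shifted to a common
    candidate origin time. Returns the fused network trace."""
    log_odds = np.log(p + eps) - np.log(1.0 - p + eps)
    network_log_odds = log_odds.sum(axis=0)
    return 1.0 / (1.0 + np.exp(-network_log_odds))
```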

  16. PreP+07: improvements of a user friendly tool to preprocess and analyse microarray data

    Directory of Open Access Journals (Sweden)

    Claros M Gonzalo

    2009-01-01

    Full Text Available Abstract Background Nowadays, microarray gene expression analysis is a widely used technology that scientists handle but whose final interpretation usually requires the participation of a specialist. The need for this participation is due to the requirement of some background in statistics that most users lack or have only a vague notion of. Moreover, programming skills can also be essential for analysing these data. An interactive, easy-to-use application therefore seems necessary to help researchers extract full information from their data and analyse them in a simple, powerful and confident way. Results PreP+07 is a standalone Windows XP application that presents a friendly interface for spot filtration, inter- and intra-slide normalization, duplicate resolution, dye-swapping, error removal and statistical analyses. Additionally, it contains two unique implementations of procedures (double scan and Supervised Lowess), a complete set of graphical representations (MA plot, RG plot, QQ plot, PP plot, PN plot) and can deal with many data formats, such as tabulated text, GenePix GPR and ArrayPRO. PreP+07 performance has been compared with the equivalent functions in Bioconductor using a tomato chip with 13056 spots. The numbers of differentially expressed genes based on p-values from PreP+07 and from the Bioconductor limma package were statistically identical when the data set was only normalized; however, a slight variability appeared when the data were both normalized and scaled. Conclusion The PreP+07 implementation provides a high degree of freedom in selecting and organizing a small set of widely used data processing protocols, and can handle many data formats. Its reliability has been proven, so that a laboratory researcher can perform statistical pre-processing of his/her microarray results and obtain a list of differentially expressed genes using PreP+07 without any programming skills. All of this gives support to scientists...
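
    The MA representation and the global lowess normalization behind several of these plots take only a few lines; the smoothing fraction is illustrative, and the package's Supervised Lowess variant is not reproduced here:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def ma_normalize(R, G, frac=0.3):
    """Two-colour array normalization in MA space.
    R, G : positive intensity arrays for the two channels."""
    M = np.log2(R) - np.log2(G)           # log-ratio
    A = 0.5 * (np.log2(R) + np.log2(G))   # mean log-intensity
    fit = lowess(M, A, frac=frac, return_sorted=False)
    return M - fit, A                     # normalized M and unchanged A
```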

  17. The Python Spectral Analysis Tool (PySAT) for Powerful, Flexible, and Easy Preprocessing and Machine Learning with Point Spectral Data

    Science.gov (United States)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T.; Morris, R. V.; Laura, J.

    2018-04-01

    The PySAT point spectra tool provides a flexible graphical interface, enabling scientists to apply a wide variety of preprocessing and machine learning methods to point spectral data, with an emphasis on multivariate regression.
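
    The kind of multivariate regression the tool emphasizes can be illustrated with partial least squares, a common choice for point spectra; the synthetic data and the use of scikit-learn rather than PySAT's own interface are assumptions:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: 60 spectra with 200 channels, composition driven
# by two channels plus noise.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))
y = 2.0 * X[:, 10] - 1.0 * X[:, 50] + rng.normal(scale=0.1, size=60)

pls = PLSRegression(n_components=5)
print(cross_val_score(pls, X, y, cv=5, scoring="r2").mean())
```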

  18. Microsoft Office Word 2007 step by step

    CERN Document Server

    Cox, Joyce

    2007-01-01

    Experience learning made easy, and quickly teach yourself how to create impressive documents with Word 2007. With Step By Step, you set the pace, building and practicing the skills you need, just when you need them! Apply styles and themes to your document for a polished look; add graphics and text effects, and see a live preview; organize information with new SmartArt diagrams and charts; insert references, footnotes, indexes, and a table of contents; send documents for review and manage revisions; turn your ideas into blogs, Web pages, and more. Your all-in-one learning experience includes: files for building sk...

  19. Saving can save from death anxiety: mortality salience and financial decision-making.

    Science.gov (United States)

    Zaleskiewicz, Tomasz; Gasiorowska, Agata; Kesebir, Pelin

    2013-01-01

    Four studies tested the idea that saving money can buffer death anxiety and constitute a more effective buffer than spending money. Saving can relieve future-related anxiety and provide people with a sense of control over their fate, thereby rendering death thoughts less threatening. Study 1 found that participants primed with both saving and spending reported lower death fear than controls. Saving primes, however, were associated with significantly lower death fear than spending primes. Study 2 demonstrated that mortality primes increase the attractiveness of more frugal behaviors in save-or-spend dilemmas. Studies 3 and 4 found, in two different cultures (Polish and American), that the activation of death thoughts prompts people to allocate money to saving as opposed to spending. Overall, these studies provided evidence that saving protects from existential anxiety, and probably more so than spending.

  20. Saving can save from death anxiety: mortality salience and financial decision-making.

    Directory of Open Access Journals (Sweden)

    Tomasz Zaleskiewicz

    Full Text Available Four studies tested the idea that saving money can buffer death anxiety and constitute a more effective buffer than spending money. Saving can relieve future-related anxiety and provide people with a sense of control over their fate, thereby rendering death thoughts less threatening. Study 1 found that participants primed with both saving and spending reported lower death fear than controls. Saving primes, however, were associated with significantly lower death fear than spending primes. Study 2 demonstrated that mortality primes increase the attractiveness of more frugal behaviors in save-or-spend dilemmas. Studies 3 and 4 found, in two different cultures (Polish and American), that the activation of death thoughts prompts people to allocate money to saving as opposed to spending. Overall, these studies provided evidence that saving protects from existential anxiety, and probably more so than spending.

  1. Electric energy savings from new technologies

    Energy Technology Data Exchange (ETDEWEB)

    Moe, R.J.; Harrer, B.J.; Kellogg, M.A.; Lyke, A.J.; Imhoff, K.L.; Fisher, Z.J.

    1986-01-01

    The purpose of the report is to provide OCEP with information about the electricity-saving potential of new technologies that it can use in developing alternative long-term projections of US electricity consumption. Low-, base-, and high-case scenarios of the electricity savings for ten technologies were prepared. The total projected annual savings for the year 2000 for all ten technologies were 137 billion kilowatt hours (BkWh), 279 BkWh, and 470 BkWh, respectively, for the three cases. The magnitude of these savings projections can be gauged by comparing them to the Department's reference-case projection for the 1985 National Energy Policy Plan, in which total consumption in 2000 is projected to be 3319 BkWh. Thus, the savings projected here represent between 4% and 14% of total consumption projected for 2000. Because approximately 75% of the base-case estimate of savings is already incorporated into the reference forecast, reducing projected electricity consumption from what it otherwise would have been, the savings estimated here should not be directly subtracted from the reference forecast.

  2. Energy conservation. Federal shared energy savings contracting

    International Nuclear Information System (INIS)

    Fultz, Keith O.; Milans, Flora H.; Kirk, Roy J.; Welker, Robert A.; Sparling, William J.; Butler, Sharon E.; Irwin, Susan W.

    1989-04-01

    A number of impediments have discouraged federal agencies from using shared energy savings contracts. As of November 30, 1988, only two federal agencies - the U.S. Postal Service (USPS) and the Department of the Army -had awarded such contracts even though they can yield significant energy and cost savings. The three major impediments we identified were uncertainty about the applicability of a particular procurement policy and practice, lack of management incentives, and difficulty in measuring energy and cost savings. To address the first impediment, the Department of Energy (DOE) developed a manual on shared energy savings contracting. The second impediment was addressed when the 100th Congress authorized incentives for federal agencies to enter into shared savings contracts. DOE addressed the third impediment by developing a methodology for calculating energy consumption and cost savings. However, because of differing methodological preferences, this issue will need to be addressed on a contract-by-contract basis. Some state governments and private sector firms are using performance contracts to reduce energy costs in their buildings and facilities. We were able to identify six states that were using performance contracts. Five have established programs, and all six states have projects under contract. The seven energy service companies we contacted indicated interest in federal shared energy savings contracting

  3. Survey of industrial radioisotope savings

    International Nuclear Information System (INIS)

    1965-01-01

    Only three decades after the discovery of artificial radioactivity and two after radioisotopes became available in quantity, methods employing these as sources or tracers have found widespread use, not only in scientific research, but also in industrial process and product control. The sums spent by industry on these new techniques amount to millions of dollars a year. Realizing the overall attitude of industry to scientific progress - to accept only methods that pay relatively quickly - one can assume that the economic benefits must be of a still larger order of magnitude. In order to determine the extent to which radioisotopes are in daily use and to evaluate the economic benefits derived from such use, IAEA decided to make an 'International Survey on the Use of Radioisotopes in Industry'. In 1962, the Agency invited a number of its highly industrialized Member States to participate in this Survey. Similar surveys had been performed in various countries in the 1950s. However, the approaches and also the definition of the economic benefits differed greatly from one survey to another. Hence, the Agency's approach was to try to persuade all countries to conduct surveys at the same time, concerning the same categories of industries and using the same terms of costs, savings, etc. In total, 24 Member States of the Agency agreed to participate in the survey and in due course they submitted contributions. The national reports were discussed at a 'Study Group Meeting on Radioisotope Economics', convened in Vienna in March 1964. Based upon these discussions, the national reports have been edited and summarized. A publication showing the administration of the Survey and providing all details is now published by the Agency. From the publication it is evident that in general the return of technical information was quite high, of the order of 90%, but, unfortunately, the economic response was much lower. However, most of the reports had some bearing on the economic aspects.

  4. Status of pre-processing of waste electrical and electronic equipment in Germany and its influence on the recovery of gold.

    Science.gov (United States)

    Chancerel, Perrine; Bolland, Til; Rotter, Vera Susanne

    2011-03-01

    Waste electrical and electronic equipment (WEEE) contains gold in low concentrations that are nevertheless relevant from an environmental and economic point of view. After collection, WEEE is pre-processed in order to generate appropriate material fractions that are sent to the subsequent end-processing stages (recovery, reuse or disposal). The goal of this research is to quantify the overall recovery rates of pre-processing technologies used in Germany for the reference year 2007. To achieve this goal, facilities operating in Germany were listed and classified according to the technology they apply. Information on their processing capacity was gathered by evaluating statistical databases. Based on a literature review of experimental results for gold recovery rates of different pre-processing technologies, the German overall recovery rate of gold at the pre-processing level was quantified depending on the characteristics of the treated WEEE. The results reveal that - depending on the equipment group - pre-processing recovery rates of gold of 29 to 61% are achieved in Germany. Some practical recommendations to reduce the losses during pre-processing could be formulated. Defining mass-based recovery targets in the legislation does not set incentives to recover trace elements. Instead, the priorities for recycling could be defined based on other parameters, such as the environmental impacts of the materials. The implementation of measures to reduce gold losses would also improve the recovery of several other non-ferrous metals, such as tin, nickel, and palladium.
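
    The overall recovery rate reported above is, in essence, a capacity-weighted average of technology-specific recovery rates. A minimal sketch of that calculation (technology names, market shares and per-technology recovery rates are hypothetical, not the study's data):

        # (technology, share of treated WEEE mass, gold recovery rate) - illustrative only.
        technologies = [
            ("manual dismantling with targeted PCB removal", 0.30, 0.85),
            ("shredding and mechanical sorting",             0.55, 0.40),
            ("mixed/other",                                  0.15, 0.25),
        ]

        overall = sum(share * rate for _, share, rate in technologies)
        print(f"Overall gold recovery at pre-processing: {overall:.0%}")
        # With these illustrative inputs the result (~51%) falls inside the
        # 29-61% range reported for Germany.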

  5. Freshwater savings from marine protein consumption

    International Nuclear Information System (INIS)

    Gephart, Jessica A; Pace, Michael L; D’Odorico, Paolo

    2014-01-01

    Marine fisheries provide an essential source of protein for many people around the world. Unlike alternative terrestrial sources of protein, marine fish production requires little to no freshwater inputs. Consuming marine fish protein instead of terrestrial protein therefore represents freshwater savings (equivalent to an avoided water cost) and contributes to a low water footprint diet. These water savings are realized by the producers of alternative protein sources, rather than the consumers of marine protein. This study quantifies freshwater savings from marine fish consumption around the world by estimating the water footprint of replacing marine fish with terrestrial protein based on current consumption patterns. An estimated 7600 km³ yr⁻¹ of water is used for human food production. Replacing marine protein with terrestrial protein would require an additional 350 km³ yr⁻¹ of water, meaning that marine protein provides current water savings of 4.6%. The importance of these freshwater savings is highly uneven around the globe, with savings ranging from as little as 0 to as much as 50%. The largest savings as a per cent of current water footprints occur in Asia, Oceania, and several coastal African nations. The greatest national water savings from marine fish protein occur in Southeast Asia and the United States. As the human population increases, future water savings from marine fish consumption will be increasingly important to food and water security and depend on sustainable harvest of capture fisheries and low water footprint growth of marine aquaculture.
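
    The headline 4.6% is a simple ratio of the abstract's two volumes:

        # Water volumes in km^3 per year, taken from the abstract.
        current_food_water = 7600  # total water used for human food production
        replacement_water = 350    # extra water needed to replace marine protein

        savings_share = replacement_water / current_food_water
        print(f"Current water savings from marine protein: {savings_share:.1%}")  # ~4.6%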

  6. Principles of valuing business travel time savings

    OpenAIRE

    Fowkes, A.S.

    2001-01-01

    OVERVIEW: There are two approaches to valuing travel time savings to business people. The first is that which has formed the basis of UK policy for about 30 years, and which is set out in Section 2. This takes the value of travel time savings on employer’s business as equal to the gross wage rate plus an allowance for other costs that the employer saves. These might include such things as desk space, computer, tools, uniform, protective clothing, travel expenses. These were investigated...

  7. Who's in the business of saving lives?

    Science.gov (United States)

    Lee Chang, Pepe

    2006-10-01

    There are individuals, including children, dying needlessly in poverty-stricken third world countries. Many of these deaths could be prevented if pharmaceutical companies provided the drugs needed to save their lives. Some believe that because pharmaceutical companies have the power to save lives, and because they can do so with little effort, they have a special obligation. I argue that there is no distinction, with respect to obligations and responsibilities, between pharmaceutical companies and other types of companies. As a result, to hold pharmaceutical companies especially responsible for saving lives in third world countries is unjustified.

  8. Household water saving: Evidence from Spain

    Science.gov (United States)

    Aisa, Rosa; Larramona, Gemma

    2012-12-01

    This article focuses on household water use in Spain by analyzing the influence of a detailed set of factors. We find that, although the presence of both water-saving equipment and water-conservation habits leads to water savings, the factors that influence each are not the same. In particular, our results show that those individuals most committed to the adoption of water-saving equipment and, at the same time, less committed to water-conservation habits tend to have higher incomes.

  9. Must losing taxes on saving be harmful?

    DEFF Research Database (Denmark)

    Huizinga, Harry; Nielsen, Søren Bo

    2004-01-01

    on account of international tax evasion may prevent the overall saving-investment tax wedge from becoming too high, and hence may be beneficial for moderate preferences for public goods. A world with 'high-spending' governments, in contrast, is made worse off by the loss of saving taxes, and hence stands...... are financed by taxes on saving and investment. There is international cross-ownership of firms, and countries are assumed to be unable to tax away pure profits. Countries then face an incentive to impose a rather high investment tax also borne by foreigners. In this setting, the loss of the saving tax instrument...

  10. 12 CFR 583.20 - Savings and loan holding company.

    Science.gov (United States)

    2010-01-01

    12 CFR 583.20, Regulations Affecting Savings and Loan Holding Companies (Banks and Banking): The term savings and loan holding company means any company that directly or indirectly controls a savings...

  11. Small Town Energy Program (STEP) Final Report revised

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, Charles (Chuck) T.

    2014-01-02

    University Park, Maryland (“UP”) is a small town of 2,540 residents, 919 homes, 2 churches, 1 school, 1 town hall, and 1 breakthrough community energy efficiency initiative: the Small Town Energy Program (“STEP”). STEP was developed with a mission to “create a model community energy transformation program that serves as a roadmap for other small towns across the U.S.” STEP first launched in January 2011 in UP and expanded in July 2012 to the neighboring communities of Hyattsville, Riverdale Park, and College Heights Estates, MD. STEP, which concluded in July 2013, was generously supported by a grant from the U.S. Department of Energy (DOE). The STEP model was designed for replication in other resource-constrained small towns similar to University Park - a sector largely neglected to date in federal and state energy efficiency programs. STEP provided a full suite of activities for replication, including: energy audits and retrofits for residential buildings, financial incentives, a community-based social marketing backbone and local community delivery partners. STEP also included the highly innovative use of an “Energy Coach” who worked one-on-one with clients throughout the program. Please see www.smalltownenergy.org for more information. In less than three years, STEP achieved the following results in University Park: • 30% of community households participated voluntarily in STEP; • 25% of homes received a Home Performance with ENERGY STAR assessment; • 16% of households made energy efficiency improvements to their home; • 64% of households proceeded with an upgrade after their assessment; • 9 Full Time Equivalent jobs were created or retained, and 39 contractors worked on STEP over the course of the project. Estimated energy savings (program totals): 204,407 kWh of electricity; 24,800 therms of natural gas; 2,581 gallons of oil; 5,474 total estimated MMBTU saved (source energy); $61,343 total estimated annual energy cost savings. STEP clients who
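
    Source-energy totals such as the 5,474 MMBTU above are obtained by converting each fuel to Btu and applying a site-to-source multiplier to electricity. A rough sketch (the conversion factors are common rule-of-thumb values, not necessarily the ones STEP used):

        # Program totals from the abstract.
        kwh_electricity = 204_407
        therms_gas = 24_800
        gallons_oil = 2_581

        # Rule-of-thumb conversion factors (assumptions, not STEP's exact values).
        BTU_PER_KWH = 3_412
        ELEC_SOURCE_MULTIPLIER = 3.15   # site-to-source for grid electricity
        BTU_PER_THERM = 100_000
        BTU_PER_GALLON_OIL = 138_690

        mmbtu = (kwh_electricity * BTU_PER_KWH * ELEC_SOURCE_MULTIPLIER
                 + therms_gas * BTU_PER_THERM
                 + gallons_oil * BTU_PER_GALLON_OIL) / 1e6
        print(f"Estimated source MMBTU saved: {mmbtu:,.0f}")  # same ballpark as the 5,474 reported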

  12. Estimating customer electricity savings from projects installed by the U.S. ESCO industry

    Energy Technology Data Exchange (ETDEWEB)

    Carvallo, Juan Pablo [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Larsen, Peter H. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States); Goldman, Charles A. [Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)

    2014-11-25

    The U.S. energy service company (ESCO) industry has a well-established track record of delivering substantial energy and dollar savings in the public and institutional facilities sector, typically through the use of energy savings performance contracts (ESPC) (Larsen et al. 2012; Goldman et al. 2005; Hopper et al. 2005, Stuart et al. 2013). This ~$6.4 billion industry, which is expected to grow significantly over the next five years, may play an important role in achieving demand-side energy efficiency under local/state/federal environmental policy goals. To date, there has been little or no research in the public domain to estimate electricity savings for the entire U.S. ESCO industry. Estimating these savings levels is a foundational step in order to determine total avoided greenhouse gas (GHG) emissions from demand-side energy efficiency measures installed by U.S. ESCOs. We introduce a method to estimate the total amount of electricity saved by projects implemented by the U.S. ESCO industry using the Lawrence Berkeley National Laboratory (LBNL) /National Association of Energy Service Companies (NAESCO) database of projects and LBNL’s biennial industry survey. We report two metrics: incremental electricity savings and savings from ESCO projects that are active in a given year (e.g., 2012). Overall, we estimate that in 2012 active U.S. ESCO industry projects generated about 34 TWh of electricity savings—15 TWh of these electricity savings were for MUSH market customers who did not rely on utility customer-funded energy efficiency programs (see Figure 1). This analysis shows that almost two-thirds of 2012 electricity savings in municipal, local and state government facilities, universities/colleges, K-12 schools, and healthcare facilities (i.e., the so-called “MUSH” market) were not supported by a utility customer-funded energy efficiency program.
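
    The two reported metrics - incremental savings and savings from projects active in a given year - can be illustrated with a toy version of the project database (the record fields and numbers are hypothetical, not the LBNL/NAESCO schema):

        # Toy ESCO project records: (install_year, expected_lifetime_years, annual_GWh_saved).
        projects = [
            (2005, 15, 1.2),
            (2010, 10, 0.8),
            (2012, 12, 2.5),
        ]

        def active_savings(year):
            """Annual savings from all projects still within their lifetime in `year`."""
            return sum(gwh for start, life, gwh in projects if start <= year < start + life)

        def incremental_savings(year):
            """Savings from projects installed in `year` alone."""
            return sum(gwh for start, _, gwh in projects if start == year)

        print(active_savings(2012), incremental_savings(2012))  # 4.5 GWh vs 2.5 GWh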

  13. Current breathomics-a review on data pre-processing techniques and machine learning in metabolomics breath analysis

    DEFF Research Database (Denmark)

    Smolinska, A.; Hauschild, A. C.; Fijten, R. R. R.

    2014-01-01

    been extensively developed. Yet, the application of machine learning methods for fingerprinting VOC profiles in the breathomics is still in its infancy. Therefore, in this paper, we describe the current state of the art in data pre-processing and multivariate analysis of breathomics data. We start...... different conditions (e.g. disease stage, treatment). Independently of the utilized analytical method, the most important question, 'which VOCs are discriminatory?', remains the same. Answers can be given by several modern machine learning techniques (multivariate statistics) and, therefore, are the focus...

  14. Predicting prices of agricultural commodities in Thailand using combined approach emphasizing on data pre-processing technique

    Directory of Open Access Journals (Sweden)

    Thoranin Sujjaviriyasup

    2018-02-01

    Full Text Available In this research, a combined approach emphasizing a data pre-processing technique is developed to forecast prices of agricultural commodities in Thailand. Future prices play a significant role in decisions about which crops to cultivate in the coming year. The proposed model exploits the ability of MODWT to decompose the original time series into more stable and explicit subseries, and an SVR model to formulate the complex forecasting function. The experimental results indicated that the proposed model outperforms traditional forecasting models based on MAE and MAPE criteria. Furthermore, the results show that the proposed model can serve as a useful forecasting tool for prices of agricultural commodities in Thailand.
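
    A minimal sketch of the decompose-then-forecast recipe, using PyWavelets' additive multiresolution analysis as a stand-in for MODWT (assumes a recent PyWavelets with pywt.mra; the toy series, wavelet choice and SVR settings are illustrative, not the paper's):

        import numpy as np
        import pywt
        from sklearn.svm import SVR

        prices = np.sin(np.linspace(0, 20, 256)) + 0.1 * np.random.randn(256)  # toy series

        # Additive components that sum back to the original series.
        components = pywt.mra(prices, wavelet="db4", level=3, transform="swt")

        def forecast_next(series, lags=12):
            """Fit an SVR on lagged windows of one subseries, predict one step ahead."""
            X = np.array([series[i - lags:i] for i in range(lags, len(series))])
            y = series[lags:]
            model = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
            return model.predict(series[-lags:].reshape(1, -1))[0]

        # Because the components are additive, the combined forecast is their sum.
        prediction = sum(forecast_next(c) for c in components)
        print(f"One-step-ahead forecast: {prediction:.3f}")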

  15. User behaviour impact on energy savings potential

    DEFF Research Database (Denmark)

    Rose, Jørgen

    2014-01-01

    and the residents' behaviour and if these defaults do not reflect actual circumstances, it can result in non-realisation of expected energy savings. Furthermore, a risk also exists that residents' behaviour change after the energy upgrading, e.g. to obtain improved comfort than what was possible before......, 3) Domestic hot water consumption and 4) Air change rate. Based on the analysis, a methodology is established that can be used to make more realistic and accurate predictions of expected energy savings associated with energy upgrading taking into account user behaviour....... the upgrading and this could lead to further discrepancies between the calculated and the actual energy savings. This paper presents an analysis on how residents’ behaviour and the use of standard assumptions may influence expected energy savings. The analysis is performed on two typical single-family houses...

  16. Maintaining a Viable Energy Savings Performance Contract

    National Research Council Canada - National Science Library

    Weber, Katherine L; Huckeby, Michael A

    2005-01-01

    Substantial amounts of information are available on Energy Savings Performance Contract award requirements, measurement, and verification, but we have found very little information on the day-to-day...

  17. Potential Logistics Cost Savings from Engine Commonality

    National Research Council Canada - National Science Library

    Henderson, Robert L; Higer, Matthew W

    2007-01-01

    The purpose of this MBA Project is to determine potential logistics cost savings the USAF and DoD could have realized through the life of the F-16 fighter aircraft had they required engine commonality...

  18. Radioisotope savings in industry and agriculture

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1964-01-15

    Benefits and savings achieved in industry and agriculture were described by leading experts from six different countries at a public discussion organized by the Agency on 24 September 1963, during the last IAEA General Conference.

  19. Site preparation savings through better utilization standards

    Science.gov (United States)

    W.F. Watson; B.J. Stokes; I.W. Savelle

    1984-01-01

    This paper reports preliminary results of a study to determine the savings in site preparation costs that can be achieved through intensive utilization of understory biomass. Mechanized systems can potentially be used for recovering this material.

  20. Caustic saving potential in textile processing mills

    International Nuclear Information System (INIS)

    Latif, M.; Rehman, A.; Ghafar, A.; Hafeez, N.M.

    2010-01-01

    The textile processing industry of Pakistan has great potential for improving resource consumption in various production processes. One major concern is the heavy usage of caustic soda (sodium hydroxide), especially during the mercerization process, which incurs a significant cost to a textile processing mill. To reduce the unit fabric production cost and stay competitive, the industry needs to minimize caustic wastage and explore the caustic saving potential. This paper describes detailed caustic consumption practices and saving potentials in the woven textile sector, based on a database of 100 industries. Region-wise caustic saving potential is also investigated. Three caustic conservation options, including process improvement, reuse and recycling, and caustic recovery plants, are discussed. Detailed technical and financial requirements, saving potentials and paybacks of these options are provided. (author)
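
    Payback figures for options like these come from a simple ratio of investment to annual savings; a minimal sketch with hypothetical costs (not the paper's data):

        # Hypothetical investment costs and annual caustic-cost savings, in million PKR.
        options = {
            "process improvement":    {"investment": 0.5,  "annual_saving": 1.2},
            "reuse and recycling":    {"investment": 2.0,  "annual_saving": 1.5},
            "caustic recovery plant": {"investment": 15.0, "annual_saving": 4.0},
        }

        for name, o in options.items():
            payback_years = o["investment"] / o["annual_saving"]
            print(f"{name}: simple payback = {payback_years:.1f} years")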

  1. Glaucoma: Screening Can Save Your Sight!

    Science.gov (United States)

    Feature: Glaucoma: Screening Can Save Your Sight! (Past Issues, Fall 2009) People with glaucoma see the world through a tunnel. Glaucoma is ...

  2. Statistical Uncertainty in the Medicare Shared Savings...

    Data.gov (United States)

    U.S. Department of Health & Human Services — According to analysis reported in Statistical Uncertainty in the Medicare Shared Savings Program published in Volume 2, Issue 4 of the Medicare and Medicaid Research...

  3. The School Advanced Ventilation Engineering Software (SAVES)

    Science.gov (United States)

    The School Advanced Ventilation Engineering Software (SAVES) package is a tool to help school designers assess the potential financial payback and indoor humidity control benefits of Energy Recovery Ventilation (ERV) systems for school applications.

  4. 31 CFR 359.35 - May I purchase definitive Series I savings bonds through a payroll savings plan?

    Science.gov (United States)

    2010-07-01

    31 CFR 359.35 (Money and Finance: Treasury) - May I purchase definitive Series I savings bonds through a payroll savings plan? You may purchase definitive bonds through deductions from your pay if your employer maintains a payroll savings plan. An...

  5. 31 CFR 351.47 - May I purchase definitive Series EE savings bonds through a payroll savings plan?

    Science.gov (United States)

    2010-07-01

    31 CFR 351.47 (Money and Finance: Treasury) - May I purchase definitive Series EE savings bonds through a payroll savings plan? You may purchase definitive bonds through deductions from your pay if your employer maintains a payroll savings plan. An authorized issuing agent must issue the bonds.

  6. The value of business travel time savings

    OpenAIRE

    Fowkes, A.S.; Marks, P.; Nash, C.A.

    1986-01-01

    The value of time savings for business travellers forms a sizeable part of the benefits from trunk road, rail and air transport improvement schemes. It is therefore important to possess appropriate values to place on business travel time savings for evaluation purposes. The normal approach in practice is to adopt the wage rate of the workers in question plus an increment for overheads and non-wage payments. In this paper criticisms of this approach are discussed and the implications of ...

  7. Potential energy savings and thermal comfort

    DEFF Research Database (Denmark)

    Jensen, Karsten Ingerslev; Rudbeck, Claus Christian; Schultz, Jørgen Munthe

    1996-01-01

    The simulation results on the energy saving potential and influence on indoor thermal comfort by replacement of common windows with aerogel windows as well as commercial low-energy windows are described and analysed.

  8. Risk transfer via energy savings insurance

    OpenAIRE

    Mills, Evan

    2001-01-01

    Among the key barriers to investment in energy efficiency improvements are uncertainties about attaining projected energy savings and apprehension about potential disputes over these savings. The fields of energy management and risk management are thus intertwined. While many technical methods have emerged to manage performance risks (e.g. building commissioning), financial risk transfer techniques are less developed in the energy management arena than in other more mature segments of t...

  9. Individual savings accounts for social insurance

    DEFF Research Database (Denmark)

    Bovenberg, Lans; Hansen, Martin Ino; Sørensen, Peter Birch

    2008-01-01

    Using Danish data, we find that about three-fourths of the taxes levied to finance public transfers actually finance benefits that redistribute income over the life cycle of individual taxpayers rather than redistribute resources across people. This finding and similar results for other countries...... provide a rationale for financing part of social insurance via mandatory individual savings accounts. We discuss the advantages and disadvantages of mandatory individual savings accounts for social insurance and survey some recent alternative proposals for such accounts...

  10. Creation of Carbon Credits by Water Saving

    Directory of Open Access Journals (Sweden)

    Yasutoshi Shimizu

    2012-07-01

    Full Text Available Until now, as a way of reducing greenhouse gas emissions from Japanese homes, the emphasis has been on reduction of energy consumption for air-conditioning and lighting. In recent years, there has been progress in CO2 emission reduction through research into the water-saving performance of bathroom fixtures such as toilets and showers. Simulations have shown that CO2 emissions associated with water consumption in Japanese homes can be reduced by 25% (1% of Japan’s total CO2 emissions) by 2020 through the adoption of water-saving fixtures. In response to this finding, a program to promote the replacement of current fixtures with water-saving toilet bowls and thermally insulated bathtubs has been added to the Government of Japan’s energy-saving policy. Furthermore, CO2 emission reduction through widespread use of water-saving fixtures has been adopted by the domestic credit system promoted by the Government of Japan as a way of achieving CO2 emission-reduction targets; application of this credit system has also begun. As part of a bilateral offset credit mechanism promoted by the Government of Japan, research to evaluate the CO2 reduction potential of the adoption of water-saving fixtures has been done in the city of Dalian, in China.

  11. Refrigeration: Introducing energy saving opportunities for business

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-04-15

    In some industries, most notably food and drink and chemicals, refrigeration accounts for a significant proportion of overall site energy costs. For instance, in the industrial handling of meat, poultry and fish, it often accounts for 50% of total energy costs. In ice-cream production the proportion is 70%. In a number of commercial sectors, refrigeration also represents a significant proportion of overall energy costs. For example: cold storage 90%; food supermarkets 50%; small shops with refrigerated cabinets 70% or over; pubs and clubs 30%. Against these high costs, even a small reduction in refrigeration energy use can offer significant cost savings, resulting in increased profits. Energy saving need not be expensive. Energy savings of up to 20% can be realised in many refrigeration plants through actions that require little or no investment. In addition, improving the efficiency and reducing the load on a refrigeration plant can improve reliability and reduce the likelihood of a breakdown. Most organisations can save energy and money on refrigeration through more efficient equipment, good maintenance, and housekeeping and control. This publication provides an understanding of the operation of refrigeration systems, identifies where savings can be realised and will enable readers to present an informed case on energy savings to key decision makers within their organisation. (GB)

  12. RECRUITMENT FINANCED BY SAVED LEAVE (RSL PROGRAMME)

    CERN Multimedia

    Division du Personnel; Tel. 73903

    1999-01-01

    Transfer to the saved leave account and saved leave bonus. Staff members participating in the RSL programme may opt to transfer up to 10 days of unused annual leave or unused compensatory leave into their saved leave account, at the end of the leave year, i.e. 30 September (as set out in the implementation procedure dated 27 August 1997). A leave transfer request form, which you should complete, sign and return if you wish to use this possibility, has been addressed to you. To allow the necessary time for the processing of your request, you should return it without delay. As foreseen in the implementation procedure, an additional day of saved leave will be granted for each full period of 20 days remaining in the saved leave account on 31 December 1999, for any staff member participating in the RSL programme until that date. For part-time staff members participating in the RSL programme, the above-mentioned days of leave (annual, compensatory and saved) are adjusted proportionally to their contractual working week as...

  13. Saving-enhanced memory: the benefits of saving on the learning and remembering of new information.

    Science.gov (United States)

    Storm, Benjamin C; Stone, Sean M

    2015-02-01

    With the continued integration of technology into people's lives, saving digital information has become an everyday facet of human behavior. In the present research, we examined the consequences of saving certain information on the ability to learn and remember other information. Results from three experiments showed that saving one file before studying a new file significantly improved memory for the contents of the new file. Notably, this effect was not observed when the saving process was deemed unreliable or when the contents of the to-be-saved file were not substantial enough to interfere with memory for the new file. These results suggest that saving provides a means to strategically off-load memory onto the environment in order to reduce the extent to which currently unneeded to-be-remembered information interferes with the learning and remembering of other information. © The Author(s) 2014.

  14. Focal cryotherapy: step by step technique description

    Directory of Open Access Journals (Sweden)

    Cristina Redondo

    Full Text Available ABSTRACT Introduction and objective: Focal cryotherapy emerged as an efficient option to treat favorable and localized prostate cancer (PCa). The purpose of this video is to describe the procedure step by step. Materials and methods: We present the case of a 68 year-old man with localized PCa in the anterior aspect of the prostate. Results: The procedure is performed under general anesthesia, with the patient in lithotomy position. Briefly, the equipment utilized includes the cryotherapy console coupled with an ultrasound system, argon and helium gas bottles, cryoprobes, temperature probes and a urethral warming catheter. The procedure starts with a real-time trans-rectal prostate ultrasound, which is used to outline the prostate, the urethra and the rectal wall. The cryoprobes are pretested and placed into the prostate through the perineum, following a grid template, along with the temperature sensors under ultrasound guidance. A cystoscopy confirms the correct positioning of the needles, and the urethral warming catheter is installed. Thereafter, the freeze sequence with argon gas is started, achieving extremely low temperatures (-40°C) to induce tumor cell lysis. Sequentially, the thawing cycle is performed using helium gas. This process is repeated once. Results among several series showed a biochemical disease-free survival between 71-93% at 9-70 months of follow-up, incontinence rates between 0-3.6% and erectile dysfunction between 0-42% (1-5). Conclusions: Focal cryotherapy is a feasible procedure to treat anterior PCa that may offer minimal morbidity, allowing good cancer control and better functional outcomes when compared to whole-gland treatment.

  15. Quality assessment of baby food made of different pre-processed organic raw materials under industrial processing conditions.

    Science.gov (United States)

    Seidel, Kathrin; Kahl, Johannes; Paoletti, Flavio; Birlouez, Ines; Busscher, Nicolaas; Kretzschmar, Ursula; Särkkä-Tirkkonen, Marjo; Seljåsen, Randi; Sinesio, Fiorella; Torp, Torfinn; Baiamonte, Irene

    2015-02-01

    The market for processed food is rapidly growing. The industry needs methods for "processing with care" leading to high quality products in order to meet consumers' expectations. Processing influences the quality of the finished product through various factors. In carrot baby food, these are the raw material, the pre-processing and storage treatments as well as the processing conditions. In this study, a quality assessment was performed on baby food made from different pre-processed raw materials. The experiments were carried out under industrial conditions using fresh, frozen and stored organic carrots as raw material. Statistically significant differences were found for sensory attributes among the three autoclaved puree samples (e.g. overall odour F = 90.72, p processed from frozen carrots show increased moisture content and decrease of several chemical constituents. Biocrystallization identified changes between replications of the cooking. Pre-treatment of raw material has a significant influence on the final quality of the baby food.

  16. An efficient depth map preprocessing method based on structure-aided domain transform smoothing for 3D view generation.

    Directory of Open Access Journals (Sweden)

    Wei Liu

    Full Text Available Depth image-based rendering (DIBR), which is used to render virtual views with a color image and the corresponding depth map, is one of the key techniques in the 2D to 3D conversion process. Due to the absence of knowledge about the 3D structure of a scene and its corresponding texture, DIBR in the 2D to 3D conversion process inevitably leads to holes in the resulting 3D image as a result of newly-exposed areas. In this paper, we propose a structure-aided depth map preprocessing framework in the transformed domain, inspired by the recently proposed domain transform for its low complexity and high efficiency. Firstly, our framework integrates hybrid constraints, including scene structure, edge consistency and visual saliency information, in the transformed domain to implicitly improve the performance of depth map preprocessing. Then, adaptive smoothing localization is incorporated into the proposed framework to further reduce over-smoothing and enhance optimization in the non-hole regions. Unlike other similar methods, the proposed method can simultaneously achieve the effects of hole filling, edge correction and local smoothing for typical depth maps in a unified framework. Thanks to these advantages, it can yield visually satisfactory results with less computational complexity for high quality 2D to 3D conversion. Numerical experimental results demonstrate the excellent performance of the proposed method.
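
    The domain transform at the heart of such frameworks can be sketched in one dimension: sample-to-sample distances are inflated wherever the guide signal has strong gradients, and a recursive filter then smooths only within small transformed distances, so smoothing stops at depth edges. The sketch below is a simplified illustration of that core idea only (it omits the structure, edge-consistency and saliency constraints of the proposed framework):

        import numpy as np

        def domain_transform_smooth_1d(signal, guide, sigma_s=30.0, sigma_r=0.1, iterations=3):
            """Edge-aware 1D recursive smoothing in the transformed domain (simplified)."""
            # Transformed-domain distances: big guide gradients -> big distances.
            dx = 1.0 + (sigma_s / sigma_r) * np.abs(np.diff(guide.astype(float)))
            out = signal.astype(float).copy()
            for it in range(iterations):
                # Shrink the filter support at each iteration, as in Gastal & Oliveira (2011).
                sigma_i = sigma_s * np.sqrt(3) * 2 ** (iterations - it - 1) / np.sqrt(4 ** iterations - 1)
                ad = np.exp(-np.sqrt(2) / sigma_i) ** dx   # per-gap feedback coefficient
                for i in range(1, len(out)):               # left-to-right pass
                    out[i] += ad[i - 1] * (out[i - 1] - out[i])
                for i in range(len(out) - 2, -1, -1):      # right-to-left pass
                    out[i] += ad[i] * (out[i + 1] - out[i])
            return out

        depth_row = np.concatenate([np.full(40, 2.0), np.full(40, 5.0)])  # step edge
        noisy = depth_row + 0.05 * np.random.randn(80)
        smoothed = domain_transform_smooth_1d(noisy, guide=depth_row)  # noise removed, edge kept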

  17. Web Log Pre-processing and Analysis for Generation of Learning Profiles in Adaptive E-learning

    Directory of Open Access Journals (Sweden)

    Radhika M. Pai

    2016-03-01

    Full Text Available Adaptive E-learning Systems (AESs) enhance the efficiency of online courses in education by providing personalized content and user interfaces that change according to learners' requirements and usage patterns. This paper presents an approach to generating a learning profile for each learner, which helps to identify learning styles and provide an Adaptive User Interface that includes adaptive learning components and learning material. The proposed method analyzes the captured web usage data to identify the learning profile of the learners. The learning profiles are identified by an algorithmic approach that is based on the frequency of accessing the materials and the time spent on the various learning components on the portal. The captured log data is pre-processed and converted into a standard XML format to generate learners' sequence data corresponding to the different sessions and time spent. The learning style model adopted in this approach is the Felder-Silverman Learning Style Model (FSLSM). This paper also presents the analysis of learners' activities, preprocessed XML files and generated sequences.
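
    The pre-processing step described above - turning raw access logs into per-learner session sequences - can be sketched as follows (the log format, field order and 30-minute session gap are assumptions for illustration):

        from collections import defaultdict
        from datetime import datetime

        SESSION_GAP_S = 30 * 60  # inactivity that starts a new session (assumption)

        # Toy log lines: learner id, ISO timestamp, learning component accessed.
        raw_log = [
            "alice,2016-03-01T10:00:00,video_lecture",
            "alice,2016-03-01T10:12:00,quiz",
            "alice,2016-03-01T11:30:00,forum",
            "bob,2016-03-01T10:05:00,slides",
        ]

        sessions = defaultdict(list)  # learner -> list of sessions
        last_seen = {}
        for line in raw_log:
            learner, ts, component = line.split(",")
            t = datetime.fromisoformat(ts)
            gap = (t - last_seen[learner]).total_seconds() if learner in last_seen else None
            if gap is None or gap > SESSION_GAP_S:
                sessions[learner].append([])  # start a new session
            sessions[learner][-1].append(component)
            last_seen[learner] = t

        # Emit a simple XML sequence per learner.
        for learner, sess_list in sessions.items():
            events = "".join(
                "<session>" + "".join(f'<event component="{c}"/>' for c in s) + "</session>"
                for s in sess_list)
            print(f'<learner id="{learner}">{events}</learner>')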

  18. Research of high speed data readout and pre-processing system based on xTCA for silicon pixel detector

    International Nuclear Information System (INIS)

    Zhao Jingzhou; Lin Haichuan; Guo Fang; Liu Zhen'an; Xu Hao; Gong Wenxuan; Liu Zhao

    2012-01-01

    With the development of detector technology, silicon pixel detectors have been widely used in high energy physics experiments. Reading out silicon pixel detectors, which generate very large data volumes, requires a data processing system with high speed, high bandwidth and high availability. The same challenge arises for the Belle II Pixel Detector, a new type of silicon pixel detector used in the high-luminosity SuperKEKB accelerator. This paper describes research on a high-speed data readout and pre-processing system for silicon pixel detectors based on xTCA. The system consists of High Performance Computing Nodes (HPCNs) housed in an ATCA frame. Each HPCN consists of 4 XFPs based on AMC, 1 AMC Carrier ATCA Board (ACAB) and 1 Rear Transmission Module, and is characterized by 5 high-performance FPGAs, 16 fiber links based on RocketIO, 5 Gbit Ethernet ports and DDR2 memory with a capacity of up to 18 GB. In one ATCA frame, 14 HPCNs form a system that uses the high-speed backplane to perform data pre-processing and triggering. This system will be used in the trigger and data acquisition system of the Belle II Pixel Detector. (authors)

  19. Integrated fMRI Preprocessing Framework Using Extended Kalman Filter for Estimation of Slice-Wise Motion

    Directory of Open Access Journals (Sweden)

    Basile Pinsard

    2018-04-01

    Full Text Available Functional MRI acquisition is sensitive to subjects' motion, which cannot be fully constrained. Therefore, signal corrections have to be applied a posteriori in order to mitigate the complex interactions between changing tissue localization and magnetic fields, gradients and readouts. To circumvent the limitations of current preprocessing strategies, we developed an integrated method that corrects motion and spatial low-frequency intensity fluctuations at the level of each slice, in order to better fit the acquisition process. The registration of single or multiple simultaneously acquired slices is achieved online by an Iterated Extended Kalman Filter, favoring the robust estimation of continuous motion, while an intensity bias field is non-parametrically fitted. The proposed extraction of gray-matter BOLD activity from the acquisition space to an anatomical group template space, taking into account distortions, better preserves fine-scale patterns of activity. Importantly, the proposed unified framework generalizes to high-resolution multi-slice techniques. When tested on simulated and real data, the method shows a reduction of motion-explained variance and signal variability when compared to the conventional preprocessing approach. These improvements provide more stable patterns of activity, facilitating investigation of cerebral information representation in healthy and/or clinical populations where motion is known to impact fine-scale data.
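
    At its core, slice-wise motion estimation with a Kalman-type filter treats the six rigid-body parameters as a slowly drifting hidden state that is updated as each slice (or group of simultaneously acquired slices) arrives. The sketch below is a heavily simplified linear-Kalman illustration of that state update; the actual method iterates the linearization of a slice-to-volume registration, which is not shown:

        import numpy as np

        n = 6  # rigid-body motion parameters: 3 translations + 3 rotations
        x = np.zeros(n)        # state estimate
        P = np.eye(n) * 1e-2   # state covariance
        Q = np.eye(n) * 1e-5   # random-walk process noise (motion drifts slowly)
        R = np.eye(n) * 1e-3   # noise of the per-slice registration measurement

        def kalman_step(x, P, z):
            """One predict/update cycle; z is the motion measured for the new slice."""
            P_pred = P + Q                              # predict under a random-walk model
            K = P_pred @ np.linalg.inv(P_pred + R)      # Kalman gain (H = identity here)
            x_new = x + K @ (z - x)                     # blend prediction and measurement
            P_new = (np.eye(n) - K) @ P_pred
            return x_new, P_new

        for t in range(100):                            # simulated stream of slice measurements
            z = 0.001 * t + 0.01 * np.random.randn(n)   # slow drift plus noise
            x, P = kalman_step(x, P, z)
        print("Smoothed motion estimate:", np.round(x, 3))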

  1. Combined data preprocessing and multivariate statistical analysis characterizes fed-batch culture of mouse hybridoma cells for rational medium design.

    Science.gov (United States)

    Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup

    2010-10-01

    We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis techniques. Initially, specific rates of cell growth, glucose/amino acid consumption and mAb/metabolite production were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine, (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine, and (iii) lysine, valine and isoleucine. Further analysis using partial least squares (PLS) regression identified key amino acids which were positively or negatively correlated with the cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
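
    The multivariate part of the workflow maps onto standard tooling; a minimal sketch with scikit-learn (matrix shapes and variable names are hypothetical - rows are culture time points, columns are specific consumption/production rates):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.normal(size=(20, 18))  # 20 time points x 18 amino acid rates (toy data)
        y = rng.normal(size=(20, 1))   # e.g. specific mAb production rate (toy data)

        Xs = StandardScaler().fit_transform(X)

        # PCA: amino acids with similar consumption profiles cluster in loading space.
        pca = PCA(n_components=3).fit(Xs)
        loadings = pca.components_.T   # one row of loadings per amino acid

        # PLS: amino acids most positively/negatively correlated with the response.
        pls = PLSRegression(n_components=2).fit(Xs, y)
        weights = pls.coef_.ravel()    # sign gives direction of correlation
        print("Most influential amino acid (column index):", int(np.abs(weights).argmax()))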

  2. Exploring the time-saving bias: How drivers misestimate time saved when increasing speed

    Directory of Open Access Journals (Sweden)

    Eyal Peer

    2010-12-01

    Full Text Available According to the time-saving bias, drivers underestimate the time saved when increasing from a low speed and overestimate the time saved when increasing from a relatively high speed. Previous research used a specific type of task --- drivers were asked to estimate time saved when increasing speed and to give a numeric response --- to show this. The present research conducted further studies with multiple questions to show that the time-saving bias occurs in other tasks. Study 1 found that drivers committed the time-saving bias when asked to estimate (a) the time saved when increasing speed, (b) the distance that can be covered in a given time when increasing speed, or (c) the speed required to complete a given distance in decreasing times. Study 2 showed no major differences in estimations of time saved compared to estimations of the remaining journey time, and also between responses given on a numeric scale versus a visual analog scale. Study 3 tested two possible explanations for the time-saving bias: a Proportion heuristic and a Differences heuristic. Some evidence was found for use of the latter.
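
    The bias is easy to see from the underlying relation t = d/v: the time saved by a speed increase is Δt = d(1/v1 - 1/v2), which shrinks hyperbolically as the starting speed grows. A small sketch:

        def minutes_saved(distance_km, v1_kmh, v2_kmh):
            """Minutes saved on a fixed distance when increasing speed from v1 to v2."""
            return 60 * distance_km * (1 / v1_kmh - 1 / v2_kmh)

        # The same +10 km/h increase pays off very differently over a 100 km trip:
        print(f"{minutes_saved(100, 40, 50):.1f} min saved going 40 -> 50 km/h")     # 30.0
        print(f"{minutes_saved(100, 120, 130):.1f} min saved going 120 -> 130 km/h")  # 3.8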

  3. Step-by-step cyclic processes scheduling

    DEFF Research Database (Denmark)

    Bocewicz, G.; Nielsen, Izabela Ewa; Banaszak, Z.

    2013-01-01

    Automated Guided Vehicles (AGVs) fleet scheduling is one of the big problems in Flexible Manufacturing System (FMS) control. The problem is more complicated when concurrent multi-product manufacturing and resource deadlock avoidance policies are considered. The objective of the research is to provide a declarative model enabling one to state a constraint satisfaction problem aimed at AGVs fleet scheduling subject to assumed itineraries of concurrently manufactured product types. In other words, assuming a given layout of the FMS's material handling and production routes of simultaneously manufactured orders, the main objective is to provide a declarative framework aimed at conditions allowing one to calculate the AGVs fleet schedule in online mode. An illustrative example of the relevant algebra-like driven step-by-step cyclic scheduling is provided.
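
    A constraint satisfaction formulation of cyclic scheduling can be stated declaratively with off-the-shelf solvers. The toy model below uses the python-constraint package; the cycle length, trip duration and mutual-exclusion constraint are hypothetical simplifications, not the paper's algebra-driven model:

        from constraint import Problem

        CYCLE = 10  # length of the repetitive schedule, in time units (assumption)

        problem = Problem()
        # Start times, within one cycle, of two AGV trips sharing a path segment.
        problem.addVariable("agv1_start", list(range(CYCLE)))
        problem.addVariable("agv2_start", list(range(CYCLE)))

        def no_overlap(s1, s2, duration=3):
            """Each trip occupies the shared segment for `duration` units; the trips
            must not overlap in the cyclic sense (a simple collision-avoidance rule)."""
            gap = (s2 - s1) % CYCLE
            return gap >= duration and CYCLE - gap >= duration

        problem.addConstraint(no_overlap, ("agv1_start", "agv2_start"))
        print(problem.getSolutions()[:3])  # a few feasible cyclic schedules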

  4. Application of artificial neural networks for versatile preprocessing of electrocardiogram recordings.

    Science.gov (United States)

    Mateo, J; Rieta, J J

    2012-02-01

    The electrocardiogram (ECG) is the most widely used method for diagnosis of heart diseases, where good-quality recordings allow the proper interpretation and identification of physiological and pathological phenomena. However, ECG recordings often suffer interference from noise sources, including thermal, muscle, baseline and powerline noise. These interferences severely limit the utility of ECG recordings and, hence, have to be removed. To deal with this problem, the present paper proposes an artificial neural network (ANN) as a filter to remove all kinds of noise in just one step. The method is based on a growing ANN which optimizes both the number of nodes in the hidden layer and the coefficient matrices by means of the Widrow-Hoff delta algorithm. The ANN has been trained with a database comprising all kinds of noise, from both synthesized and real ECG recordings, in order to handle any noise signal present in the ECG. The proposed system improves on results yielded by conventional techniques of ECG filtering, such as FIR-based systems, adaptive filtering and wavelet filtering. Therefore, the algorithm could serve as an effective framework to substantially reduce noise in ECG recordings. In addition, the resulting ECG signal distortion is notably lower in comparison with conventional methodologies. In summary, the current contribution introduces a new method which is able to suppress all ECG interference signals in only one step with low ECG distortion and high noise reduction.
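
    The Widrow-Hoff delta rule mentioned above is the classic LMS update. A minimal single-node sketch that adapts a linear filter to cancel a powerline reference - a stand-in for one building block of the growing ANN, not the authors' full network:

        import numpy as np

        rng = np.random.default_rng(1)
        n, fs = 2000, 360                                   # samples, sampling rate (Hz)
        t = np.arange(n) / fs
        clean_ecg = np.sin(2 * np.pi * 1.2 * t)             # toy stand-in for an ECG
        noise_ref = np.sin(2 * np.pi * 50 * t)              # 50 Hz powerline reference
        recorded = clean_ecg + 0.5 * noise_ref + 0.01 * rng.normal(size=n)

        taps, mu = 8, 0.02                                  # filter length, learning rate
        w = np.zeros(taps)
        cleaned = np.zeros(n)
        for i in range(taps, n):
            x = noise_ref[i - taps:i]                       # window of the noise reference
            e = recorded[i] - w @ x                         # recorded minus predicted noise
            w += mu * e * x                                 # Widrow-Hoff (LMS) delta update
            cleaned[i] = e                                  # error approximates the clean ECG

        print("Residual error power:", np.mean((cleaned[200:] - clean_ecg[200:]) ** 2))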

  5. Energy Savings from Industrial Water Reductions

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Prakash; McKane, Aimee; de Fontaine, Andre

    2015-08-03

    Although it is widely recognized that reducing freshwater consumption is of critical importance, generating interest in industrial water reduction programs can be hindered for a variety of reasons. These include the low cost of water, greater focus on water use in other sectors such as the agriculture and residential sectors, high levels of unbilled and/or unregulated self-supplied water use in industry, and lack of water metering and tracking capabilities at industrial facilities. However, there are many additional components to the resource savings associated with reducing site water use beyond the water savings alone, such as reductions in energy consumption, greenhouse gas emissions, treatment chemicals, and impact on the local watershed. Understanding and quantifying these additional resource savings can expand the community of businesses, NGOs, government agencies, and researchers with a vested interest in water reduction. This paper will develop a methodology for evaluating the embedded energy consumption associated with water use at an industrial facility. The methodology developed will use available data and references to evaluate the energy consumption associated with water supply and wastewater treatment outside of a facility’s fence line for various water sources. It will also include a framework for evaluating the energy consumption associated with water use within a facility’s fence line. The methodology will develop a more complete picture of the total resource savings associated with water reduction efforts and allow industrial water reduction programs to assess the energy and CO2 savings associated with their efforts.
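
    The embedded-energy accounting described here reduces to multiplying each water flow by a source-specific energy intensity; a minimal sketch (the intensities and flows below are illustrative assumptions, not the paper's values):

        # Illustrative energy intensities, in kWh per m^3 (assumptions).
        intensity = {
            "municipal_supply": 0.6,            # treatment + distribution, outside the fence
            "self_supplied_groundwater": 0.5,   # pumping energy
            "wastewater_treatment": 0.8,
            "onsite_treatment": 1.1,            # inside the fence line (softening, RO, ...)
        }

        # Annual water flows for a hypothetical facility, in m^3.
        flows = {
            "municipal_supply": 120_000,
            "self_supplied_groundwater": 40_000,
            "wastewater_treatment": 100_000,
            "onsite_treatment": 60_000,
        }

        embedded_kwh = sum(flows[k] * intensity[k] for k in flows)
        print(f"Embedded energy of site water use: {embedded_kwh:,.0f} kWh/yr")
        # A 10% water reduction then saves ~10% of this energy and the associated CO2.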

  6. THE PUZZLE OF SIMULTANEOUS SAVINGS AND DEBTS

    Directory of Open Access Journals (Sweden)

    RODICA IANOLE

    2012-05-01

    Full Text Available “Neither a borrower nor a lender be,” recommends Shakespeare in Hamlet. The advice seems particularly interesting in today's society, where a person can easily find himself in both situations at the same time. It goes without saying that saving and borrowing do not describe mutually exclusive strategies of financial management, and thus many people retain savings or carry on saving at the same time as having debts. We add to this the more pragmatic wisdom of the economist Robert Solow - “We (economists) think of wealth as fungible; we think a dollar is a dollar. Why don't they (the others) do so?” (Solow, 1987) - and we naturally ask ourselves whether the mechanism of holding simultaneous savings and debts is a rational one according to traditional economics. Appealing to the emerging body of behavioral economics literature, we turn to mental accounting theory to see if it can explain the inclination to save versus the inclination to borrow. The main research question we want to explore is the following: if mental accounting prevents people from spending money from one “mental account” on goods belonging to another one, will people - after using all their money from a given account - be willing to go into debt to buy goods belonging to this account in a situation when they still have money in other accounts?

  7. Short-Term Saved Leave Scheme

    CERN Multimedia

    2007-01-01

    As announced at the meeting of the Standing Concertation Committee (SCC) on 26 June 2007 and in Bulletin No. 28/2007, the existing Saved Leave Scheme will be discontinued as of 31 December 2007. Staff participating in the Scheme will shortly receive a contract amendment stipulating the end of financial contributions compensated by saved leave. Leave already accumulated on saved leave accounts can continue to be taken in accordance with the rules applicable to the current scheme. A new system of saved leave will enter into force on 1 January 2008 and will be the subject of a new implementation procedure entitled "Short-term saved leave scheme" dated 1 January 2008. At its meeting on 4 December 2007, the SCC agreed to recommend the Director-General to approve this procedure, which can be consulted on the HR Department’s website at the following address: https://cern.ch/hr-services/services-Ben/sls_shortterm.asp All staff wishing to participate in the new scheme a...

  8. Short-Term Saved Leave Scheme

    CERN Multimedia

    HR Department

    2007-01-01

    As announced at the meeting of the Standing Concertation Committee (SCC) on 26 June 2007 and in http://Bulletin No. 28/2007, the existing Saved Leave Scheme will be discontinued as of 31 December 2007. Staff participating in the Scheme will shortly receive a contract amendment stipulating the end of financial contributions compensated by save leave. Leave already accumulated on saved leave accounts can continue to be taken in accordance with the rules applicable to the current scheme. A new system of saved leave will enter into force on 1 January 2008 and will be the subject of a new im-plementation procedure entitled "Short-term saved leave scheme" dated 1 January 2008. At its meeting on 4 December 2007, the SCC agreed to recommend the Director-General to approve this procedure, which can be consulted on the HR Department’s website at the following address: https://cern.ch/hr-services/services-Ben/sls_shortterm.asp All staff wishing to participate in the new scheme ...

  9. Feeding your piggy bank with intentions: A study on saving behaviour, saving strategies, and happiness

    NARCIS (Netherlands)

    De Francisco Vela, S.; Desmet, P.M.A.; Casais, M.

    2014-01-01

    The act of saving money can connect one’s present state to a meaningful future state, especially if we consider money not as a direct source of happiness, but as a resource for engaging in meaningful activities. To explore how design can contribute to making the act of saving more meaningful, we

  10. Saving for Success: Financial Education and Savings Goal Achievement in Individual Development Accounts

    Science.gov (United States)

    Grinstead, Mary L.; Mauldin, Teresa; Sabia, Joseph J.; Koonce, Joan; Palmer, Lance

    2011-01-01

    Using microdata from the American Dream Demonstration, the current study examines factors associated with savings and savings goal achievement (indicated by a matched withdrawal) among participants of individual development account (IDA) programs. Multinomial logit results show that hours of participation in financial education programs, higher…

  11. Public-opinion poll on energy saving

    International Nuclear Information System (INIS)

    1982-01-01

    A public-opinion poll on energy saving was carried out from November 26 to December 2, 1981, across the country. The survey covered 5,000 persons aged 20 and above, of whom 4,007 (80.1%) responded. The results of the survey and the question-answer form are given with respective percentages. The questions fell into three categories: (1) awareness of energy saving - space-heating temperature, energy-saving-conscious use of private cars, purchase of high-energy-consumption appliances; (2) energy for the future - energy consumption, energy consumption trends, new types of energy, main sources of power generation, nuclear power's share of overall electric power, apprehension toward nuclear power plants, safety measures in nuclear power plants; (3) governmental energy policy measures. (J.P.N.)

  12. An energy saving system for hospital laundries

    Energy Technology Data Exchange (ETDEWEB)

    Katsanis, J.S.; Tsarabaris, P.T.; Polykrati, A.D.; Proios, A.N. [National Technical Univ. of Athens, Athens (Greece). School of Electrical and Computer Engineering; Koufakis, E.I. [Public Power Corp. S.A., Crete (Greece)

    2009-07-01

    Hospital laundries are among the largest consumers of water and of electrical and thermal energy. This paper examined the energy savings achieved by a system that uses the hot wastewater from the washing process. Hospital laundries consume thermal energy in the form of steam, which is produced in boilers by burning diesel oil or natural gas. Mechanical drives, ventilation and the lighting required in the laundry area are also large consumers of electrical energy. The paper presented the proposed system and discussed its parameters and dimensioning. The paper also provided and discussed an interpretation of the steam and energy savings. The proposed system was considered to be economically viable and simple in its construction, installation and operation. The cost savings from applying the suggested system result in a satisfactory payback period on the invested capital of approximately three to five years. 14 refs., 4 tabs., 2 figs.
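
    The recoverable heat in the hot wastewater stream follows from Q = m·c·ΔT scaled by a heat-exchanger effectiveness; a sketch with hypothetical laundry figures:

        # Hypothetical daily figures for a hospital laundry heat-recovery estimate.
        wastewater_m3_per_day = 40           # hot wastewater volume
        t_wastewater, t_feedwater = 55, 15   # temperatures, degrees C
        effectiveness = 0.6                  # heat-exchanger effectiveness (assumption)
        C_WATER = 4.186                      # MJ per m^3 per K (volumetric heat capacity)

        q_mj = wastewater_m3_per_day * C_WATER * (t_wastewater - t_feedwater) * effectiveness
        print(f"Recoverable heat: {q_mj:,.0f} MJ/day (~{q_mj / 3.6:,.0f} kWh/day of "
              "boiler fuel displaced, before boiler-efficiency corrections)")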

  13. Marine sediment sample pre-processing for macroinvertebrates metabarcoding: mechanical enrichment and homogenization

    Directory of Open Access Journals (Sweden)

    Eva Aylagas

    2016-10-01

    Full Text Available Metabarcoding is an accurate and cost-effective technique that allows for simultaneous taxonomic identification of multiple environmental samples. Application of this technique to marine benthic macroinvertebrate biodiversity assessment for biomonitoring purposes requires standardization of laboratory and data analysis procedures. In this context, protocols for creation and sequencing of amplicon libraries and their related bioinformatics analysis have been recently published. However, a standardized protocol describing all previous steps (i.e. processing and manipulation of environmental samples) for macroinvertebrate community characterization is lacking. Here, we provide detailed procedures for benthic environmental sample collection, processing, enrichment for macroinvertebrates, homogenization, and subsequent DNA extraction for metabarcoding analysis. Since this is the first protocol of this kind, it should be of use to any researcher in this field, having the potential for improvement.

  14. Speech perception for adult cochlear implant recipients in a realistic background noise: effectiveness of preprocessing strategies and external options for improving speech recognition in noise.

    Science.gov (United States)

    Gifford, René H; Revit, Lawrence J

    2010-01-01

    Although cochlear implant patients are achieving increasingly higher levels of performance, speech perception in noise continues to be problematic. The newest generations of implant speech processors are equipped with preprocessing and/or external accessories that are purported to improve listening in noise. Most speech perception measures in the clinical setting, however, do not provide a close approximation to real-world listening environments. To assess speech perception for adult cochlear implant recipients in the presence of a realistic restaurant simulation generated by an eight-loudspeaker (R-SPACE) array in order to determine whether commercially available preprocessing strategies and/or external accessories yield improved sentence recognition in noise. Single-subject, repeated-measures design with two groups of participants: Advanced Bionics and Cochlear Corporation recipients. Thirty-four subjects, ranging in age from 18 to 90 yr (mean 54.5 yr), participated in this prospective study. Fourteen subjects were Advanced Bionics recipients, and 20 subjects were Cochlear Corporation recipients. Speech reception thresholds (SRTs) in semidiffuse restaurant noise originating from an eight-loudspeaker array were assessed with the subjects' preferred listening programs as well as with the addition of either Beam preprocessing (Cochlear Corporation) or the T-Mic accessory option (Advanced Bionics). In Experiment 1, adaptive SRTs with the Hearing in Noise Test sentences were obtained for all 34 subjects. For Cochlear Corporation recipients, SRTs were obtained with their preferred everyday listening program as well as with the addition of Focus preprocessing. For Advanced Bionics recipients, SRTs were obtained with the integrated behind-the-ear (BTE) mic as well as with the T-Mic. Statistical analysis using a repeated-measures analysis of variance (ANOVA) evaluated the effects of the preprocessing strategy or external accessory in reducing the SRT in noise. In addition

  15. Economic Energy Savings Potential in Federal Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Daryl R.; Dirks, James A.; Hunt, Diane M.

    2000-09-04

    The primary objective of this study was to estimate the current life-cycle cost-effective (i.e., economic) energy savings potential in Federal buildings and the corresponding capital investment required to achieve these savings, with Federal financing. Estimates were developed for major categories of energy efficiency measures such as building envelope, heating system, cooling system, and lighting. The analysis was based on conditions (building stock and characteristics, retrofit technologies, interest rates, energy prices, etc.) existing in the late 1990s. The potential impact of changes to any of these factors in the future was not considered.

  16. Savings in its sights for Somerset Trust.

    Science.gov (United States)

    Russell, Colin

    2011-10-01

    Colin Russell, healthcare specialist at Schneider Electric (pictured), explains how the company has recently worked with Taunton and Somerset NHS Foundation Trust to implement a major energy-saving project at the Trust's Musgrove Park Hospital in Taunton. He argues that, at a time when all areas of the service are being asked to reduce costs, such partnerships can potentially save the institution millions of pounds and significantly reduce carbon emissions, while "revitalising" parts of the NHS estate, and ensuring continuity of vital hospital services for facilities managers.

  17. Nuclear recycling: costs, savings, and safeguards

    International Nuclear Information System (INIS)

    Spinrad, B.I.

    1985-01-01

    This chapter discusses the economics, physical and chemical processes, and safety of nuclear fuel recycling. The spent fuel must be chemically reprocessed in order to recover uranium and plutonium. Topics considered include indifference costs, recycling in light water reactors (LWRs), plutonium in fast reactors, the choice between recycling and storage, safeguards, and weapons proliferation. It is shown that the economics of recycling nuclear fuel involves the actual costs and savings of the recycling operation in terms of money spent, made, and saved, and the impact of the recycling on the future cost of uranium

  18. Raising household saving: does financial education work?

    Science.gov (United States)

    Gale, William G; Harris, Benjamin H; Levine, Ruth

    2012-01-01

    This article highlights the prevalence and economic outcomes of financial illiteracy among American households, and reviews previous research that examines how improving financial literacy affects household saving. Analysis of the research literature suggests that previous financial literacy efforts have yielded mixed results. Evidence suggests that interventions provided for employees in the workplace have helped increase household saving, but estimates of the magnitude of the impact vary widely. For financial education initiatives targeted to other groups, the evidence is much more ambiguous, suggesting a need for more econometrically rigorous evaluations.

  19. Financing aspects of electricity saving's in Brazil

    International Nuclear Information System (INIS)

    Pacca, S.A.; Sauer, I.L.

    1996-01-01

    Programs for energy saving in Brazil arose in the early 1980s out of concern with the consumption of oil products. The situation has since changed, and electricity is growing in importance when energy conservation is discussed. The government, following examples from abroad, has been trying to overcome the barriers that obstruct a broad move towards energy saving, since the potential is spread across many segments of society. One important issue is the financing of energy conservation. This paper attempts to analyze the utilities' efforts, together with investments by public and private banks, towards financing energy conservation. (author)

  20. The Seven Step Strategy

    Science.gov (United States)

    Schaffer, Connie

    2017-01-01

    Many well-intended instructors use Socratic or leveled questioning to facilitate the discussion of an assigned reading. While this engages a few students, most can opt to remain silent. The seven step strategy described in this article provides an alternative to classroom silence and engages all students. Students discuss a single reading as they…

  1. Saving water to save the environment: Contrasting the effectiveness of environmental and monetary appeals in a residential water saving intervention

    NARCIS (Netherlands)

    Tijs, M.S.; Karremans, J.C.T.M.; Veling, H.P.; Lange, M.A. de; Meegeren, P. van; Lion, R.

    2017-01-01

    To convince people to reduce their energy consumption, two types of persuasive appeals often are used by environmental organizations: Monetary appeals (i.e., 'conserving energy will save you money') and environmental appeals (i.e., 'conserving energy will protect the environment'). In this field

  2. Taxation and the household saving rate: evidence from OECD countries

    Directory of Open Access Journals (Sweden)

    Vito Tanzi

    2000-03-01

    Full Text Available This paper analyzes anew the relationship between taxation and the household saving rate. On the basis of standard savings and tax revenue data from a sample of OECD countries, it provides compelling empirical evidence of a powerful impact of taxes on household savings. In particular, income taxes are shown to depress the household saving rate much more than consumption taxes do.

  3. 12 CFR 541.18 - Interim Federal savings association.

    Science.gov (United States)

    2010-01-01

    Title 12 (Banks and Banking), Regulations Affecting Federal Savings Associations, § 541.18 Interim Federal savings association. The term... an existing savings and loan holding company or to facilitate any other transaction the Office may...

  4. 49 CFR 173.219 - Life-saving appliances.

    Science.gov (United States)

    2010-10-01

    Title 49 (Transportation), § 173.219 Life-saving appliances. (a) A life-saving appliance, self-inflating or non-self-inflating, containing small quantities of hazardous materials that are required as part of the life-saving appliance must...

  5. A practical review of energy saving technology for ageing populations.

    Science.gov (United States)

    Walker, Guy; Taylor, Andrea; Whittet, Craig; Lynn, Craig; Docherty, Catherine; Stephen, Bruce; Owens, Edward; Galloway, Stuart

    2017-07-01

    Fuel poverty is a critical issue for a globally ageing population. Longer heating/cooling requirements combine with declining incomes to create a problem in need of urgent attention. One solution is to deploy technology to help elderly users feel informed about their energy use, and empowered to take steps to make it more cost effective and efficient. This study subjects a broad cross section of energy monitoring and home automation products to a formal ergonomic analysis. A high level task analysis was used to guide a product walk through, and a toolkit approach was used thereafter to drive out further insights. The findings reveal a number of serious usability issues which prevent these products from successfully accessing an important target demographic and associated energy saving and fuel poverty outcomes. Design principles and examples are distilled from the research to enable practitioners to translate the underlying research into high quality design-engineering solutions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Pre-processing of input files for the AZTRAN code; Pre procesamiento de archivos de entrada para el codigo AZTRAN

    Energy Technology Data Exchange (ETDEWEB)

    Vargas E, S. [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico); Ibarra, G., E-mail: samuel.vargas@inin.gob.mx [IPN, Av. Instituto Politecnico Nacional s/n, 07738 Ciudad de Mexico (Mexico)

    2017-09-15

    The AZTRAN code began to be developed in the Nuclear Engineering Department of the Escuela Superior de Fisica y Matematicas (ESFM) of the Instituto Politecnico Nacional (IPN) with the purpose of numerically solving various models arising from the physics and engineering of nuclear reactors. The code is still under development and is part of the AZTLAN platform: Development of a Mexican platform for the analysis and design of nuclear reactors. Due to the complexity of generating an input file for the code, a script based on the D language was developed to make its preparation easier. The script relies on a new input-file format with specific cards divided into two blocks, mandatory cards and optional cards. It pre-processes the input file to identify possible errors within it, and it includes an image generator for the specific problem based on the Python interpreter. (Author)
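
    The abstract does not spell out the script's card set or checking rules, so the following is only a rough illustrative sketch of that kind of mandatory/optional card pre-check, written in Python rather than the D language the actual script uses; the card names and rules below are invented for illustration.

        # Hypothetical pre-check of a card-based input file: all card names are
        # assumptions, not the real AZTRAN card set.
        MANDATORY_CARDS = {"GEOMETRY", "MATERIALS", "SOURCE"}   # assumed names
        OPTIONAL_CARDS = {"PLOT", "OUTPUT"}                     # assumed names

        def precheck(path):
            """Return a list of error messages found in the input file."""
            errors = []
            seen = set()
            with open(path) as f:
                for lineno, line in enumerate(f, start=1):
                    line = line.strip()
                    if not line or line.startswith("#"):        # skip blanks/comments
                        continue
                    card = line.split()[0].upper()
                    if card not in MANDATORY_CARDS | OPTIONAL_CARDS:
                        errors.append(f"line {lineno}: unknown card '{card}'")
                    elif card in seen:
                        errors.append(f"line {lineno}: duplicate card '{card}'")
                    seen.add(card)
            errors.extend(f"missing mandatory card '{c}'" for c in MANDATORY_CARDS - seen)
            return errors

        if __name__ == "__main__":
            for msg in precheck("aztran_input.txt"):   # hypothetical file name
                print("ERROR:", msg)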

  7. Study on Construction of a Medical X-Ray Direct Digital Radiography System and Hybrid Preprocessing Methods

    Directory of Open Access Journals (Sweden)

    Yong Ren

    2014-01-01

    Full Text Available We construct a medical X-ray direct digital radiography (DDR) system based on a CCD (charge-coupled device) camera. For the original images captured from X-ray exposure, the computer first executes flat-field correction and gamma correction, and then carries out contrast enhancement. A hybrid image contrast enhancement algorithm, based on the sharp frequency localization contourlet transform (SFL-CT) and contrast limited adaptive histogram equalization (CLAHE), is proposed and verified on clinical DDR images. Experimental results show that, for medical X-ray DDR images, the proposed comprehensive preprocessing algorithm can not only greatly enhance contrast and detail information, but also improve the resolution capability of the DDR system.
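
    As a rough illustration of the preprocessing chain the abstract outlines (flat-field correction, then gamma correction, then contrast enhancement), here is a minimal Python sketch using OpenCV's CLAHE. The SFL-CT stage of the proposed hybrid has no off-the-shelf library implementation and is omitted, and all parameter values are assumptions.

        import cv2
        import numpy as np

        def preprocess_ddr(raw, flat, dark, gamma=0.7):
            # Flat-field correction: remove the detector's fixed-pattern response.
            corrected = (raw.astype(np.float32) - dark) / np.maximum(flat - dark, 1e-6)
            corrected = np.clip(corrected / corrected.max(), 0.0, 1.0)
            # Gamma correction: expand the dark (low-dose) end of the range.
            corrected = corrected ** gamma
            img8 = (corrected * 255).astype(np.uint8)
            # Contrast-limited adaptive histogram equalization (CLAHE);
            # clip limit and tile size are illustrative defaults, not the paper's.
            clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
            return clahe.apply(img8)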

  8. Use of apparent thickness for preprocessing of low-frequency electromagnetic data in inversion-based multibarrier evaluation workflow

    Science.gov (United States)

    Omar, Saad; Omeragic, Dzevat

    2018-04-01

    The concept of apparent thicknesses is introduced for the inversion-based, multicasing evaluation interpretation workflow using multifrequency and multispacing electromagnetic measurements. A thickness value is assigned to each measurement, enabling the development of two new preprocessing algorithms to remove casing collar artifacts. First, long-spacing apparent thicknesses are used to remove, from the pipe sections, artifacts ("ghosts") caused by the transmitter crossing a casing collar or corrosion. Second, a collar identification, localization, and assignment algorithm is developed to enable robust inversion in collar sections. Last, casing eccentering can also be identified on the basis of opposite deviation of short-spacing phase and magnitude apparent thicknesses from the nominal value. The proposed workflow can handle an arbitrary number of nested casings and has been validated on synthetic and field data.
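
    The paper's algorithms are not given in the abstract; the sketch below only illustrates the underlying idea of flagging likely collar artifacts where a per-measurement apparent thickness deviates from the nominal casing thickness. All values and thresholds are invented, and the real workflow is considerably more involved.

        import numpy as np

        def flag_artifacts(apparent_thickness, nominal=9.2e-3, tol=0.15):
            """Mask of samples deviating more than tol (fractional) from nominal."""
            deviation = np.abs(apparent_thickness - nominal) / nominal
            return deviation > tol

        depths = np.linspace(1000.0, 1010.0, 11)       # m, synthetic log
        t_app = np.full_like(depths, 9.2e-3)           # nominal thickness, m
        t_app[5] = 12.5e-3                             # simulated collar response
        print(depths[flag_artifacts(t_app)])           # -> [1005.]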

  9. Effect of interpolation error in pre-processing codes on calculations of self-shielding factors and their temperature derivatives

    International Nuclear Information System (INIS)

    Ganesan, S.; Gopalakrishnan, V.; Ramanadhan, M.M.; Cullen, D.E.

    1986-01-01

    We investigate the effect of interpolation error in the pre-processing codes LINEAR, RECENT and SIGMA1 on calculations of self-shielding factors and their temperature derivatives. We consider the 2.0347 to 3.3546 keV energy region for 238U capture, which is the NEACRP benchmark exercise on unresolved parameters. The calculated values of temperature derivatives of self-shielding factors are significantly affected by interpolation error. The sources of problems in both evaluated data and codes are identified and eliminated in the 1985 version of these codes. This paper helps to (1) inform code users to use only 1985 versions of LINEAR, RECENT, and SIGMA1 and (2) inform designers of other code systems where they may have problems and what to do to eliminate their problems. (author)

  10. Effect of interpolation error in pre-processing codes on calculations of self-shielding factors and their temperature derivatives

    International Nuclear Information System (INIS)

    Ganesan, S.; Gopalakrishnan, V.; Ramanadhan, M.M.; Cullen, D.E.

    1985-01-01

    The authors investigate the effect of interpolation error in the pre-processing codes LINEAR, RECENT and SIGMA1 on calculations of self-shielding factors and their temperature derivatives. They consider the 2.0347 to 3.3546 keV energy region for 238U capture, which is the NEACRP benchmark exercise on unresolved parameters. The calculated values of temperature derivatives of self-shielding factors are significantly affected by interpolation error. The sources of problems in both evaluated data and codes are identified and eliminated in the 1985 version of these codes. This paper helps to (1) inform code users to use only 1985 versions of LINEAR, RECENT, and SIGMA1 and (2) inform designers of other code systems where they may have problems and what to do to eliminate their problems

  11. Energy savings certificates 2011-2013 - Companies. Knowledge for action

    International Nuclear Information System (INIS)

    2013-03-01

    As fossil energy resources are depleted and the environmental impacts of their use are increasingly addressed, energy costs will continue to rise. In this context it is vital for businesses - in the service sector, industry or agriculture - to take steps now to start managing their energy consumption. A number of tools and mechanisms are currently being implemented to help companies in this process, at the national and European levels. Among these, Energy Savings Certificates (ESCs) were introduced in France by the Energy Policy Law of 13 July 2005, with the aim of achieving energy savings in sectors of dispersed activity, mainly buildings, but also light industry, agriculture and transport. For businesses this mechanism is an additional financial leverage tool that can be used to support their energy management projects. Under this scheme energy suppliers must promote energy-efficient investments, and thus are potential sources of financing for project owners. The Grenelle environmental conference forcefully reiterated the need to take action, in particular to renovate existing building stock. In order to achieve the ambitious goals that have been set, the financial mechanisms put into place, including the ESC scheme, must be amplified. Following the first conclusive test period (2006-2009), the ESC scheme is being ramped up during a second and more ambitious three-year period that began on 1 January 2011. The present document is intended to inform companies of changes in the ESC scheme to be implemented for the second period covering 2011-2013. This guidance is divided into two parts: the first section describes the principles of the ESC scheme, and the second offers advice to companies that want to use this scheme for an energy management project. You will also find a practical information sheet listing all the steps to be taken to submit an ESC claim

  12. QSpike Tools: a Generic Framework for Parallel Batch Preprocessing of Extracellular Neuronal Signals Recorded by Substrate Microelectrode Arrays

    Directory of Open Access Journals (Sweden)

    Mufti eMahmud

    2014-03-01

    Full Text Available Micro-Electrode Arrays (MEAs) have emerged as a mature technique to investigate brain (dys)functions in vivo and in in vitro animal models. Often referred to as smart Petri dishes, MEAs have demonstrated a great potential particularly for medium-throughput studies in vitro, both in academic and pharmaceutical industrial contexts. Enabling rapid comparison of ionic/pharmacological/genetic manipulations with control conditions, MEAs are often employed to screen compounds by monitoring non-invasively the spontaneous and evoked neuronal electrical activity in longitudinal studies, with relatively inexpensive equipment. However, in order to acquire sufficient statistical significance, recordings last up to tens of minutes and generate large amounts of raw data (e.g., 60 channels/MEA, 16 bits A/D conversion, 20 kHz sampling rate: ~8 GB per MEA per hour, uncompressed). Thus, when the experimental conditions to be tested are numerous, the availability of fast, standardized, and automated signal preprocessing becomes pivotal for any subsequent analysis and data archiving. To this aim, we developed an in-house cloud-computing system, named QSpike Tools, where CPU-intensive operations, required for preprocessing of each recorded channel (e.g., filtering, multi-unit activity detection, spike-sorting, etc.), are decomposed and batch-queued to a multi-core architecture or to a computer cluster. With the commercial availability of new and inexpensive high-density MEAs, we believe that disseminating QSpike Tools might facilitate its wide adoption and customization, and possibly inspire the creation of community-supported cloud-computing facilities for MEAs users.

  13. QSpike tools: a generic framework for parallel batch preprocessing of extracellular neuronal signals recorded by substrate microelectrode arrays.

    Science.gov (United States)

    Mahmud, Mufti; Pulizzi, Rocco; Vasilaki, Eleni; Giugliano, Michele

    2014-01-01

    Micro-Electrode Arrays (MEAs) have emerged as a mature technique to investigate brain (dys)functions in vivo and in in vitro animal models. Often referred to as "smart" Petri dishes, MEAs have demonstrated a great potential particularly for medium-throughput studies in vitro, both in academic and pharmaceutical industrial contexts. Enabling rapid comparison of ionic/pharmacological/genetic manipulations with control conditions, MEAs are employed to screen compounds by monitoring non-invasively the spontaneous and evoked neuronal electrical activity in longitudinal studies, with relatively inexpensive equipment. However, in order to acquire sufficient statistical significance, recordings last up to tens of minutes and generate large amount of raw data (e.g., 60 channels/MEA, 16 bits A/D conversion, 20 kHz sampling rate: approximately 8 GB/MEA,h uncompressed). Thus, when the experimental conditions to be tested are numerous, the availability of fast, standardized, and automated signal preprocessing becomes pivotal for any subsequent analysis and data archiving. To this aim, we developed an in-house cloud-computing system, named QSpike Tools, where CPU-intensive operations, required for preprocessing of each recorded channel (e.g., filtering, multi-unit activity detection, spike-sorting, etc.), are decomposed and batch-queued to a multi-core architecture or to a computers cluster. With the commercial availability of new and inexpensive high-density MEAs, we believe that disseminating QSpike Tools might facilitate its wide adoption and customization, and inspire the creation of community-supported cloud-computing facilities for MEAs users.
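
    The quoted raw-data figure can be checked with simple arithmetic; a minimal sketch in Python:

        # Back-of-envelope check of the abstract's ~8 GB per MEA per hour figure.
        channels = 60
        bytes_per_sample = 2          # 16-bit A/D conversion
        rate_hz = 20_000              # 20 kHz sampling
        seconds = 3600                # one hour
        gb = channels * bytes_per_sample * rate_hz * seconds / 1e9
        print(f"{gb:.1f} GB per MEA per hour")   # -> 8.6, consistent with ~8 GB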

  14. 12 CFR 575.10 - Acquisition and disposition of savings associations, savings and loan holding companies, and...

    Science.gov (United States)

    2010-01-01

    Title 12, § 575.10 Acquisition and disposition of savings associations, savings and loan holding companies, and other corporations by mutual holding companies... savings and loan holding company in the stock form that is not a subsidiary holding company, provided the...

  15. Boiler house modernization through shared savings program

    Energy Technology Data Exchange (ETDEWEB)

    Breault, R.W. [Tecogen, Waltham, MA (United States)

    1995-12-31

    Throughout Poland as well as the rest of Eastern Europe, communities and industries rely on small heat-only boilers to provide district and process heat. Together these two sectors produce about 85,000 MW from boilers in the 2 to 35 MW size range. The bulk of these units were installed prior to 1992 and must be completely overhauled to meet the emission regulations which will come into effect on January 1, 1998. Since in most cases the only practical fuel is coal, these boilers must either be retrofitted with emission control technology or be replaced entirely. The question that arises is how to accomplish this, given the current tight control of capital in Poland and other East European countries. A solution that we have for this problem is shared savings. These boilers typically operate with quite low efficiency compared to western standards and with excessive manual labor, so installing modernization equipment to improve efficiency and to automate the process provides savings. ECOGY provides the funds for the modernization to improve the efficiency, add automation and install emission control equipment. The savings that are generated during the operation of the modernized boiler system are split between the client company and ECOGY for a number of years, and then the system is turned over in its entirety to the client. Depending on the operating capacity, the shared savings agreement will usually span 6 to 10 years.

  16. Save Beady Kid from the Sun

    Science.gov (United States)

    Demetrikopoulos, Melissa; Thompson, Wesley; Pecore, John

    2017-01-01

    Art and science help students investigate light energy and practice fair testing. With the goal of finding a way to save "Beady Kid" from invisible rays, students used science practices to investigate the transfer of light energy from the Sun. During this art-integrated science lesson presented in this article, upper elementary (grades…

  17. Water savings through off-farm employment?

    NARCIS (Netherlands)

    Wachong Castro, V.; Heerink, N.; Shi, X.; Qu, W.

    2010-01-01

    Purpose – The purpose of this paper is to gain more insight into the relationship between off-farm employment of rural households and water-saving investments and irrigation water use in rural China. Design/methodology/approach – Data from a survey held among 317 households in Minle County, Zhangye

  18. Use staff wisely to save NHS money.

    Science.gov (United States)

    Moore, Alison

    2015-12-09

    The NHS could save up to £2 billion a year by improving workflow and containing workforce costs, according to Labour peer Lord Carter's review of NHS efficiency. Changes in areas such as rostering and management of annual leave must avoid increasing the pressure on staff.

  19. Charter School Spending and Saving in California

    Science.gov (United States)

    Reed, Sherrie; Rose, Heather

    2015-01-01

    Examining resource allocation practices, including savings, of charter schools is critical to understanding their financial viability and sustainability. Using 9 years of finance data from California, we find charter schools spend less on instruction and pupil support services than traditional public schools. The lower spending on instruction and…

  20. Tip Saves Energy, Money for Pennsylvania Plant

    Science.gov (United States)

    A wastewater treatment plant in Berks County, Pennsylvania is saving nearly $45,000 a year and reducing hundreds of metric tons of greenhouse gases since employing an energy conservation tip offered by the Water Protection Division in EPA’s R3 and PADEP.

  1. How to save money on medicines

    Science.gov (United States)

    U.S. Food and Drug Administration (FDA) website. Saving money on prescription drugs. www.fda.gov/Drugs/EmergencyPreparedness/BioterrorismandDrugPreparedness/ucm134215.htm . Updated May 4, 2016. Accessed October 14, 2016. U.S. Food and Drug ...

  2. Energy saving baking methods. Energibesparende bagemetoder

    Energy Technology Data Exchange (ETDEWEB)

    Gry, P.

    1988-01-01

    The project ''Energy Saving Baking Methods'', run as part of the Energy Research Project-1984, has as its aim to investigate the potential for energy saving by employing microwaves in the baking process. The project is a follow-up of the Nordic Industry Fund project which was completed in 1983. Smaller test ovens with long-wave IR, warm air convection and microwaves at 2.47 GHz were used. Measurements of heat distribution from all three energy sources have been made. Extensive experiments have been carried out in order to develop baking methods for white loaves which are energy saving, but where the quality of the bread does not undergo any form of deterioration. Tests were made using microwaves alone, and in combination with hot air and IR. A resulting saving of 35% in baking time was achieved, and a further reduction of baking time can be reached where a greater improvement of energy distribution can take place, especially in the case of microwaves and IR. (AB).

  3. Savings impact of a corporate energy manager

    International Nuclear Information System (INIS)

    Sikorski, B.D.; O'Donnell, B.A.

    1999-01-01

    This paper discusses the cost savings impact of employing an energy manager with a 16,000-employee corporation. The corporation, Canada's second largest airline, is currently operating nearly 3,000,000 ft² of mixed-use facilities spread across the country, with an annual energy budget for ground facilities of over Cdn $4,000,000. This paper outlines the methodology used by the energy manager to deploy an energy management program over a two-year period between April 1995 and May 1997. The paper examines the successes and the lessons learned during the period and summarizes the costs and benefits of the program. The energy manager position was responsible for developing an energy history database with more than 100 active accounts and for monitoring and verifying energy savings. The energy manager implemented many relatively low-cost energy conservation measures, as well as some capital projects, during the first two years of the program. In total, these measures provided energy cost savings of $210,000 per year, or 5% of the total budget. In each case, technologies installed as part of the energy retrofit projects provided not only cost savings but also better control, reduced maintenance, and improved working conditions for employees.

  4. Food Processing Contracts: Savings for Schools.

    Science.gov (United States)

    Van Egmond-Pannell, Dorothy

    1983-01-01

    Food processing contracts between schools and food manufacturers can result in huge cost savings. Fairfax County, Virginia, is one of 30 "letter of credit" sites in a three-year study of alternatives. After one year it appears that schools can purchase more for the dollar in their local areas. (MD)

  5. How Palo Verde saved millions of dollars

    International Nuclear Information System (INIS)

    Anon.

    1994-01-01

    In autumn 1992, Arizona Public Service adopted new project control procedures for outages at its three-unit Palo Verde PWR site, including: the switch to a PC-based environment; a new scheduling system; and the generation of improved graphics for decision making. Major cost savings were made and three further outages have now benefitted from the new systems. (author)

  6. Communication Can Save Lives PSA (:60)

    Centers for Disease Control (CDC) Podcasts

    This 60 second public service announcement (PSA) is based on the August 2015 CDC Vital Signs report. Antibiotic-resistant germs cause at least 23,000 deaths each year. Learn how public health authorities and health care facilities can work together to save lives.

  7. Finding Savings in Community Use of Schools

    Science.gov (United States)

    Gandy, Julia

    2013-01-01

    This article reports on the growing challenge of managing community groups using educational facilities for meetings, athletics, and special events. It describes how, by using an online scheduling software program, one school district was able to track payments and save time and money with its event and facility scheduling process.

  8. Opinion: Composition Studies Saves the World!

    Science.gov (United States)

    Bizzell, Patricia

    2009-01-01

    Stanley Fish in his new book ["Save the World on Your Own Time" (New York: Oxford UP, 2008)] says that composition studies presents "the clearest example" of what is desperately wrong in the academy, because in writing classrooms, he says, "more often than not anthologies of provocative readings take center stage and the actual teaching of writing…

  9. Estimates of Savings Achievable from Irrigation Controller

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Alison; Fuchs, Heidi; Whitehead, Camilla Dunham

    2014-03-28

    This paper performs a literature review and meta-analysis of water savings from several types of advanced irrigation controllers: rain sensors (RS), weather-based irrigation controllers (WBIC), and soil moisture sensors (SMS). The purpose of this work is to derive average water savings per controller type, based to the extent possible on all available data. After a preliminary data scrubbing, we utilized a series of analytical filters to develop our best estimate of average savings. We applied filters to remove data that might bias the sample such as data self-reported by manufacturers, data resulting from studies focusing on high-water users, or data presented in a non-comparable format such as based on total household water use instead of outdoor water use. Because the resulting number of studies was too small to be statistically significant when broken down by controller type, this paper represents a survey and synthesis of available data rather than a definitive statement regarding whether the estimated water savings are representative.

  10. Analysis of savings mobilization determinants among rural ...

    African Journals Online (AJOL)

    The result showed that 90% were within the age group of 30 to 69 years while about 79% ... The average family size was six while about 37% of the sampled household heads had ... Key words: Savings mobilization, income and determinants ...

  11. Saving energy: bringing down Europe's energy prices

    NARCIS (Netherlands)

    Molenbroek, E.; Blok, K.

    2012-01-01

    In June 2011 the European Commission proposed a new Directive on Energy Efficiency. Its purpose is to put forward a framework to deliver the EU’s target of reducing its energy consumption by 20% by 2020. Currently, the EU is only on track to achieve half of those savings. Apart from the

  12. Reactors Save Energy, Costs for Hydrogen Production

    Science.gov (United States)

    2014-01-01

    While examining fuel-reforming technology for fuel cells onboard aircraft, Glenn Research Center partnered with Garrettsville, Ohio-based Catacel Corporation through the Glenn Alliance Technology Exchange program and a Space Act Agreement. Catacel developed a stackable structural reactor that is now employed for commercial hydrogen production and results in energy savings of about 20 percent.

  13. Exploring Demand Charge Savings from Commercial Solar

    Energy Technology Data Exchange (ETDEWEB)

    Darghouth, Naim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Mills, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gagnon, Pieter [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bird, Lori [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-07-31

    Commercial retail electricity rates commonly include a demand charge component, based on some measure of the customer’s peak demand. Customer-sited solar PV can potentially reduce demand charges, but the magnitude of these savings can be difficult to predict, given variations in demand charge designs, customer loads, and PV generation profiles. Moreover, depending on the circumstances, demand charge savings from solar may or may not align well with associated utility cost savings. Lawrence Berkeley National Laboratory (Berkeley Lab) and the National Renewable Energy Laboratory (NREL) are collaborating in a series of studies to understand how solar PV can reduce demand charge levels for a variety of customer types and demand charge designs. Previous work focused on residential customers with solar. This study, instead, focuses on commercial customers and seeks to understand the extent and conditions under which rooftop solar can reduce commercial demand charges. To answer these questions, we simulate demand charge savings for a broad range of commercial customer types, demand charge designs, locations, and PV system characteristics. This particular analysis does not include storage, but a subsequent analysis in this series will evaluate demand charge savings for commercial customers with solar and storage.
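
    A toy Python illustration of the core calculation: a demand charge depends on the single peak net-load interval, so a daytime PV profile can leave an evening peak, and hence the charge, essentially untouched. All rates and profiles below are invented, not the study's data.

        import numpy as np

        def demand_charge(load_kw, pv_kw, rate_per_kw=15.0):
            """Monthly demand charge from interval data (rate is an assumption)."""
            net = np.maximum(load_kw - pv_kw, 0.0)   # net load after PV, kW
            return rate_per_kw * net.max()           # charge on the peak interval

        hours = np.arange(0, 24, 0.25)               # 15-minute intervals, one day
        load = 200 + 100 * np.exp(-((hours - 18) ** 2) / 4)            # evening peak, kW
        pv = 150 * np.clip(np.sin(np.pi * (hours - 6) / 12), 0, None)  # daytime PV, kW
        print(demand_charge(load, np.zeros_like(pv)))  # charge without PV
        print(demand_charge(load, pv))                 # same: the evening peak survives PV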

  14. Saving and credit cooperatives in Mexico

    Directory of Open Access Journals (Sweden)

    Martha E. Izquierdo Muciño

    2015-11-01

    Full Text Available The popular savings and credit societies (Cajas de Ahorro y Credito Popular) first appeared in Mexico in 1951 at the initiative of the priest Pedro Velazquez, modelled on the popular savings banks that existed in Canada, which had been founded by Alphonse Desjardins in the early twentieth century. These savings societies (Cajas de Ahorro) developed successfully in almost all Mexican communities, and most remained faithful to the principles and ordinances of the church that gave rise to them, without the government participating in this activity and without encouraging or regulatory policies. It was not until 1991 that the General Law of Organizations and Auxiliary Credit Activities (Ley General de Organizaciones y Actividades Auxiliares del Credito) was enacted. In 2000, however, problems began to emerge from fraudulent activities by people who took advantage of legal loopholes to establish irregular savings services. Consequently, and in order to solve these problems, the law was amended. The last of these frauds occurred with the popular financial institution FICREA (2015), after which the regulatory law was amended once again; while this sought to prevent further fraud, those really affected were a large number of very poor indigenous people and peasants. Received: 03.06.2015. Accepted: 17.07.2015.

  15. Acquiring energy savings in manufactured housing

    International Nuclear Information System (INIS)

    Davey, D.

    1993-01-01

    In 1991, the Northwest utilities faced a complex situation. They needed new sources of electrical power to avoid future deficits. A significant block of energy savings was available in the manufactured housing sector, in the form of increased insulation in new manufactured homes. The manufacturers were interested in saving electricity in the homes, but would only deal with the utility sector as a whole. Half of the homes targeted were sited in investor-owned utility (IOU) service territories, and half in the public sector made up of utilities that purchased some or all of their electricity from the Bonneville Power Administration. Utilities agreed to acquire energy from manufacturers in the form of thermal efficiency measures specified by the Bonneville Power Administration. The program that resulted from over one year of negotiations was called the Manufactured Housing Acquisition Program, or MAP. Manufacturers, the utilities, State Energy Offices, the Northwest Power Planning Council and Bonneville all worked closely and with tenacity to build the program that went into effect on April 1, 1992, and should save the region between 7 and 9 megawatts, enough energy to supply 11,000 homes in the Northwest

  16. C. h. p. saves fuel in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Daugas, C F

    1979-04-01

    A combined heat and power plant based on a diesel generator to produce 12MW of electricity and 12MW of heat has successfully supplied the town of Skultuna in Sweden during the winter and has saved 3,700t of oil annually.

  17. Do You Automate? Saving Time and Dollars

    Science.gov (United States)

    Carmichael, Christine H.

    2010-01-01

    An automated workforce management strategy can help schools save jobs, improve the job satisfaction of teachers and staff, and free up precious budget dollars for investments in critical learning resources. Automated workforce management systems can help schools control labor costs, minimize compliance risk, and improve employee satisfaction.…

  18. Should Reformers Support Education Savings Accounts?

    Science.gov (United States)

    Ladner, Matthew; Smith, Nelson

    2016-01-01

    In this article, "Education Next" talks with Matthew Ladner and Nelson Smith on the topic of Education Savings Accounts (ESAs). ESAs apply the logic of school choice to the ever-expanding realm of education offerings. Rather than simply empowering families to select the school of their choice, ESAs provide families with most or all of…

  19. 78 FR 20097 - Energy Savings Performance Contracts

    Science.gov (United States)

    2013-04-03

    ... procedures, scope definition, Measurement and Verification (M&V), financing procurement, and definition of... government. More than $2.71 billion has been invested in Federal energy efficiency and renewable energy... more than $7.18 billion of cumulative energy cost savings for the Federal Government. While FEMP has...

  20. Advertising energy saving programs: The potential environmental cost of emphasizing monetary savings.

    Science.gov (United States)

    Schwartz, Daniel; Bruine de Bruin, Wändi; Fischhoff, Baruch; Lave, Lester

    2015-06-01

    Many consumers have monetary or environmental motivations for saving energy. Indeed, saving energy produces both monetary benefits, by reducing energy bills, and environmental benefits, by reducing carbon footprints. We examined how consumers' willingness and reasons to enroll in energy-savings programs are affected by whether advertisements emphasize monetary benefits, environmental benefits, or both. From a normative perspective, having 2 noteworthy kinds of benefit should not decrease a program's attractiveness. In contrast, psychological research suggests that adding external incentives to an intrinsically motivating task may backfire. To date, however, it remains unclear whether this is the case when both extrinsic and intrinsic motivations are inherent to the task, as with energy savings, and whether removing explicit mention of extrinsic motivation will reduce its importance. We found that emphasizing a program's monetary benefits reduced participants' willingness to enroll. In addition, participants' explanations about enrollment revealed less attention to environmental concerns when programs emphasized monetary savings, even when environmental savings were also emphasized. We found equal attention to monetary motivations in all conditions, revealing an asymmetric attention to monetary and environmental motives. These results also provide practical guidance regarding the positioning of energy-saving programs: emphasize intrinsic benefits; the extrinsic ones may speak for themselves. (c) 2015 APA, all rights reserved.

  1. Evaluating the benefits of digital pathology implementation: Time savings in laboratory logistics.

    Science.gov (United States)

    Baidoshvili, Alexi; Bucur, Anca; van Leeuwen, Jasper; van der Laak, Jeroen; Kluin, Philip; van Diest, Paul J

    2018-06-20

    The benefits of digital pathology for workflow improvement, and thereby cost savings in pathology, at least partly outweighing investment costs, are increasingly recognized. Successful implementations in a variety of scenarios are starting to demonstrate the cost benefits of digital pathology for both research and routine diagnostics, contributing to a sound business case encouraging further adoption. To further support new adopters, there is still a need for detailed assessment of the impact this technology has on the relevant pathology workflows, with emphasis on time saving. To assess the impact of digital pathology adoption on logistic laboratory tasks (i.e. not including pathologists' time for diagnosis making) in LabPON, a large regional pathology laboratory in The Netherlands, we quantified the benefits of digitization by analyzing the differences between the traditional analog and new digital workflows, carrying out detailed measurements of all relevant steps in key analog and digital processes, and comparing the time spent. We modeled and assessed the logistic savings in five workflows: (1) routine diagnosis, (2) multi-disciplinary meeting, (3) external revision requests, (4) extra stainings and (5) external consultation. On average, over 19 working hours were saved on a typical day by working digitally, with the highest savings in the routine diagnosis and multi-disciplinary meeting workflows. By working digitally, a significant amount of time could be saved in a large regional pathology lab with a typical case mix. We also present the data in each workflow per task and concrete logistic steps to allow extrapolation to the context and case mix of other laboratories. This article is protected by copyright. All rights reserved.
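
    To extrapolate such results to another laboratory's case mix, the per-case time saving in each workflow can be scaled by local daily volumes; a minimal Python sketch with invented numbers (not the paper's measurements):

        # All per-case savings and volumes below are hypothetical placeholders.
        minutes_saved_per_case = {
            "routine": 4.0,
            "multidisciplinary_meeting": 6.0,
            "external_revision": 3.0,
            "extra_stainings": 2.0,
            "external_consultation": 5.0,
        }
        daily_cases = {"routine": 150, "multidisciplinary_meeting": 40,
                       "external_revision": 10, "extra_stainings": 60,
                       "external_consultation": 8}
        total_min = sum(minutes_saved_per_case[w] * daily_cases[w] for w in daily_cases)
        print(f"{total_min / 60:.1f} working hours saved per day")   # -> 17.2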

  2. Linear step drive

    International Nuclear Information System (INIS)

    Haniger, L.; Elger, R.; Kocandrle, L.; Zdebor, J.

    1986-01-01

    A linear step drive is described, developed in Czechoslovak-Soviet cooperation and intended for driving WWER-1000 control rods. The functional principle of the motor is explained, and the mechanical and electrical parts of the drive, the power control, and the position indicator are described. The motor has latches situated in the reactor at a distance of 3 m from the magnetic armatures, and it has a low structural height above the reactor cover, which suggests its suitability for seismic localities. Its magnetic circuits use counterpoles; the mechanical shocks at the completion of each step are damped using special design features. The position indicator is of a special design and evaluates motor position within ±1% of total travel. A drive diagram and the flow chart of both the control electronics and the position indicator are presented. (author) 4 figs

  3. Computational Abstraction Steps

    DEFF Research Database (Denmark)

    Thomsen, Lone Leth; Thomsen, Bent; Nørmark, Kurt

    2010-01-01

    and class instantiations. Our teaching experience shows that many novice programmers find it difficult to write programs with abstractions that materialise to concrete objects later in the development process. The contribution of this paper is the idea of initiating a programming process by creating...... or capturing concrete values, objects, or actions. As the next step, some of these are lifted to a higher level by computational means. In the object-oriented paradigm the target of such steps is classes. We hypothesise that the proposed approach primarily will be beneficial to novice programmers or during...... the exploratory phase of a program development process. In some specific niches it is also expected that our approach will benefit professional programmers....
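
    A minimal illustration of one such abstraction step, written in Python (the paper itself is language-neutral): concrete values are captured first and later lifted to a class.

        # Step 1: the novice starts from concrete objects...
        point_a = {"x": 0.0, "y": 0.0}
        point_b = {"x": 3.0, "y": 4.0}

        # Step 2: ...and later lifts the recurring shape to a class abstraction.
        class Point:
            def __init__(self, x: float, y: float):
                self.x, self.y = x, y

            def distance_to(self, other: "Point") -> float:
                return ((self.x - other.x) ** 2 + (self.y - other.y) ** 2) ** 0.5

        a, b = Point(**point_a), Point(**point_b)
        print(a.distance_to(b))   # -> 5.0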

  4. Step 3: Manage Your Diabetes

    Science.gov (United States)

    Feature: Type 2 Diabetes, Step 3: Manage Your Diabetes (Past Issues / Fall 2014). Related "Type 2 Diabetes" articles: Diabetes Is Serious But Manageable / Step 1: Learn About Diabetes / Step 2: Know Your ...

  5. Efficacy of New Measures Saving Time in Acute Stroke Management: A Quantified Analysis.

    Science.gov (United States)

    Iglesias Mohedano, Ana María; García Pastor, Andrés; Díaz Otero, Fernando; Vázquez Alen, Pilar; Vales Montero, Marta; Luque Buzo, Elisa; Redondo Ráfales, Nuria; Chavarria Cano, Beatriz; Fernández Bullido, Yolanda; Villanueva Osorio, Jose Antonio; Gil Núñez, Antonio

    2017-08-01

    Time to treatment remains the most important factor in acute ischemic stroke prognosis. We quantified the effect of new interventions reducing in-hospital delays in acute stroke management and assessed their repercussion on door-to-imaging (DTI), imaging-to-needle (ITN), and door-to-needle (DTN) times. Prospective registry of consecutive stroke patients who were candidates for reperfusion therapy attended in a tertiary care hospital from February 1 to December 31, 2014. A series of measures aimed at reducing in-hospital delays were implemented. We compared DTI, ITN, and DTN times between patients who underwent the interventions and those who did not. 231 patients were included. DTI time was lower when personal history was reviewed and tests were ordered before patient arrival (2.5 minutes saved, P = .016) and when no electrocardiogram was made (5.4 minutes saved). Other measures reduced ITN time significantly (14 and 12 minutes saved, respectively). Completing all steps resulted in the lowest DTI and ITN times (13 and 19 minutes, respectively). Every measure is an important part of a chain focused on saving time in acute stroke: the lowest DTI and ITN times were obtained when all steps were completed. Measures shortening ITN time produced a greater impact on DTN time reduction; therefore, ITN interventions should be considered a critical part of new protocols and guidelines. Copyright © 2017. Published by Elsevier Inc.

  6. Key issues in estimating energy and greenhouse gas savings of biofuels: challenges and perspectives

    Directory of Open Access Journals (Sweden)

    Dheeraj Rathore

    2016-06-01

    Full Text Available The increasing demand for biofuels has encouraged researchers and policy makers worldwide to find sustainable biofuel production systems in accordance with regional conditions and needs. The sustainability of a biofuel production system includes energy and greenhouse gas (GHG) savings along with environmental and social acceptability. Life cycle assessment (LCA) is an internationally recognized tool for determining the sustainability of biofuels. LCA includes goal and scope definition, life cycle inventory, life cycle impact assessment, and interpretation as its major steps. LCA results vary significantly if there is any variation in how these steps are performed. For instance, biofuel-producing feedstocks have different environmental values, which lead to different GHG emission savings and energy balances. Similarly, land use and land-use changes may cause biofuel sustainability to be overestimated. This study aims to examine various biofuel production systems for their GHG savings and energy balances relative to conventional fossil fuels, with the ambition of addressing the challenges and offering future directions for LCA-based biofuel studies. Environmental and social acceptability of biofuel production is the key factor in developing biofuel support policies. Higher GHG emission savings and a better energy balance can be achieved if biomass yield is high, and if ecologically sustainable or non-food biomass is converted into biofuel and used efficiently.
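
    The GHG saving such studies report is commonly the relative reduction against a fossil reference; a minimal Python sketch, where the comparator value and the biofuel inventory number are both assumptions:

        def ghg_saving(biofuel_gco2e_per_mj, fossil_gco2e_per_mj=94.0):
            """Fractional life-cycle emission saving vs. a fossil comparator.

            94 gCO2e/MJ is a typical fossil-fuel comparator value; both inputs
            here are illustrative assumptions, not figures from the article.
            """
            return 1.0 - biofuel_gco2e_per_mj / fossil_gco2e_per_mj

        print(f"{ghg_saving(33.0):.0%}")   # a 33 gCO2e/MJ biofuel -> ~65% saving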

  7. Using primary care electronic health record data for comparative effectiveness research : experience of data quality assessment and preprocessing in The Netherlands

    NARCIS (Netherlands)

    Huang, Yunyu; Voorham, Jaco; Haaijer-Ruskamp, Flora M.

    Aim: Details of data quality and how quality issues were solved have not been reported in published comparative effectiveness studies using electronic health record data. Methods: We developed a conceptual framework of data quality assessment and preprocessing and apply it to a study comparing

  8. Computational Testing for Automated Preprocessing 2: Practical Demonstration of a System for Scientific Data-Processing Workflow Management for High-Volume EEG.

    Science.gov (United States)

    Cowley, Benjamin U; Korpela, Jussi

    2018-01-01

    Existing tools for the preprocessing of EEG data provide a large choice of methods to suitably prepare and analyse a given dataset. Yet it remains a challenge for the average user to integrate methods for batch processing of the increasingly large datasets of modern research, and compare methods to choose an optimal approach across the many possible parameter configurations. Additionally, many tools still require a high degree of manual decision making for, e.g., the classification of artifacts in channels, epochs or segments. This introduces extra subjectivity, is slow, and is not reproducible. Batching and well-designed automation can help to regularize EEG preprocessing, and thus reduce human effort, subjectivity, and consequent error. The Computational Testing for Automated Preprocessing (CTAP) toolbox facilitates: (i) batch processing that is easy for experts and novices alike; (ii) testing and comparison of preprocessing methods. Here we demonstrate the application of CTAP to high-resolution EEG data in three modes of use. First, a linear processing pipeline with mostly default parameters illustrates ease-of-use for naive users. Second, a branching pipeline illustrates CTAP's support for comparison of competing methods. Third, a pipeline with built-in parameter-sweeping illustrates CTAP's capability to support data-driven method parameterization. CTAP extends the existing functions and data structure from the well-known EEGLAB toolbox, based on Matlab, and produces extensive quality control outputs. CTAP is available under MIT open-source licence from https://github.com/bwrc/ctap.
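
    CTAP itself is Matlab/EEGLAB-based; as a language-neutral illustration of the branching-pipeline idea it supports, here is a hedged Python mock of one shared trunk with two competing artifact-rejection branches. None of this is CTAP's actual API; all step names are placeholders.

        from typing import Callable, Dict, List

        Step = Callable[[dict], dict]

        def run_pipeline(data: dict, steps: List[Step]) -> dict:
            """Apply preprocessing steps in order, threading the data through."""
            for step in steps:
                data = step(data)
            return data

        def filter_band(data):          # placeholder shared trunk step
            return {**data, "filtered": True}

        def reject_ica(data):           # competing artifact-rejection method A
            return {**data, "artifact_method": "ICA"}

        def reject_amplitude(data):     # competing artifact-rejection method B
            return {**data, "artifact_method": "amplitude-threshold"}

        # One shared trunk, two branches, to compare methods on the same data.
        branches: Dict[str, List[Step]] = {
            "branch_ica": [filter_band, reject_ica],
            "branch_amp": [filter_band, reject_amplitude],
        }
        results = {name: run_pipeline({"raw": "eeg"}, steps)
                   for name, steps in branches.items()}
        print(results)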

  9. Energy saving and cost saving cooling; Energie und Kosten sparende Kuehlung

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, Klaus W. [Architektur- und Fachpressebuero Klaus W. Koenig, Ueberlingen (Germany)

    2012-07-01

    For cost reduction, energy conservation and resource savings, rainwater is an ideal medium, offering several advantages over cooling with drinking water. No fees are incurred for drinking water or for the drainage of rainwater. Rainwater does not need to be softened, so further operating costs for the treatment and drainage of waste water are saved. Avoiding the associated material flows and energy demand is practical environmental and climate protection.

  10. Stepping Stones through Time

    Directory of Open Access Journals (Sweden)

    Emily Lyle

    2012-03-01

    Full Text Available Indo-European mythology is known only through written records but it needs to be understood in terms of the preliterate oral-cultural context in which it was rooted. It is proposed that this world was conceptually organized through a memory-capsule consisting of the current generation and the three before it, and that there was a system of alternate generations with each generation taking a step into the future under the leadership of a white or red king.

  11. SYSTEMATIZATION OF THE BASIC STEPS OF THE STEP-AEROBICS

    Directory of Open Access Journals (Sweden)

    Darinka Korovljev

    2011-03-01

    Full Text Available Following the development of the powerful sport industry, a lot of new opportunities have appeared for creating new programmes of exercising with certain requisites. One such programme is certainly step-aerobics. Step-aerobics can be defined as a type of aerobics consisting of the basic aerobic steps (basic steps) applied in exercising on a stepper (step bench), with a possibility to regulate its height. Step-aerobics itself can be divided into several groups, depending on the type of music, the working methods and the adopted knowledge of the attendants. In this work, a systematization of the basic steps of step-aerobics was made on the basis of the following criteria: origin of the steps, number of leg motions in stepping, and body support at the end of the step. Systematization of the basic steps of step-aerobics is quite significant for providing a concrete review of the existing basic steps, thus making the creation of a step-aerobics lesson easier

  12. Saved Leave Scheme (SLS) : Simplified procedure for the transfer of leave to saved leave accounts

    CERN Multimedia

    HR Division

    2001-01-01

    As part of the process of streamlining procedures, the HR and AS Divisions have jointly developed a system whereby annual and compensatory leave will henceforth be automatically transferred1) to saved leave accounts. Under the provisions of the voluntary saved leave scheme (SLS), a maximum total of 10 days'2) annual and compensatory leave (excluding saved leave accumulated in accordance with the provisions of Administrative Circular No. 22 B) can be transferred to the saved leave account at the end of the leave year (30 September). Previously, every person taking part in the scheme was individually issued with a form for the purposes of requesting the transfer of leave to the leave account, and the transfer then had to be done manually by HR Division. To streamline the procedure, the unused leave of all those taking part in the saved leave scheme at the closure of the leave-year accounts will henceforth be transferred automatically to the saved leave account on that date. This simplification is in the ...

  13. Energy Saving Glass Lamination via Selective Radio Frequency Heating

    Energy Technology Data Exchange (ETDEWEB)

    Shawn M. Allan; Patricia M. Strickland; Holly S. Shulman

    2009-11-11

    Ceralink Inc. developed FastFuse™, a rapid, new, energy saving process for lamination of glass and composites using radio frequency (RF) heating technology. The Inventions and Innovations program supported the technical and commercial research and development needed to elevate the innovation from bench scale to a self-supporting technology with significant potential for growth. The attached report provides an overview of the technical and commercial progress achieved for FastFuse™ during the course of the project. FastFuse™ has the potential to revolutionize the laminate manufacturing industries by replacing energy intensive, multi-step processes with an energy efficient, single-step process that allows higher throughput. FastFuse™ transmits RF energy directly into the interlayer to generate heat, eliminating the need to directly heat glass layers and the surrounding enclosures, such as autoclaves or vacuum systems. FastFuse™ offers lower start-up and energy costs (up to 90% or more reduction in energy costs) and faster cycle times (less than 5 minutes). FastFuse™ is compatible with EVA, TPU, and PVB interlayers, and has been demonstrated for glass, plastics, and multi-material structures such as photovoltaics and transparent armor.

  14. Promoting cooperative federalism through state shared savings.

    Science.gov (United States)

    Weil, Alan

    2013-08-01

    The Affordable Care Act is transforming American federalism and creating strain between the states and the federal government. By expanding the scale of intergovernmental health programs, creating new state requirements, and setting the stage for increased federal fiscal oversight, the act has disturbed an uneasy truce in American federalism. This article outlines a policy proposal designed to harness cooperative federalism, based on the shared state and federal desire to control health care cost growth. The proposal, which borrows features of the Medicare Shared Savings Program, would provide states with an incentive in the form of an increased share of the savings they generate in programs that have federal financial participation, as long as they meet defined performance standards.

  15. Energy saving certificates: an improved instrument

    International Nuclear Information System (INIS)

    2016-02-01

    This report first presents Energy Saving Certificates as one of among other instruments aimed at reducing energy consumption, and indicates how the French consumer is concerned. The benefits of this instrument are outlined: low cost, autonomy, awareness-raising, quantitative assessment of achieved energy savings. Its objectives and results since its creation in 2006 are commented, and the report outlines that this type of instrument is spreading over Europe. The authors show that its efficiency has been improved along the years due to a periodic review of standardised operation sheets, a simplification of the declaration, and an optimization of related programmes. Besides, targets have been better identified. The report outlines that assessment and controls must however be strengthened in order to reduce financial risks and potential drifts. Answers to this report by the concerned minister and ADEME are provided

  16. About instrumental innovation. Energy saving certificates

    International Nuclear Information System (INIS)

    Baiz, Adam; Monnoyer-Smith, Laurence

    2016-09-01

    Energy saving certificates (in French, CEE for certificats d'economie d'energie) were introduced in 2006 and have proven to be a rather efficient tool of public policy, at low cost and with good social acceptability, notably thanks to their hybrid nature (incentive and coercive) depending on the targeted actors. This document therefore addresses and discusses the interest of such instruments for public environmental policy. It notes that instruments tend to become less and less coercive, and then comments on the example of these energy saving certificates: a brief presentation, a discussion and explanation of their efficiency, the multi-dimensional nature of the instrument, and the benefits of more or less coercive instruments depending on the actors involved (State, households, energy providers)

  17. Energy saving in refineries and petrochemical complexes

    Energy Technology Data Exchange (ETDEWEB)

    Verde, L

    1975-01-01

    Measures applicable in the design of refineries and petrochemical complexes to achieve energy savings were investigated. The study was not limited to single process units; on the contrary, attention was mainly directed to identifying the interrelations between different units and emphasizing possible integrations. In particular, the optimization of the pressure levels and the number of utility networks for steam distribution inside plant facilities was considered, in order to maximize heat recovery in the process units and electric power production in the central steam-power generation plant. A computer program of general application, based on profitability evaluation at various fuel oil prices and different project configurations, was developed for these purposes. The general measures applicable within certain limits are then briefly examined. The task of the process engineer is discussed in the perspective of the ''energy saving'' goal.

  18. Energy saving in industrial varnishing techniques

    International Nuclear Information System (INIS)

    Kirst, W.

    1978-01-01

    The search for more effective varnishing techniques and better varnish surfaces, together with increasing consideration of environmental protection and the conservation of energy and raw materials, has helped to promote electron beam hardening. The development of high-solid varnishes has also brought about the following improvements: better quality of the varnish surface, the possible saving of one layer in multilayer coatings, reduced emissions in the waste air of the spray booth, and conservation of valuable raw materials and energy. (orig.) [de]

  19. Communication Can Save Lives PSA (:60)

    Centers for Disease Control (CDC) Podcasts

    2015-08-04

    This 60 second public service announcement (PSA) is based on the August 2015 CDC Vital Signs report. Antibiotic-resistant germs cause at least 23,000 deaths each year. Learn how public health authorities and health care facilities can work together to save lives.  Created: 8/4/2015 by National Center for Emerging and Zoonotic Infectious Diseases (NCEZID).   Date Released: 8/4/2015.

  20. Structured packing: an opportunity for energy savings

    International Nuclear Information System (INIS)

    Chavez T, R.H.; Guadarrama G, J.J.

    1996-01-01

    This work emphasizes the advantages of using structured packing. The geometry of this type of packing reduces processing time, yielding energy savings and lowering production costs in several industries, such as heavy water production plants, the petrochemical industry, and all industries involved in separation processes. Comparative results of energy consumption using structured versus Raschig packings are presented. (Author)

  1. Energy Savings Potential of Radiative Cooling Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Fernandez, Nicholas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wang, Weimin [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Alvine, Kyle J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Katipamula, Srinivas [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-30

    Pacific Northwest National Laboratory (PNNL), with funding from the U.S. Department of Energy’s (DOE’s) Building Technologies Program (BTP), conducted a study to estimate, through simulation, the potential cooling energy savings that could be achieved through novel approaches to capturing free radiative cooling in buildings, particularly photonic ‘selective emittance’ materials. This report documents the results of that study.

  2. When financiers are concerned with energy saving

    International Nuclear Information System (INIS)

    Chauveau, J.

    2005-01-01

    Innovative financial systems make it possible to finance investments for the energy efficiency improvement of public or residential buildings. Such solutions are implemented in Belgium and Germany. They are based on an association between a financial company, an energy supplier who carries out an energy audit, and the building owner, who repays the investment with the savings made on the space heating and power consumption of the building. Short paper. (J.S.)

  3. A graphical method to evaluate spectral preprocessing in multivariate regression calibrations: example with Savitzky-Golay filters and partial least squares regression.

    Science.gov (United States)

    Delwiche, Stephen R; Reeves, James B

    2010-01-01

    In multivariate regression analysis of spectroscopy data, spectral preprocessing is often performed to reduce unwanted background information (offsets, sloped baselines) or to accentuate absorption features in intrinsically overlapping bands. These procedures, also known as pretreatments, are commonly smoothing operations or derivatives. While such operations are often useful in reducing the number of latent variables of the actual decomposition and lowering residual error, they also run the risk of misleading the practitioner into accepting calibration equations that are poorly adapted to samples outside of the calibration. The current study developed a graphical method to examine this effect on partial least squares (PLS) regression calibrations of near-infrared (NIR) reflection spectra of ground wheat meal with two analytes, protein content and sodium dodecyl sulfate sedimentation (SDS) volume (an indicator of the quantity of the gluten proteins that contribute to strong doughs). These two properties were chosen because of their differing abilities to be modeled by NIR spectroscopy: excellent for protein content, fair for SDS sedimentation volume. To further demonstrate the potential pitfalls of preprocessing, an artificial component, a randomly generated value, was included in PLS regression trials. Savitzky-Golay (digital filter) smoothing, first-derivative, and second-derivative preprocess functions (5 to 25 centrally symmetric convolution points, derived from quadratic polynomials) were applied to PLS calibrations of 1 to 15 factors. The results demonstrated the danger of an over-reliance on preprocessing when (1) the number of samples used in a multivariate calibration is low (<50), (2) the spectral response of the analyte is weak, and (3) the goodness of the calibration is based on the coefficient of determination (R^2) rather than a term based on residual error. The graphical method has application to the evaluation of other preprocess functions and various
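
    As a concrete illustration of the grid the abstract describes (Savitzky-Golay smoothing and derivatives over 5 to 25 convolution points from quadratic polynomials, crossed with PLS models of 1 to 15 factors), the following sketch uses scipy and scikit-learn on synthetic stand-in data; the wheat spectra, the graphical method itself, and the study's exact cross-validation scheme are not reproduced here.

```python
# Minimal sketch: sweep Savitzky-Golay preprocessing against PLS factor count,
# scoring each combination by a cross-validated residual error (RMSECV-like).
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 200))   # stand-in spectra: 60 samples x 200 wavelengths
y = rng.normal(size=60)          # stand-in analyte (e.g., protein content)

results = {}
for deriv in (0, 1, 2):                      # smoothing, 1st and 2nd derivative
    for window in range(5, 26, 2):           # centrally symmetric convolution points
        Xp = savgol_filter(X, window_length=window, polyorder=2, deriv=deriv, axis=1)
        for n_factors in range(1, 16):
            pls = PLSRegression(n_components=n_factors)
            y_cv = cross_val_predict(pls, Xp, y, cv=5).ravel()
            results[(deriv, window, n_factors)] = np.sqrt(np.mean((y - y_cv) ** 2))

best = min(results, key=results.get)
print("lowest cross-validated error at (deriv, window, factors):", best)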

  4. SPAR-H Step-by-Step Guidance

    International Nuclear Information System (INIS)

    Galyean, W.J.; Whaley, A.M.; Kelly, D.L.; Boring, R.L.

    2011-01-01

    This guide provides step-by-step guidance on the use of the SPAR-H method for quantifying Human Failure Events (HFEs). This guide is intended to be used with the worksheets provided in: 'The SPAR-H Human Reliability Analysis Method,' NUREG/CR-6883, dated August 2005. Each step in the process of producing a Human Error Probability (HEP) is discussed. These steps are: Step-1, Categorizing the HFE as Diagnosis and/or Action; Step-2, Rate the Performance Shaping Factors; Step-3, Calculate PSF-Modified HEP; Step-4, Accounting for Dependence, and; Step-5, Minimum Value Cutoff. The discussions on dependence are extensive and include an appendix that describes insights obtained from the psychology literature.
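
    The arithmetic behind Steps 1-5 is compact enough to sketch. The following is a minimal illustration, assuming the standard SPAR-H nominal HEPs (0.01 for diagnosis, 0.001 for action), the published adjustment factor for three or more negative PSFs, and THERP-style dependence equations; the PSF multipliers in the example are illustrative, not values taken from the NUREG/CR-6883 worksheets.

```python
# Minimal sketch of the SPAR-H quantification steps described above.
NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}

def spar_h_hep(task_type, psf_multipliers):
    """Steps 1-3: nominal HEP times the composite PSF, with the adjustment
    factor applied when 3 or more PSFs are rated negative (multiplier > 1)."""
    nhep = NOMINAL_HEP[task_type]
    composite = 1.0
    negative = 0
    for m in psf_multipliers:
        composite *= m
        if m > 1:
            negative += 1
    if negative >= 3:   # keeps the adjusted HEP bounded below 1.0
        return nhep * composite / (nhep * (composite - 1.0) + 1.0)
    return nhep * composite

def apply_dependence(hep, level):
    """Step-4: conditional HEP given dependence on a preceding failure."""
    formulas = {
        "zero":     lambda p: p,
        "low":      lambda p: (1 + 19 * p) / 20,
        "moderate": lambda p: (1 + 6 * p) / 7,
        "high":     lambda p: (1 + p) / 2,
        "complete": lambda p: 1.0,
    }
    return formulas[level](hep)

# Example: an action HFE with two negative PSFs (e.g., stress x2, complexity x2).
hep = spar_h_hep("action", [2, 2, 1, 1, 1, 1, 1, 1])
print(apply_dependence(hep, "moderate"))   # conditional HEP under moderate dependence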

  5. SPAR-H Step-by-Step Guidance

    Energy Technology Data Exchange (ETDEWEB)

    W. J. Galyean; A. M. Whaley; D. L. Kelly; R. L. Boring

    2011-05-01

    This guide provides step-by-step guidance on the use of the SPAR-H method for quantifying Human Failure Events (HFEs). This guide is intended to be used with the worksheets provided in: 'The SPAR-H Human Reliability Analysis Method,' NUREG/CR-6883, dated August 2005. Each step in the process of producing a Human Error Probability (HEP) is discussed. These steps are: Step-1, Categorizing the HFE as Diagnosis and/or Action; Step-2, Rate the Performance Shaping Factors; Step-3, Calculate PSF-Modified HEP; Step-4, Accounting for Dependence, and; Step-5, Minimum Value Cutoff. The discussions on dependence are extensive and include an appendix that describes insights obtained from the psychology literature.

  6. Energy saving estimation on radiation process

    International Nuclear Information System (INIS)

    Kaneko, Hideaki; Maekawa, H.; Ito, Y.; Nishikawa, I.; Fujii, H.; Murata, K.

    1982-01-01

    When the quantity of paint used for industrial coating is assumed to be 420,000 tons, it is estimated that the area being coated is 2.8 billion m², that the petroleum required for pretreatment steam, drying and baking is 1.68 million tons, and that the required energy saving is 120,000 tons per year in terms of petroleum. The authors examined how the adoption of electron beam curing for surface coating contributes to energy saving. It has been said that electron beam curing consumes less energy than thermal or light curing, but the premises of that comparison have not been clear. The theoretical energy requirements for thermal curing, light curing and electron beam curing were therefore calculated and compared, and a comparison of measured values was also performed. The amounts of energy required for thermal curing, UV light curing and electron beam curing stood in a ratio of roughly 100:10:1, and the corresponding energy costs in a ratio of 50:5:1. In spite of the large merit of electron beam curing, it has not spread as expected, because of the repayment cost of the facility and the cost of the inert gas required for the process. Electron beam curing brings about energy savings, but the overall cost must be examined case by case. (Kako, I.)

  7. About "Save the Children Committee (India)".

    Science.gov (United States)

    1996-01-01

    This article describes the activity of charitable committees providing education and shelter to orphans and homeless children in India. The "Save The Children Committee" of the All India Women's Conference began operations during the Bengal famine of 1943 by providing shelter to children who were homeless or did not know where their parents were. The Bengal Relief Committee also provided shelters, which later became Children's Homes operated by the Save The Children Committee. Funding support for the homes came from individual donors and organizations. The Bengal government provided Rs.25/month/child for 450 children. Children's homes were set up in Phola, Mymensingh, and Brahmanberia, in present-day Bangladesh, and in Bankura. The Committee took over homes in Mahishadal, Khagda, and Belbeni. After 1948, the Children's Homes in East Pakistan were transferred to India. In 1952, several Children's Committees merged. Funds were supplied by international organizations. Government support levels varied over time. Schools for orphans changed from an emphasis on self-reliance and work to ordinary schooling. Brief descriptions are provided for homes at Pifa, Mangalgunge in Bongaon Subdivision, Thakurpukur in 24-Parganas, and Khagda in Midnapore district. For example, the home at Khagda was begun by the Bengal Relief Committee at the time of the famine of 1944. The Save The Children Committee took over its operations in 1946. It is now a home for 21 boys. The boys have access to a good high school, have achieved academically, and have earned the respect of the community.

  8. Energy saving potential in existing industrial compressors

    International Nuclear Information System (INIS)

    Vittorini, Diego; Cipollone, Roberto

    2016-01-01

    The compressed air sector accounts for a mean 10% of worldwide electricity consumption, which underscores its importance when energy saving and CO2 emissions reduction are in question. Since compressors alone account for 15% of overall industry electricity consumption, it appears vital to pay attention to machine performance. The paper presents an overview of present compressor technology and focuses on saving directions for screw and sliding-vane machines, according to data provided by the Compressed Air and Gas Institute and PNEUROP. Data were processed to obtain consistency with fixed reference pressures and organized as a function of the main operating parameters. Each sub-term contributing to the overall efficiency (adiabatic, volumetric, mechanical, electric, organic) was considered separately: the analysis showed that the thermodynamic improvement achievable by splitting the compression into two stages, each with a lower compression ratio, opens the way to significantly reduced specific energy consumption. - Highlights: • Compressor technology overview in industrial compressed air systems. • Market compressors efficiency baseline definition. • Energy breakdown and evaluation of main efficiency terms. • Assessment of air cooling-related energy saving potential. • Specific energy consumption reduction through dual-stage compression.

  9. Good practice in saving energy at school

    Science.gov (United States)

    Veronesi, Paola; Bonazzi, Enrico

    2014-05-01

    We teach students between 14 and 18 years old at a high school in Italy. In the first class, one of the topics we treat is related to the atmosphere. The students learn the composition of air, the importance of the natural greenhouse effect in maintaining the average temperature of the planet, and how human activity is increasing the level of greenhouse gases, enhancing the greenhouse effect and causing global warming. This knowledge can be reached using different materials and methods, such as schoolbooks, articles, websites or films, and individual or group work, but as students gradually become aware of the problem of climate change due to global warming, it is necessary to propose a solution that can be experienced and measured by the students themselves. This is the aim of the project "Switch off the light, to switch on the future". The project does not need special materials to be carried out, but all the people in the community who work and "live" at school should participate in it. The project deals directly with saving electric energy by changing habits in the use of electricity. Saving electric energy means saving CO2 emitted to the atmosphere, and consequently contributing to the reduction of greenhouse gas emissions. Normally, lights in the school are switched on in the early morning and switched off at the end of lessons. Since nobody is responsible for turning off the lights in the classrooms, students choose one or two "Light guardians" who are responsible for light management. Simple rules for light management are written and distributed in the classes so that the action of saving energy spreads all over the school. One class participates in the daily data collection from the electricity meter, before and after the beginning of the action. At the end of the year the data are processed and presented to the community, verifying whether electricity consumption has been cut down or not. This presentation is public, with students who directly introduce collected data, results and

  10. Hippocampus discovery First steps

    Directory of Open Access Journals (Sweden)

    Eliasz Engelhardt

    The first steps of the discovery, and the main discoverers, of the hippocampus are outlined. Arantius was the first to describe a structure he named "hippocampus" or "white silkworm". Despite numerous controversies and alternate designations, the term hippocampus has prevailed until this day as the most widely used term. Duvernoy provided an illustration of the hippocampus and surrounding structures, considered the first by most authors, which appeared more than a century and a half after Arantius' description. Some authors have identified other drawings and texts which they claim predate Duvernoy's depiction, in studies by Vesalius, Varolio, Willis, and Eustachio, albeit unconvincingly. Considering the definition of the hippocampal formation as comprising the hippocampus proper, dentate gyrus and subiculum, Arantius and Duvernoy apparently described the gross anatomy of this complex. The pioneering studies of Arantius and Duvernoy revealed a relatively small hidden formation that would become one of the most valued brain structures.

  11. The modern water-saving agricultural technology: Progress and focus

    African Journals Online (AJOL)

    GREGORY

    2010-09-13

    Sep 13, 2010 ... saving agricultural technology, which include modern biological water-saving technology, unconventional ... and innovation, water, nutrient migration theory, regula- .... urban sewage of more than 50%; Mexico City, 90% of.

  12. INFORMATION TECHNOLOGIES IN MANAGEMENT OF ENERGY SAVING PROJECTS

    Directory of Open Access Journals (Sweden)

    Дмитро Валерійович МАРГАСОВ

    2015-06-01

    The structure of information technology for energy saving projects is considered. A project management diagram for energy saving projects is developed, using GIS, ICS, BIM and other control and visualization systems.

  13. Dynamic analysis of savings and economic growth in Nigeria ...

    African Journals Online (AJOL)

    Dynamic analysis of savings and economic growth in Nigeria. ... a trivariate dynamic Granger causality model with savings, economic growth and foreign ... It is recommended that in the short run, policies in Nigeria should be geared towards ...

  14. Energy saving and consumption reducing evaluation of thermal power plant

    Science.gov (United States)

    Tan, Xiu; Han, Miaomiao

    2018-03-01

    At present, energy saving and consumption reduction require concrete measures in thermal power plants, and establishing an evaluation system for energy conservation and consumption reduction is instructive for the energy saving work of a thermal power plant as a whole. By analysing the existing evaluation systems, this paper points out that, in addition to the technical indicators of a power plant, market activities should also be introduced into the evaluation of energy saving and consumption reduction. Therefore, a new evaluation index of energy saving and consumption reduction is set up, and an example power plant is calculated in this paper. Results show that, after introducing the new evaluation index, the energy saving effect of the power plant can be judged more comprehensively, so as to better guide the work of energy saving and consumption reduction in power plants.

  15. Cost-saving production technologies and partial ownership

    OpenAIRE

    Juan Carlos Barcena-Ruiz; Norma Olaizola

    2007-01-01

    This work analyzes the incentives to acquire cost-saving production technologies when cross-participation exists at ownership level. We show that cross-participation reduces the incentives to adopt the cost-saving production technology.

  16. Chemical pre-processing of cluster galaxies over the past 10 billion years in the IllustrisTNG simulations

    Science.gov (United States)

    Gupta, Anshu; Yuan, Tiantian; Torrey, Paul; Vogelsberger, Mark; Martizzi, Davide; Tran, Kim-Vy H.; Kewley, Lisa J.; Marinacci, Federico; Nelson, Dylan; Pillepich, Annalisa; Hernquist, Lars; Genel, Shy; Springel, Volker

    2018-06-01

    We use the IllustrisTNG simulations to investigate the evolution of the mass-metallicity relation (MZR) for star-forming cluster galaxies as a function of the formation history of their cluster host. The simulations predict an enhancement in the gas-phase metallicities of star-forming cluster galaxies with stellar masses of order 10^9 M⊙ and above. This enhancement appears prior to their infall into the central cluster potential, indicating for the first time a systematic 'chemical pre-processing' signature for infalling cluster galaxies. Namely, galaxies that will fall into a cluster by z = 0 show a ~0.05 dex enhancement in the MZR compared to field galaxies at z ≤ 0.5. Based on the inflow rate of gas into cluster galaxies and its metallicity, we identify that the accretion of pre-enriched gas is the key driver of the chemical evolution of such galaxies, particularly around stellar masses of 10^9 M⊙, for galaxies falling into clusters. Our results motivate future observations looking for pre-enrichment signatures in dense environments.

  17. Forecasting of a ground-coupled heat pump performance using neural networks with statistical data weighting pre-processing

    Energy Technology Data Exchange (ETDEWEB)

    Esen, Hikmet; Esen, Mehmet [Department of Mechanical Education, Faculty of Technical Education, Firat University, 23119 Elazig (Turkey); Inalli, Mustafa [Department of Mechanical Engineering, Faculty of Engineering, Firat University, 23279 Elazig (Turkey); Sengur, Abdulkadir [Department of Electronic and Computer Science, Faculty of Technical Education, Firat University, 23119 Elazig (Turkey)

    2008-04-15

    The objective of this work is to improve the performance of an artificial neural network (ANN) with a statistical weighted pre-processing (SWP) method so that it learns to predict ground-coupled heat pump (GCHP) system performance with a minimum data set. Experimental studies were completed to obtain training and test data. Air temperatures entering/leaving the condenser unit, the water-antifreeze solution entering/leaving the horizontal ground heat exchangers, and ground temperatures (1 and 2 m) were used as the input layer, while the output is the coefficient of performance (COP) of the system. Statistical measures, such as the root-mean-square error (RMS), the coefficient of multiple determination (R^2) and the coefficient of variation (cov), are used to compare predicted and actual values for model validation. It is found that the RMS value is 0.074, the R^2 value is 0.9999 and the cov value is 2.22 for the SCG6 algorithm of the plain ANN structure, whereas the RMS value is 0.002, the R^2 value is 0.9999 and the cov value is 0.076 for the SCG6 algorithm of the SWP-ANN structure. The simulation results show that SWP-based networks can be used as an alternative in these systems. Therefore, instead of the limited experimental data found in the literature, faster and simpler solutions are obtained using hybridized structures such as SWP-ANN. (author)
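
    For readers wanting to reproduce the comparison, the three validation statistics quoted above (RMS, R^2, cov) can be computed as in the following sketch, which assumes their conventional definitions; the vectors shown are illustrative stand-ins for measured and ANN-predicted COP values.

```python
# Minimal sketch of the RMS / R^2 / cov validation statistics, assuming
# their usual definitions (cov expressed as a percentage of the mean prediction).
import numpy as np

def rms(y_true, y_pred):
    return np.sqrt(np.mean((np.asarray(y_pred) - np.asarray(y_true)) ** 2))

def r2(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def cov_percent(y_true, y_pred):
    # coefficient of variation: RMS error relative to the mean prediction, in %
    return 100.0 * rms(y_true, y_pred) / np.mean(y_pred)

y_true = np.array([3.1, 3.4, 3.0, 3.6])   # illustrative measured COPs
y_pred = np.array([3.0, 3.5, 3.1, 3.5])   # illustrative network outputs
print(rms(y_true, y_pred), r2(y_true, y_pred), cov_percent(y_true, y_pred))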

  18. [Influence of Spectral Pre-Processing on PLS Quantitative Model of Detecting Cu in Navel Orange by LIBS].

    Science.gov (United States)

    Li, Wen-bing; Yao, Lin-tao; Liu, Mu-hua; Huang, Lin; Yao, Ming-yin; Chen, Tian-bing; He, Xiu-wen; Yang, Ping; Hu, Hui-qin; Nie, Jiang-hui

    2015-05-01

    Cu in navel orange was detected rapidly by laser-induced breakdown spectroscopy (LIBS) combined with partial least squares (PLS) quantitative analysis, and the effect of different spectral data pretreatment methods on the detection accuracy of the model was explored. Spectral data for 52 Gannan navel orange samples were pretreated by different combinations of data smoothing, mean centering, and standard normal variate (SNV) transformation. The 319-338 nm wavelength section containing characteristic spectral lines of Cu was then selected to build PLS models, and the main evaluation indexes of the models, such as the regression coefficient (r), the root mean square error of cross validation (RMSECV) and the root mean square error of prediction (RMSEP), were compared and analyzed. The three indicators of the PLS model after 13-point smoothing and mean centering reached 0.9928, 3.43 and 3.4 respectively, and the average relative error of the prediction model was only 5.55%; in short, the calibration and prediction quality of this model was the best. The results show that by selecting the appropriate data pre-processing method, the prediction accuracy of PLS quantitative models of fruits and vegetables detected by LIBS can be improved effectively, providing a new method for fast and accurate detection of fruits and vegetables by LIBS.
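
    A sketch of the pretreatments named in the abstract (moving-average smoothing, mean centering, and the standard normal variate transform), assuming their usual definitions; the matrix is a synthetic stand-in for the 52-sample LIBS data, not the study's spectra.

```python
# Minimal sketch of three common spectral pretreatments for LIBS/PLS work.
import numpy as np

def smooth(spectra, window=13):
    """13-point moving-average smoothing along the wavelength axis."""
    kernel = np.ones(window) / window
    return np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, spectra)

def snv(spectra):
    """Standard normal variate: normalize each spectrum by its own mean and std."""
    return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

def mean_center(spectra):
    """Subtract the mean spectrum of the sample set from every spectrum."""
    return spectra - spectra.mean(axis=0, keepdims=True)

rng = np.random.default_rng(1)
X = rng.normal(loc=100.0, scale=5.0, size=(52, 256))   # stand-in for 52 LIBS spectra
X_best = mean_center(smooth(X, window=13))   # the best-performing chain in the abstract
X_snv = mean_center(snv(X))                  # an alternative pretreatment chain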

  19. A New Hybrid Model Based on Data Preprocessing and an Intelligent Optimization Algorithm for Electrical Power System Forecasting

    Directory of Open Access Journals (Sweden)

    Ping Jiang

    2015-01-01

    The establishment of an electrical power system can not only benefit the reasonable distribution and management of energy resources, but also satisfy the increasing demand for electricity. Electrical power system construction is often a pivotal part of national and regional economic development plans. This paper constructs a hybrid model, known as the E-MFA-BP model, that can forecast indices in the electrical power system, including wind speed, electrical load, and electricity price. First, ensemble empirical mode decomposition is applied to eliminate the noise of the original time series data. After data preprocessing, a back propagation neural network model is applied to carry out the forecasting. Owing to the instability of its structure, the modified firefly algorithm is employed to optimize the weight and threshold values of the back propagation network, yielding a hybrid model with higher forecasting quality. Three experiments are carried out to verify the effectiveness of the model. Through comparison with other traditional well-known forecasting models, and with models optimized by other optimization algorithms, the experimental results demonstrate that the hybrid model has the best forecasting performance.
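
    A minimal sketch of the first two stages of such a hybrid (decomposition-based denoising followed by a neural forecaster), assuming the PyEMD package for ensemble empirical mode decomposition and using scikit-learn's MLPRegressor as a stand-in back propagation network; the paper's modified firefly weight optimization is omitted here.

```python
# Minimal sketch: EEMD denoising followed by one-step-ahead neural forecasting.
import numpy as np
from PyEMD import EEMD
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 500)
series = np.sin(2 * np.pi * t) + 0.3 * rng.normal(size=t.size)  # stand-in wind-speed series

# Stage 1: decompose, then drop the first (noisiest) IMF to denoise the signal.
imfs = EEMD().eemd(series, t)
denoised = imfs[1:].sum(axis=0)

# Stage 2: one-step-ahead forecasting with a lag window of 8 observations.
lag = 8
X = np.array([denoised[i:i + lag] for i in range(len(denoised) - lag)])
y = denoised[lag:]
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
print("next value:", model.predict(denoised[-lag:].reshape(1, -1))[0])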

  20. Haralick texture features from apparent diffusion coefficient (ADC) MRI images depend on imaging and pre-processing parameters.

    Science.gov (United States)

    Brynolfsson, Patrik; Nilsson, David; Torheim, Turid; Asklund, Thomas; Karlsson, Camilla Thellenberg; Trygg, Johan; Nyholm, Tufve; Garpebring, Anders

    2017-06-22

    In recent years, texture analysis of medical images has become increasingly popular in studies investigating diagnosis, classification and treatment response assessment of cancerous disease. Despite numerous applications in oncology and medical imaging in general, there is no consensus regarding texture analysis workflow, or reporting of parameter settings crucial for replication of results. The aim of this study was to assess how sensitive Haralick texture features of apparent diffusion coefficient (ADC) MR images are to changes in five parameters related to image acquisition and pre-processing: noise, resolution, how the ADC map is constructed, the choice of quantization method, and the number of gray levels in the quantized image. We found that noise, resolution, choice of quantization method and the number of gray levels in the quantized images had a significant influence on most texture features, and that the effect size varied between different features. Different methods for constructing the ADC maps did not have an impact on any texture feature. Based on our results, we recommend using images with similar resolutions and noise levels, using one quantization method, and the same number of gray levels in all quantized images, to make meaningful comparisons of texture feature results between different subjects.
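
    The quantization sensitivity the authors report is straightforward to demonstrate. The sketch below, assuming scikit-image's GLCM utilities and a synthetic stand-in for an ADC map, shows how one Haralick-type feature (contrast) shifts as the number of gray levels changes.

```python
# Minimal sketch: uniform gray-level quantization + GLCM contrast at distance 1.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(3)
adc = rng.normal(loc=1.2e-3, scale=2e-4, size=(64, 64))  # stand-in ADC values

def haralick_contrast(image, n_gray_levels):
    """Quantize to n gray levels, then compute GLCM contrast."""
    edges = np.linspace(image.min(), image.max(), n_gray_levels + 1)
    quantized = np.clip(np.digitize(image, edges) - 1, 0, n_gray_levels - 1)
    glcm = graycomatrix(quantized.astype(np.uint8), distances=[1],
                        angles=[0], levels=n_gray_levels,
                        symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

# The same image yields very different feature values as gray levels change:
for levels in (8, 16, 32, 64):
    print(levels, haralick_contrast(adc, levels))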

  1. Astronomical sketching a step-by-step introduction

    CERN Document Server

    Handy, Richard; Perez, Jeremy; Rix, Erika; Robbins, Sol

    2007-01-01

    This book presents the amateur with fine examples of astronomical sketches and step-by-step tutorials in each medium, from pencil to computer graphics programs. This unique book can teach almost anyone to create beautiful sketches of celestial objects.

  2. Financial development and domestic savings in emerging Asian countries

    OpenAIRE

    Yılmaz BAYAR

    2014-01-01

    Saving is one of the important determinants of economic growth; therefore, the determinants of saving are indirectly important for sustainable economic growth. The financial sector has come into prominence as a possible determinant of saving in globalized financial markets. This study examines the relationship between gross domestic savings and financial development in the emerging Asian countries during the period 1992-2011 by using panel regression. We found that financial de...

  3. Entrepreneurial Saving Practices and Reinvestment : Theory and Evidence

    NARCIS (Netherlands)

    Beck, T.H.L.; Pamuk, H.; Uras, R.B.

    2014-01-01

    We use a novel enterprise survey from Tanzania to gauge the relationship between saving instruments and entrepreneurial reinvestment. While most informal savings practices do not imply a lower likelihood of entrepreneurial reinvestment when compared with formal savings practices, we find a

  4. Saving in Childhood and Adolescence: Insights from Developmental Psychology

    Science.gov (United States)

    Otto, Annette

    2013-01-01

    This paper addresses variables related to child and adolescent saving and explains the development of skills and behaviors that facilitate saving from an economic socialization perspective. References are made to the differences between the economic world of children, adolescents, and adults as well as to existing theories of saving. Children's…

  5. Casual relationship between gross domestic saving and economic ...

    African Journals Online (AJOL)

    The empirical study confirmed that a significant relationship between domestic savings and ... and, which ultimately increases the country domestic saving level. ... which increase saving and investment into the country due to its dual effect. ...

  6. Saving and Habit Formation : Evidence from Dutch Panel Data

    NARCIS (Netherlands)

    Alessie, R.J.M.; Teppa, F.

    2002-01-01

    This paper focuses on the role of habit formation in individual preferences over consumption and saving. We closely relate to Alessie and Lusardi's (1997) model as we estimate a model which is based on their closed-form solution, where saving is expressed as a function of lagged saving and other

  7. Causal Links among Saving, Investment and Growth and ...

    African Journals Online (AJOL)

    The relationship between saving, investment and GDP still remains an empirical issue. In their aspiration to catch up with the rest of the world, developing countries occupy a special place in this matter. This paper tried to investigate the main determinants of saving and the connection among saving, investment and GDP in the ...

  8. Entrepreneurial saving practices and reinvestment : Theory and evidence

    NARCIS (Netherlands)

    Beck, T.H.L.; Pamuk, Haki; Uras, Burak

    2017-01-01

    We use a novel enterprise survey to gauge the relationship between saving instruments and entrepreneurial reinvestment. We show that while most informal saving practices are not associated with a lower likelihood of reinvestment when compared with formal saving practices, there is a significantly

  9. 48 CFR 48.104-3 - Sharing collateral savings.

    Science.gov (United States)

    2010-10-01

    48 CFR 48.104-3, Sharing collateral savings (Federal Acquisition Regulations System; Contract Management; Value Engineering; Policies and Procedures): (a) The Government shares collateral savings with the contractor, unless the head of the contracting activity has...

  10. 48 CFR 2448.104-3 - Sharing collateral savings.

    Science.gov (United States)

    2010-10-01

    48 CFR 2448.104-3, Sharing collateral savings (Federal Acquisition Regulations System; Contract Management; Value Engineering): (a) The authority of the HCA to determine that the cost of calculating and tracking collateral savings will exceed the...

  11. The modern water-saving agricultural technology: Progress and focus

    African Journals Online (AJOL)

    GREGORY

    2010-09-13

    Sep 13, 2010 ... DEVELOPING TENDENCY OF MODERN WATER-SAVING AGRICULTURAL TECHNOLOGY. Excavation of the crop's own water-saving potential using biotechnology. The biological water-saving technology that uses crop physiology control and modern breeding techniques to increase production and water ...

  12. 76 FR 35085 - Savings and Loan Holding Company Application

    Science.gov (United States)

    2011-06-15

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Savings and Loan Holding Company... Proposal: Savings and Loan Holding Company Application. OMB Number: 1550-0015. Form Numbers: H-(e). Description... that no company, or any director or officer of a savings and loan holding company, or any individual...

  13. 76 FR 20459 - Savings and Loan Holding Company Application

    Science.gov (United States)

    2011-04-12

    ... DEPARTMENT OF THE TREASURY Office of Thrift Supervision Savings and Loan Holding Company... concerning the following information collection. Title of Proposal: Savings and Loan Holding Company... officer of a savings and loan holding company, or any individual who owns, controls, or holds with power...

  14. The EU must triple its energy saving policy effect

    NARCIS (Netherlands)

    Wesselink, B.; Eichhammer, W.; Harmsen, R.

    2010-01-01

    The impact of EU energy savings policy must triple to achieve the bloc’s 2020 energy savings goal. But such efforts could get a much better foundation if European leaders set a binding energy consumption target, rather than the current indicative savings target. The evidence for such

  15. Canada Education Savings Program: Annual Statistical Review--2009

    Science.gov (United States)

    Human Resources and Skills Development Canada, 2009

    2009-01-01

    The Canada Education Savings Program is an initiative of the Government of Canada. As part of the Department of Human Resources and Skills Development, the program administers the Canada Education Savings Grant and the Canada Learning Bond. These two initiatives help Canadian families save for a child's post-secondary education in Registered…

  16. Canada Education Savings Program: Annual Statistical Review 2011

    Science.gov (United States)

    Human Resources and Skills Development Canada, 2011

    2011-01-01

    The Canada Education Savings Program has been an initiative of the Government of Canada since 1998. As part of the Department of Human Resources and Skills Development, the program administers the Canada Education Savings Grant and the Canada Learning Bond. These two initiatives help Canadian families save for a child's post-secondary education in…

  17. Canada Education Savings Program: Annual Statistical Review 2012

    Science.gov (United States)

    Human Resources and Skills Development Canada, 2012

    2012-01-01

    The Canada Education Savings Program (CESP) has been an initiative of the Government of Canada since 1998. As part of the Department of Human Resources and Skills Development Canada, the program administers the Canada Education Savings Grant (CESG) and the Canada Learning Bond (CLB). These two initiatives help Canadian families save for a child's…

  18. Building Savings and Investments Culture among Nigerians | Imegi ...

    African Journals Online (AJOL)

    It is therefore necessary to build a savings and investments culture among Nigerians. A review of the extant literature revealed that people save and invest for several reasons, among which are to enhance their standard of living, take advantage of rare business opportunities, and meet unforeseen circumstances. Savings can be ...

  19. Water saving through international trade of agricultural products

    NARCIS (Netherlands)

    Chapagain, Ashok; Hoekstra, Arjen Ysbert; Savenije, H.H.G.

    2006-01-01

    Many nations save domestic water resources by importing water-intensive products and exporting commodities that are less water intensive. National water saving through the import of a product can imply saving water at a global level if the flow is from sites with high to sites with low water

  20. STEP and fundamental physics

    Science.gov (United States)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-09-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels.
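
    In the standard notation (an assumption here, since the abstract does not spell it out), the quoted sensitivities refer to the Eotvos parameter comparing the accelerations a_1 and a_2 of two test masses falling in the same gravitational field:

```latex
\eta \;=\; 2\,\frac{\lvert a_1 - a_2 \rvert}{\lvert a_1 + a_2 \rvert},
\qquad
\eta_{\text{present}} \sim 2\times 10^{-13}
\;\longrightarrow\;
\eta_{\text{STEP}} \sim 1\times 10^{-18}.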

  1. STEP and fundamental physics

    International Nuclear Information System (INIS)

    Overduin, James; Everitt, Francis; Worden, Paul; Mester, John

    2012-01-01

    The Satellite Test of the Equivalence Principle (STEP) will advance experimental limits on violations of Einstein's equivalence principle from their present sensitivity of two parts in 10^13 to one part in 10^18 through multiple comparison of the motions of four pairs of test masses of different compositions in a drag-free earth-orbiting satellite. We describe the experiment, its current status and its potential implications for fundamental physics. Equivalence is at the heart of general relativity, our governing theory of gravity, and violations are expected in most attempts to unify this theory with the other fundamental interactions of physics, as well as in many theoretical explanations for the phenomenon of dark energy in cosmology. Detection of such a violation would be equivalent to the discovery of a new force of nature. A null result would be almost as profound, pushing upper limits on any coupling between standard-model fields and the new light degrees of freedom generically predicted by these theories down to unnaturally small levels. (paper)

  2. One-step microlithography

    Science.gov (United States)

    Kahlen, Franz-Josef; Sankaranarayanan, Srikanth; Kar, Aravinda

    1997-09-01

    The subject of this investigation is a one-step rapid machining process to create miniaturized 3D parts from the original sample material. An experimental setup in which metal powder is fed into the laser beam-material interaction region has been built. The powder is melted and forms planar 2D geometries as the substrate is moved under the laser beam in the XY-direction. After completing the geometry in the plane, the substrate is displaced in the Z-direction, and a new layer of material is placed on top of the just-completed deposit. By continuous repetition of this process, 3D parts were created. In particular, the impact of the focal spot size of the high power laser beam on the smallest achievable structures was investigated. At a translation speed of 51 mm/s, a minimum material thickness of 590 micrometers was achieved. It was also shown that a small Z-displacement has a negligible influence on the continuity of the material deposition over this power range. A high power CO2 laser was used as the energy source; the powder under investigation was stainless steel SS304L. Helium was used as shield gas at a flow rate of 15 l/min. The incident CO2 laser beam power was varied between 300 W and 400 W, with the laser beam intensity distributed in a donut mode. The laser beam was focused to a focal diameter of 600 μm.

  3. Step 1: Learn about Diabetes

    Science.gov (United States)

    Feature: Type 2 Diabetes. Step 1: Learn About Diabetes. ... the whole family healthy! Here are four key steps to help you control your diabetes and live ...

  4. Saving Money and Time with Virtual Server

    CERN Document Server

    Sanders, Chris

    2006-01-01

    Microsoft Virtual Server 2005 consistently proves to be worth its weight in gold, with new implementations thought up every day. With this product now a free download from Microsoft, scores of new users are able to experience what the power of virtualization can do for their networks. This guide is aimed at network administrators who are interested in ways that Virtual Server 2005 can be implemented in their organizations in order to save money and increase network productivity. It contains information on setting up a virtual network, virtual consolidation, virtual security, virtual honeypo

  5. Small scale models equal large scale savings

    International Nuclear Information System (INIS)

    Lee, R.; Segroves, R.

    1994-01-01

    A physical scale model of a reactor is a tool which can be used to reduce the time spent by workers in the containment during an outage and thus to reduce the radiation dose and save money. The model can be used for worker orientation, and for planning maintenance, modifications, manpower deployment and outage activities. Examples of the use of models are presented. These were for the La Salle 2 and Dresden 1 and 2 BWRs. In each case cost-effectiveness and exposure reduction due to the use of a scale model is demonstrated. (UK)

  6. Future cost savings from engineering innovations

    International Nuclear Information System (INIS)

    Roemer, R.E.; Foster, D.C.; Jacobs, S.B.

    1987-01-01

    Nuclear power plant design and operating experience in the 1970s and 1980s continues to provide feedback to the technology base. The lessons learned in these two decades, coupled with engineering innovation, will lead to improvements and cost-reductions in the plants of the 1990s. Two types of innovations related to piping are described: snubber reduction and pipe rupture elimination. A brief account of the industry experience is given for each, followed by an account of the technical methodology involved. A discussion of expected benefits, including cost savings of millions of dollars (U.S.), is provided. (author)

  7. Food industry hungry for energy savings

    Energy Technology Data Exchange (ETDEWEB)

    Blackburn, D

    1989-04-01

    The United Kingdom food and drink industry is a significant user of energy. Energy use figures are given showing the breakdown in terms of different sectors of the industry and also in terms of the fuel used. Four energy monitoring and target setting demonstration projects are outlined at factories typical of their type in different sectors. The projects have resulted in a much greater awareness by management in the factories involved of energy consumption and waste. Examples are given of improved energy efficiency and consequent energy savings which have resulted from this awareness. (U.K.).

  8. The energy saving manual; Das Energiesparbuch

    Energy Technology Data Exchange (ETDEWEB)

    Goetze, Monika; Pinn, Gudrun

    2009-07-01

    The constant increase in the cost of electric power, petroleum, gasoline and gas burdens the household budget. At the same time, greenhouse gases are speeding up global warming alarmingly. It is time to reconsider our style of living. This guide presents simple and practical hints to reduce energy costs and protect the climate. Exemplary calculations show how up to 1100 Euro can be saved per household member per annum. Subjects: identifying fields of excess energy consumption; no wasting of electricity, heat, and warm water; food, shopping and climate; mobility and climate protection; how to establish an individual energy conservation profile. (orig.)

  9. Saving Money or Spending Tomorrow's Money

    Institute of Scientific and Technical Information of China (English)

    罗芳梅

    2017-01-01

    Chinese people are commonly believed to be thrifty. However, economic development has had a tremendous impact upon Chinese society, uprooting long-engraved ideas. With the emergence of credit cards, spending tomorrow's money has become a reality. People thus face a dilemma: saving money or spending tomorrow's money. Firstly, this paper focuses on the benefits of spending tomorrow's money. Secondly, it shows that spending tomorrow's money is confronted with many challenges. Finally, the paper comes up with some suggestions to solve these problems.

  10. SPAR-H Step-by-Step Guidance

    Energy Technology Data Exchange (ETDEWEB)

    April M. Whaley; Dana L. Kelly; Ronald L. Boring; William J. Galyean

    2012-06-01

    Step-by-step guidance was developed recently at Idaho National Laboratory for the US Nuclear Regulatory Commission on the use of the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) method for quantifying Human Failure Events (HFEs). This work was done to address SPAR-H user needs, specifically requests for additional guidance on the proper application of various aspects of the methodology. This paper overviews the steps of the SPAR-H analysis process and highlights some of the most important insights gained during the development of the step-by-step directions. This supplemental guidance for analysts is applicable when plant-specific information is available, and goes beyond the general guidance provided in existing SPAR-H documentation. The steps highlighted in this paper are: Step-1, Categorizing the HFE as Diagnosis and/or Action; Step-2, Rate the Performance Shaping Factors; Step-3, Calculate PSF-Modified HEP; Step-4, Accounting for Dependence, and; Step-5, Minimum Value Cutoff.

  11. Does extending daylight saving time save energy? Evidence from an Australian experiment

    Energy Technology Data Exchange (ETDEWEB)

    Kellogg, R. [California Univ., Berkeley, CA (United States). Dept. of Agricultural and Resource Economics; Wolff, H. [California Univ., Berkeley, CA (United States). Dept. of Agricultural and Resource Economics]|[Forschungsinstitut zur Zukunft der Arbeit (IZA), Bonn (Germany)

    2007-03-15

    Several countries are considering extending Daylight Saving Time (DST) in order to conserve energy, and the U.S. will extend DST by one month beginning in 2007. However, projections that these extensions will reduce electricity consumption rely on extrapolations and simulations rather than empirical evidence. This paper, in contrast, examines a quasi-experiment in which parts of Australia extended DST in 2000 to facilitate the Sydney Olympics. Using detailed panel data and a triple differences specification, we show that the extension did not conserve electricity, and that a prominent simulation model overstates electricity savings when it is applied to Australia. (orig.)
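
    As a sketch of what a triple-differences estimate looks like in this setting (the paper's exact specification may differ), let T and C denote treated and control regions, P and P' the extended-DST weeks versus other weeks, Y and Y' the Olympic year versus earlier years, and q-bar the mean electricity consumption in each cell:

```latex
\hat{\beta}_{DDD}
= \Big[(\bar{q}^{\,T}_{P,Y}-\bar{q}^{\,T}_{P',Y})-(\bar{q}^{\,T}_{P,Y'}-\bar{q}^{\,T}_{P',Y'})\Big]
- \Big[(\bar{q}^{\,C}_{P,Y}-\bar{q}^{\,C}_{P',Y})-(\bar{q}^{\,C}_{P,Y'}-\bar{q}^{\,C}_{P',Y'})\Big]
```

    The second bracket nets out any region-independent seasonal shock in the Olympic year, so that only consumption changes specific to the DST-extension weeks in the treated regions remain.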

  12. Multiple stage miniature stepping motor

    International Nuclear Information System (INIS)

    Niven, W.A.; Shikany, S.D.; Shira, M.L.

    1981-01-01

    A stepping motor comprising a plurality of stages which may be selectively activated to effect stepping movement of the motor, and which are mounted along a common rotor shaft to achieve considerable reduction in motor size and minimum diameter, whereby sequential activation of the stages results in successive rotor steps with direction being determined by the particular activating sequence followed

  13. One-Step Dynamic Classifier Ensemble Model for Customer Value Segmentation with Missing Values

    Directory of Open Access Journals (Sweden)

    Jin Xiao

    2014-01-01

    Scientific customer value segmentation (CVS) is the basis of efficient customer relationship management; customer credit scoring, fraud detection, and churn prediction all belong to CVS. In real CVS, customer data usually include many missing values, which may greatly affect the performance of the CVS model. This study proposes a one-step dynamic classifier ensemble model for missing values (the ODCEM model). On the one hand, ODCEM integrates the preprocessing of missing values and the classification modeling into one step; on the other hand, it utilizes multiple-classifier ensemble technology in constructing the classification models. Empirical results on the credit scoring dataset "German" from UCI and the real customer churn prediction dataset "China churn" show that ODCEM outperforms four commonly used "two-step" models and the ensemble based model LMF, and can provide better decision support for market managers.
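
    A minimal sketch of the "one-step" idea (handling missing values and ensemble classification inside a single fitted object), using scikit-learn parts as stand-ins; ODCEM's dynamic classifier selection is not reproduced here, and the static voting ensemble below is only an analogue.

```python
# Minimal sketch: imputation fused with an ensemble classifier in one pipeline.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.impute import SimpleImputer
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
X = rng.normal(size=(300, 10))
X[rng.random(X.shape) < 0.15] = np.nan           # ~15% missing values
y = (rng.random(300) > 0.5).astype(int)          # stand-in churn labels

ensemble = VotingClassifier([
    ("lr", LogisticRegression(max_iter=1000)),
    ("dt", DecisionTreeClassifier(max_depth=5)),
    ("rf", RandomForestClassifier(n_estimators=50)),
], voting="soft")

model = make_pipeline(SimpleImputer(strategy="mean"), ensemble).fit(X, y)
print("training accuracy:", model.score(X, y))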

  14. Estimating the energy saving potential of telecom operators in China

    International Nuclear Information System (INIS)

    Yang, Tian-Jian; Zhang, Yue-Jun; Huang, Jin; Peng, Ruo-Hong

    2013-01-01

    A set of models is employed to estimate the total energy saving potential of production and the segmented energy savings of telecom operators in China. In the estimation, the total energy saving is divided into savings achieved by technology, derived from technical reform and technical progress, and savings achieved by management, derived from management control measures and marketing; the estimating methodologies for the energy saving potential of each segment are elaborated. Empirical results from China Mobile indicate that, first, technical advances in communications technology account for the largest proportion (70%-80%) of the total production energy savings in China's telecom sector. Second, technical reform brings about 20%-30% of the total energy saving. Third, the proportions of energy saving brought by marketing and control measures are relatively small, at less than 3%. Therefore, China's telecom operators should seize the opportunity of the revolution in communications network techniques in recent years to create an advanced network with lower energy consumption.

  15. Development of an Energy-Savings Calculation Methodology for Residential Miscellaneous Electric Loads: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Hendron, R.; Eastment, M.

    2006-08-01

    In order to meet whole-house energy savings targets beyond 50% in residential buildings, it will be essential that new technologies and systems approaches be developed to address miscellaneous electric loads (MELs). These MELs are comprised of the small and diverse collection of energy-consuming devices found in homes, including what are commonly known as plug loads (televisions, stereos, microwaves), along with all hard-wired loads that do not fit into other major end-use categories (doorbells, security systems, garage door openers). MELs present special challenges because their purchase and operation are largely under the control of the occupants. If no steps are taken to address MELs, they can constitute 40-50% of the remaining source energy use in homes that achieve 60-70% whole-house energy savings, and this percentage is likely to increase in the future as home electronics become even more sophisticated and their use becomes more widespread. Building America (BA), a U.S. Department of Energy research program that targets 50% energy savings by 2015 and 90% savings by 2025, has begun to identify and develop advanced solutions that can reduce MELs.

  16. Technical Support Document: Strategies for 50% Energy Savings in Large Office Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Leach, M.; Lobato, C.; Hirsch, A.; Pless, S.; Torcellini, P.

    2010-09-01

    This Technical Support Document (TSD) documents technical analysis that informs design guidance for designing and constructing large office buildings that achieve 50% net site energy savings over baseline buildings defined by minimal compliance with ANSI/ASHRAE/IESNA Standard 90.1-2004. This report also represents a step toward developing a methodology for using energy modeling in the design process to achieve aggressive energy savings targets. This report documents the modeling and analysis methods used to identify design recommendations for six climate zones that capture the range of U.S. climate variability; demonstrates how energy savings change between ASHRAE Standard 90.1-2007 and Standard 90.1-2004 to determine baseline energy use; uses a four-story 'low-rise' prototype to analyze the effect of building aspect ratio on energy use intensity; explores comparisons between baseline and low-energy building energy use for alternate energy metrics (net source energy, energy emissions, and energy cost); and examines the extent to which glass curtain construction limits achievable energy savings, using a 12-story 'high-rise' prototype.

  17. Defining a standard metric for electricity savings

    International Nuclear Information System (INIS)

    Koomey, Jonathan; Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve

    2010-01-01

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.

  18. Defining a standard metric for electricity savings

    Energy Technology Data Exchange (ETDEWEB)

    Koomey, Jonathan [Lawrence Berkeley National Laboratory and Stanford University, PO Box 20313, Oakland, CA 94620-0313 (United States); Akbari, Hashem; Blumstein, Carl; Brown, Marilyn; Brown, Richard; Calwell, Chris; Carter, Sheryl; Cavanagh, Ralph; Chang, Audrey; Claridge, David; Craig, Paul; Diamond, Rick; Eto, Joseph H; Fulkerson, William; Gadgil, Ashok; Geller, Howard; Goldemberg, Jose; Goldman, Chuck; Goldstein, David B; Greenberg, Steve, E-mail: JGKoomey@stanford.ed

    2010-01-15

    The growing investment by governments and electric utilities in energy efficiency programs highlights the need for simple tools to help assess and explain the size of the potential resource. One technique that is commonly used in this effort is to characterize electricity savings in terms of avoided power plants, because it is easier for people to visualize a power plant than it is to understand an abstraction such as billions of kilowatt-hours. Unfortunately, there is no standardization around the characteristics of such power plants. In this letter we define parameters for a standard avoided power plant that have physical meaning and intuitive plausibility, for use in back-of-the-envelope calculations. For the prototypical plant this article settles on a 500 MW existing coal plant operating at a 70% capacity factor with 7% T and D losses. Displacing such a plant for one year would save 3 billion kWh/year at the meter and reduce emissions by 3 million metric tons of CO2 per year. The proposed name for this metric is the Rosenfeld, in keeping with the tradition among scientists of naming units in honor of the person most responsible for the discovery and widespread adoption of the underlying scientific principle in question: Dr Arthur H Rosenfeld.
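
    A back-of-the-envelope check of the definition, using only the parameters stated in the letter:

```latex
% Annual generation of the prototypical plant:
500~\text{MW} \times 8760~\tfrac{\text{h}}{\text{yr}} \times 0.70
  \approx 3.07 \times 10^{9}~\text{kWh/yr}.
% Net of 7% transmission and distribution losses:
3.07 \times 10^{9}~\text{kWh/yr} \times (1 - 0.07)
  \approx 2.85 \times 10^{9}~\text{kWh/yr at the meter}.
```

    This is consistent with the quoted figure of roughly 3 billion kWh/year saved at the meter per Rosenfeld.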

  19. Electric power production contra electricity savings

    International Nuclear Information System (INIS)

    Schleisner, L.; Grohnheit, P.E.; Soerensen, H.

    1991-01-01

    The expansion of electricity-producing plants in Denmark has until now taken place in accordance with the demand for electricity. Recently, it has been suggested that the cost of further developing such systems is greater than the cost of instigating and carrying out energy conservation efforts. The aim of the project was to evaluate the consequences for power-producing plants of a reduction in the electricity consumption of end users. A method is presented for analysing the costs involved in the construction and operation of power plants versus the costs involved in saving electricity. In developing a model of this kind, consideration is given to the interplay between individual saving projects and the existing or future electricity supply. It can thus be evaluated to what extent it would be advisable to substitute investments in the development of power plant capacity with investments in the reduction of electricity consumption by end users. The model is described in considerable detail. It will be tested in representative situations and locations throughout the Nordic countries. (AB) 17 refs

  20. Energy saving screw compressor technology; Energiebesparende schroefcompressortechnologie

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, A. [RefComp, Lonigo (Italy); Neus, M. [Delta Technics Engineering, Breda (Netherlands)

    2011-03-15

    Smart solutions to reduce energy consumption are continuously under investigation in refrigeration technology. This article describes the technology by which energy can be saved in the operation of screw compressors used in air conditioners and refrigerating machinery. The combination of frequency control and Vi control (Vi being the intrinsic volumetric ratio), as researched in the RefComp laboratory, is attractive for the user because energy efficiency during part-load operation is much better. Smart use of thermodynamics, electrical engineering and electronic control forms the basis of these applications. According to the manufacturer's information, this new generation of screw compressors can save approximately 26% energy in comparison with standard screw compressors. [Dutch, translated] This article describes the technology with which considerable energy can be saved in screw compressors used in air-conditioning systems and refrigeration installations. The combination of frequency control and Vi control (Vi is the intrinsic volumetric ratio), as investigated in the RefComp laboratory, offers the user great advantages because the compressor's energy efficiency during part load is greatly improved. Smart use of thermodynamics, electrical engineering and electronics forms the basis of this application. According to the manufacturer, this new generation of screw compressors can save about 26 percent of energy use during part load compared with the standard series of screw compressors.