WorldWideScience

Sample records for preprocessing subsystem cps

  1. A Selective CPS Transformation

    DEFF Research Database (Denmark)

    Nielsen, Lasse Reichstein

    2001-01-01

    The CPS transformation makes all functions continuation-passing, uniformly. Not all functions, however, need continuations: they only do if their evaluation includes computational effects. In this paper we focus on control operations, in particular "call with current continuation" and "throw". We...... characterize this involvement as a control effect and we present a selective CPS transformation that makes functions and expressions continuation-passing if they have a control effect, and that leaves the rest of the program in direct style. We formalize this selective CPS transformation with an operational...
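
    The idea of a selective transformation can be illustrated in miniature. The sketch below (in Python, as a stand-in for the paper's source language; the function names are invented for illustration) shows a pure helper left in direct style while only the effect-threading code is continuation-passing.

```python
# A direct-style function and its continuation-passing counterpart.
# Under a *selective* CPS transformation, pure code stays in direct
# style; only code involved in control effects takes a continuation.
# Illustrative sketch, not the paper's formal transformation.

def square(n):                 # pure: no control effect, stays direct-style
    return n * n

def add_cps(a, b, k):          # CPS: the continuation k receives the result
    return k(a + b)

def sum_of_squares_cps(a, b, k):
    # The pure calls to square are left in direct style (selective CPS);
    # only the sequencing goes through the continuation.
    return add_cps(square(a), square(b), k)

print(sum_of_squares_cps(3, 4, lambda v: v))  # 25
```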

  3. CHIP Reporting in the CPS

    Data.gov (United States)

    U.S. Department of Health & Human Services — CHIP reporting in the CPS is unreliable. Only 10 to 30 percent of those with CHIP (but not Medicaid) report this type of coverage in the CPS. Many with CHIP report...

  4. Subsystem ETH

    CERN Document Server

    Dymarsky, Anatoly; Liu, Hong

    2016-01-01

    Motivated by the qualitative picture of Canonical Typicality, we propose a refined formulation of the Eigenstate Thermalization Hypothesis (ETH) for chaotic quantum systems. The new formulation, which we refer to as subsystem ETH, is in terms of the reduced density matrix of subsystems. This strong form of ETH clarifies which set of observables defined within the subsystem will thermalize. We discuss the limits when the size of the subsystem is small or comparable to its complement. Finally, we provide numerical evidence for the proposal in the case of a one-dimensional Ising spin chain.

  5. LANDSAT data preprocessing

    Science.gov (United States)

    Austin, W. W.

    1983-01-01

    The effect on LANDSAT data of a Sun angle correction, an intersatellite LANDSAT-2 and LANDSAT-3 data range adjustment, and the atmospheric correction algorithm was evaluated. Fourteen 1978 crop year LACIE sites were used as the site data set. The preprocessing techniques were applied to multispectral scanner channel data, and the transformed data were plotted and used to analyze the effectiveness of the preprocessing techniques. Ratio transformations effectively reduce the need for preprocessing techniques to be applied directly to the data. Subtractive transformations are more sensitive to Sun angle and atmospheric corrections than ratios. Preprocessing techniques, other than those applied at the Goddard Space Flight Center, should be applied only at the user's option. While performed on LANDSAT data, the study results are also applicable to meteorological satellite data.
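
    A common form of the Sun-angle correction evaluated above normalizes each scanner value by the cosine of the solar zenith angle. The sketch below (function names are invented for illustration; operational corrections also handle sensor gains, offsets, and Earth-Sun distance) shows that correction, plus the ratio transformation that largely cancels such multiplicative illumination factors.

```python
import math

def sun_angle_correct(dn, solar_elevation_deg):
    """Normalize a scanner value for illumination geometry by dividing
    by the cosine of the solar zenith angle (90 deg minus elevation).
    Simplified sketch of the idea only."""
    zenith = math.radians(90.0 - solar_elevation_deg)
    return dn / math.cos(zenith)

def band_ratio(band1, band2):
    # Ratio transformations cancel multiplicative illumination factors,
    # which is why they reduce the need for explicit Sun-angle correction.
    return band1 / band2

# With the Sun at 30 deg elevation, cos(zenith) = 0.5, so values double.
print(sun_angle_correct(50.0, 30.0))  # ~100.0
```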

  6. CPS Transformation of Beta-Redexes

    DEFF Research Database (Denmark)

    Danvy, Olivier; Nielsen, Lasse

    2005-01-01

    The extra compaction of the most compacting CPS transformation in existence, which is due to Sabry and Felleisen, is generally attributed to (1) making continuations occur first in CPS terms and (2) classifying more redexes as administrative. We show that this extra compaction is actually...... independent of the relative positions of values and continuations and furthermore that it is solely due to a context-sensitive transformation of beta-redexes. We stage the more compact CPS transformation into a first-order uncurrying phase and a context-insensitive CPS transformation. We also define a context......-insensitive CPS transformation that provides the extra compaction. This CPS transformation operates in one pass and is dependently typed....

  7. CPS Transformation of Beta-Redexes

    DEFF Research Database (Denmark)

    Danvy, Olivier; Nielsen, Lasse R.

    2000-01-01

    The extra compaction of the most compacting CPS transformation in existence, which is due to Sabry and Felleisen, is generally attributed to (1) making continuations occur first in CPS terms and (2) classifying more redexes as administrative. We show that this extra compaction is actually...... independent of the relative positions of values and continuations and furthermore that it is solely due to a context-sensitive transformation of beta-redexes. We stage the more compact CPS transformation into a first-order uncurrying phase and a context-insensitive CPS transformation. We also define a context......-insensitive CPS transformation that provides the extra compaction. This CPS transformation operates in one pass and is dependently typed....

  9. Acquisition and preprocessing of LANDSAT data

    Science.gov (United States)

    Horn, T. N.; Brown, L. E.; Anonsen, W. H. (Principal Investigator)

    1979-01-01

    The original configuration of the GSFC data acquisition, preprocessing, and transmission subsystem, designed to provide LANDSAT data inputs to the LACIE system at JSC, is described. Enhancements made to support LANDSAT -2, and modifications for LANDSAT -3 are discussed. Registration performance throughout the 3 year period of LACIE operations satisfied the 1 pixel root-mean-square requirements established in 1974, with more than two of every three attempts at data registration proving successful, notwithstanding cosmetic faults or content inadequacies to which the process is inherently susceptible. The cloud/snow rejection rate experienced throughout the last 3 years has approached 50%, as expected in most LANDSAT data use situations.

  10. An Operational Investigation of the CPS Hierarchy

    DEFF Research Database (Denmark)

    Danvy, Olivier; Yang, Zhe

    1998-01-01

    We explore the hierarchy of control induced by successive transformations into continuation-passing style (CPS) in the presence of “control delimiters ” and “composable continuations ”. Specifically, we investigate the structural operational semantics associated with the CPS hierarchy. To this end......, we characterize an operational notion of continuation semantics. We relate it to the traditional CPS transformation and we use it to account for the control operator shift and the control delimiter reset operationally. We then transcribe the resulting continuation semantics in ML, thus obtaining...
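
    The shift and reset operators mentioned above can be modeled directly in CPS. The following is a minimal one-level sketch in Python (the first level of the hierarchy only; the encoding and names are illustrative, not the paper's ML transcription): a computation is a function from a continuation to an answer, reset delimits with the identity continuation, and shift reifies the delimited continuation as a callable value.

```python
# Minimal one-level model of shift/reset in continuation-passing style.
# A computation is a function taking a continuation k to an answer.

def ret(v):
    return lambda k: k(v)

def bind(m, f):
    return lambda k: m(lambda v: f(v)(k))

def reset(m):
    # Delimit: run m with the identity continuation.
    return ret(m(lambda v: v))

def shift(f):
    # Capture the current delimited continuation k, reify it as a
    # function the body f can call, and discard k itself.
    return lambda k: f(lambda v: ret(k(v)))(lambda v: v)

def run(m):
    return m(lambda v: v)

# reset(10 + shift(c. c(c(1)))) = 10 + (10 + 1) = 21
prog = reset(bind(shift(lambda c: bind(c(1), lambda x: c(x))),
                  lambda v: ret(v + 10)))
print(run(prog))  # 21
```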

  11. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a preprocessing stage for many kinds of problem statements. It plays an especially important role in fields such as soft computing and cloud computing, where data must be scaled down or up into a suitable range before being used in further stages. There are several normalization techniques, namely Min-Max normalization, Z-score normalization, and Decimal scaling normalization. By referring to these normalization techniques we are ...
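
    The three techniques named in the abstract can be sketched for a single feature column in a few lines of pure Python (the sample data is made up for illustration):

```python
import math

def min_max(xs, lo=0.0, hi=1.0):
    # Rescale linearly so the minimum maps to lo and the maximum to hi.
    mn, mx = min(xs), max(xs)
    return [lo + (x - mn) * (hi - lo) / (mx - mn) for x in xs]

def z_score(xs):
    # Center on the mean and scale by the (population) standard deviation.
    mean = sum(xs) / len(xs)
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
    return [(x - mean) / std for x in xs]

def decimal_scaling(xs):
    # Divide by 10^j, where j is the smallest power bringing all
    # values into [-1, 1].
    j = len(str(int(max(abs(x) for x in xs))))
    return [x / (10 ** j) for x in xs]

data = [200.0, 300.0, 400.0, 600.0, 1000.0]
print(min_max(data))          # [0.0, 0.125, 0.25, 0.5, 1.0]
print(decimal_scaling(data))  # [0.02, 0.03, 0.04, 0.06, 0.1]
```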

  12. CPS Transformation of Flow Information, Part II

    DEFF Research Database (Denmark)

    Damian, D.; Danvy, Olivier

    2003-01-01

    consider the administrative reductions of a Plotkin-style transformation into Continuation-Passing Style (CPS), and how they affect the result of a constraint-based control-flow analysis and, in particular, the least element in the space of solutions. We show that administrative reductions preserve...... the least solution. Preservation of least solutions solves a problem that was left open in Palsberg and Wand's article ‘CPS Transformation of Flow Information.’ Together, Palsberg and Wand's article and the present article show how to map in linear time the least solution of the flow constraints...... of a program into the least solution of the flow constraints of the CPS counterpart of this program, after administrative reductions. Furthermore, we show how to CPS transform control-flow information in one pass....

  13. On proving syntactic properties of CPS programs

    DEFF Research Database (Denmark)

    Danvy, Olivier; Dzafic, Belmina; Pfenning, Frank

    1999-01-01

    Higher-order program transformations raise new challenges for proving properties of their output, since they resist traditional, first-order proof techniques. In this work, we consider (1) the “one-pass” continuation-passing style (CPS) transformation, which is second-order, and (2) the occurrences...... of parameters of continuations in its output. To this end, we specify the one-pass CPS transformation relationally and we use the proof technique of logical relations....

  14. An Operational Investigation of the CPS Hierarchy

    DEFF Research Database (Denmark)

    Danvy, Olivier; Yang, Zhe

    1999-01-01

    We explore the hierarchy of control induced by successive transformations into continuation-passing style (CPS) in the presence of “control delimiters ” and “composable continuations ”. Specifically, we investigate the structural operational semantics associated with the CPS hierarchy. To this end...... a native and modular implementation of the entire hierarchy. We illustrate it with several examples, the most significant of which is layered monads....

  17. High speed preprocessing system

    Indian Academy of Sciences (India)

    M Sankar Kishore

    2000-10-01

    In systems employing tracking, the area of interest is recognized using a high-resolution camera and is handed over to the low-resolution receiver. The images seen by the low-resolution receiver and by the operator through the high-resolution camera differ in spatial resolution. In order to establish the correlation between these two images, the high-resolution camera image needs to be preprocessed and made similar to the low-resolution receiver image. This paper discusses the implementation of a suitable preprocessing technique, emphasis being given to developing a system, in both hardware and software, that reduces processing time. By applying different software/hardware techniques, the execution time has been brought down from a few seconds to a few milliseconds for a typical set of conditions. The hardware is designed around i486 processors and the software is developed in PL/M. The system has been tested to match images of the same scene obtained by two different sensors. The hardware and software have been evaluated with different sets of images.
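
    One simple way to make a high-resolution image comparable to a low-resolution sensor is block averaging. The sketch below is a stand-in for the resolution-matching step described above (the paper's actual technique and timings are not reproduced here), using a tiny made-up image:

```python
# Match a high-resolution image to a low-resolution sensor by averaging
# non-overlapping factor x factor blocks of a 2-D list of intensities.
# Assumes the image dimensions are divisible by the factor.

def downsample(image, factor):
    rows, cols = len(image), len(image[0])
    out = []
    for r in range(0, rows, factor):
        out_row = []
        for c in range(0, cols, factor):
            block = [image[i][j]
                     for i in range(r, r + factor)
                     for j in range(c, c + factor)]
            out_row.append(sum(block) / len(block))
        out.append(out_row)
    return out

hi_res = [[10, 10, 20, 20],
          [10, 10, 20, 20],
          [30, 30, 40, 40],
          [30, 30, 40, 40]]
print(downsample(hi_res, 2))  # [[10.0, 20.0], [30.0, 40.0]]
```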

  18. An Extensional CPS Transform (Preliminary Report)

    DEFF Research Database (Denmark)

    Filinski, Andrzej

    2001-01-01

    We show that, in a language with general continuation effects, the syntactic, or intensional, CPS transform is mirrored by a semantic, or extensional, functional term. In other words, from only the observable behavior of any direct-style term (possibly containing the usual first-class continuation...... primitives), we can uniformly extract the observable behavior of its CPS counterpart. As a consequence of this result, we show that the computational lambda-calculus is complete for observational equivalence of pure, simply typed lambda-terms in Scheme-like contexts....

  20. Classical realizability in the CPS target language

    DEFF Research Database (Denmark)

    Frey, Jonas

    2016-01-01

    Motivated by considerations about Krivine's classical realizability, we introduce a term calculus for an intuitionistic logic with record types, which we call the CPS target language. We give a reformulation of the constructions of classical realizability in this language, using the categorical...... techniques of realizability triposes and toposes. We argue that the presentation of classical realizability in the CPS target language simplifies calculations in realizability toposes, in particular it admits a nice presentation of conjunction as intersection type which is inspired by Girard's ludics....

  1. On proving syntactic properties of CPS programs

    DEFF Research Database (Denmark)

    Danvy, Olivier; Dzafic, Belmina; Pfenning, Frank

    1999-01-01

    Higher-order program transformations raise new challenges for proving properties of their output, since they resist traditional, first-order proof techniques. In this work, we consider (1) the “one-pass” continuation-passing style (CPS) transformation, which is second-order, and (2) the occurrences...

  2. Preprocessing of NMR metabolomics data.

    Science.gov (United States)

    Euceda, Leslie R; Giskeødegård, Guro F; Bathen, Tone F

    2015-05-01

    Metabolomics involves the large scale analysis of metabolites and thus, provides information regarding cellular processes in a biological sample. Independently of the analytical technique used, a vast amount of data is always acquired when carrying out metabolomics studies; this results in complex datasets with large amounts of variables. This type of data requires multivariate statistical analysis for its proper biological interpretation. Prior to multivariate analysis, preprocessing of the data must be carried out to remove unwanted variation such as instrumental or experimental artifacts. This review aims to outline the steps in the preprocessing of NMR metabolomics data and describe some of the methods to perform these. Since using different preprocessing methods may produce different results, it is important that an appropriate pipeline exists for the selection of the optimal combination of methods in the preprocessing workflow.
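
    Two of the preprocessing steps reviewed above, normalization of each spectrum and scaling of each variable, can be sketched in pure Python. This is only an illustrative pipeline on made-up data (total-area normalization and autoscaling are common choices, but the review covers many alternatives):

```python
import math

def normalize_total_area(spectrum):
    # Remove dilution-type differences: make each spectrum sum to 1.
    total = sum(spectrum)
    return [x / total for x in spectrum]

def autoscale(samples):
    # Mean-center and scale each variable (column) to unit variance
    # across samples. Assumes every column has nonzero variance.
    n = len(samples)
    scaled_cols = []
    for col in zip(*samples):
        mean = sum(col) / n
        std = math.sqrt(sum((x - mean) ** 2 for x in col) / n)
        scaled_cols.append([(x - mean) / std for x in col])
    return [list(row) for row in zip(*scaled_cols)]

spectra = [[1.0, 3.0, 6.0], [2.0, 6.0, 12.0]]   # same profile, 2x dilution
normed = [normalize_total_area(s) for s in spectra]
print(normed[0] == normed[1])  # True: normalization removed the dilution

table = [[1.0, 30.0], [2.0, 20.0], [3.0, 10.0]]
scaled = autoscale(table)
print([round(row[0], 3) for row in scaled])  # [-1.225, 0.0, 1.225]
```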

  3. Preprocessing of raw metabonomic data.

    Science.gov (United States)

    Vettukattil, Riyas

    2015-01-01

    Recent advances in metabolic profiling techniques allow global profiling of metabolites in cells, tissues, or organisms, using a wide range of analytical techniques such as nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS). The raw data acquired from these instruments are abundant with technical and structural complexity, which makes it statistically difficult to extract meaningful information. Preprocessing involves various computational procedures where data from the instruments (gas chromatography (GC)/liquid chromatography (LC)-MS, NMR spectra) are converted into a usable form for further analysis and biological interpretation. This chapter covers the common data preprocessing techniques used in metabonomics and is primarily focused on baseline correction, normalization, scaling, peak alignment, detection, and quantification. Recent years have witnessed development of several software tools for data preprocessing, and an overview of the frequently used tools in data preprocessing pipeline is covered.

  4. A Simple CPS Transformation of Control-Flow Information

    DEFF Research Database (Denmark)

    Damian, Daniel; Danvy, Olivier

    2002-01-01

    We build on Danvy and Nielsen's first-order program transformation into continuation-passing style (CPS) to design a new CPS transformation of flow information that is simpler and more efficient than what has been presented in previous work. The key to simplicity and efficiency is that our CPS...

  5. Data preprocessing in data mining

    CERN Document Server

    García, Salvador; Herrera, Francisco

    2015-01-01

    Data Preprocessing for Data Mining addresses one of the most important issues within the well-known Knowledge Discovery from Data process. Data taken directly from the source will likely have inconsistencies or errors and, most importantly, will not be ready for a data mining process. Furthermore, the increasing amount of data in recent science, industry, and business applications calls for more complex tools to analyze it. Thanks to data preprocessing, it is possible to convert the impossible into possible, adapting the data to fulfill the input demands of each data mining algorithm. Data preprocessing includes data reduction techniques, which aim at reducing the complexity of the data by detecting or removing irrelevant and noisy elements. This book is intended to review the tasks that fill the gap between data acquisition from the source and the data mining process. A comprehensive look from a practical point of view, including basic concepts and surveying t...

  6. Optimal Preprocessing Of GPS Data

    Science.gov (United States)

    Wu, Sien-Chong; Melbourne, William G.

    1994-01-01

    An improved technique for preprocessing data from a Global Positioning System receiver reduces processing time and the amount of data to be stored. It is optimal in the sense that it maintains the strength of the data. It also increases the ability to resolve ambiguities in the numbers of cycles of received GPS carrier signals.

  7. A First-Order One-Pass CPS Transformation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Nielsen, Lasse Reichstein

    2001-01-01

    We present a new transformation of λ-terms into continuation-passing style (CPS). This transformation operates in one pass and is both compositional and first-order. Previous CPS transformations only enjoyed two out of the three properties of being first-order, one-pass, and compositional......, but the new transformation enjoys all three properties. It is proved correct directly by structural induction over source terms instead of indirectly with a colon translation, as in Plotkin's original proof. Similarly, it makes it possible to reason about CPS-transformed terms by structural induction over...... source terms, directly.The new CPS transformation connects separately published approaches to the CPS transformation. It has already been used to state a new and simpler correctness proof of a direct-style transformation, and to develop a new and simpler CPS transformation of control-flow information....
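
    For contrast with the first-order transformation described above, a *higher-order* one-pass CPS transformer is easy to sketch: static continuations are ordinary functions applied at transformation time, so no administrative redexes survive into the output. The sketch below (terms as nested tuples; this is the style the paper improves on, not the paper's own algorithm) transforms the pure lambda-calculus:

```python
# Higher-order one-pass CPS transformer for the pure lambda-calculus.
# Terms: ("var", x) | ("lam", x, body) | ("app", e1, e2).
import itertools

_fresh = itertools.count()

def gensym(base):
    return f"{base}{next(_fresh)}"

def cps(term, k):
    """Transform `term`; `k` is a *static* continuation (a Python
    function from a trivial CPS value to a CPS term), applied at
    transformation time so administrative redexes are contracted."""
    tag = term[0]
    if tag == "var":
        return k(term)
    if tag == "lam":
        _, x, body = term
        c = gensym("c")   # fresh dynamic continuation parameter
        return k(("lam", x, ("lam", c,
                  cps(body, lambda v: ("app", ("var", c), v)))))
    if tag == "app":
        _, e1, e2 = term
        a = gensym("a")   # fresh name for the intermediate answer
        return cps(e1, lambda f:
               cps(e2, lambda x:
                   ("app", ("app", f, x),
                    ("lam", a, k(("var", a))))))
    raise ValueError(tag)

# CPS-transform `f x`: the output applies f to x and a continuation
# that hands the answer to k0 -- with no administrative redexes left.
out = cps(("app", ("var", "f"), ("var", "x")),
          lambda v: ("app", ("var", "k0"), v))
print(out)
```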

  8. A First-Order One-Pass CPS Transformation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Nielsen, Lasse Reichstein

    2003-01-01

    We present a new transformation of λ-terms into continuation-passing style (CPS). This transformation operates in one pass and is both compositional and first-order. Previous CPS transformations only enjoyed two out of the three properties of being first-order, one-pass, and compositional......, but the new transformation enjoys all three properties. It is proved correct directly by structural induction over source terms instead of indirectly with a colon translation, as in Plotkin's original proof. Similarly, it makes it possible to reason about CPS-transformed terms by structural induction over...... source terms, directly.The new CPS transformation connects separately published approaches to the CPS transformation. It has already been used to state a new and simpler correctness proof of a direct-style transformation, and to develop a new and simpler CPS transformation of control-flow information....

  9. International Conference ML4CPS 2016

    CERN Document Server

    Niggemann, Oliver; Kühnert, Christian

    2017-01-01

    The work presents new approaches to Machine Learning for Cyber Physical Systems, experiences and visions. It contains some selected papers from the international Conference ML4CPS – Machine Learning for Cyber Physical Systems, which was held in Karlsruhe, September 29th, 2016. Cyber Physical Systems are characterized by their ability to adapt and to learn: They analyze their environment and, based on observations, they learn patterns, correlations and predictive models. Typical applications are condition monitoring, predictive maintenance, image processing and diagnosis. Machine Learning is the key technology for these developments. The Editors Prof. Dr.-Ing. Jürgen Beyerer is Professor at the Department for Interactive Real-Time Systems at the Karlsruhe Institute of Technology. In addition he manages the Fraunhofer Institute of Optronics, System Technologies and Image Exploitation IOSB. Prof. Dr. Oliver Niggemann is Professor for Embedded Software Engineering. His research interests are in the field of Di...

  11. Preprocessing of compressed digital video

    Science.gov (United States)

    Segall, C. Andrew; Karunaratne, Passant V.; Katsaggelos, Aggelos K.

    2000-12-01

    Pre-processing algorithms improve the performance of a video compression system by removing spurious noise and insignificant features from the original images. This increases compression efficiency and attenuates coding artifacts. Unfortunately, determining the appropriate amount of pre-filtering is a difficult problem, as it depends on both the content of an image and the target bit-rate of the compression algorithm. In this paper, we explore a pre-processing technique that is loosely coupled to the quantization decisions of a rate control mechanism. This technique results in a pre-processing system that operates directly on the Displaced Frame Difference (DFD) and is applicable to any standard-compatible compression system. Results explore the effect of several standard filters on the DFD. An adaptive technique is then considered.
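
    To make the idea concrete, a deliberately simple pre-filter on the displaced frame difference is sketched below: small differences are zeroed before encoding so they cost no bits. This dead-zone threshold is an illustration only; the paper instead couples the filter strength to the rate controller's quantization decisions.

```python
# Pre-filter the displaced frame difference (DFD) of two frames,
# suppressing differences at or below a threshold as noise.

def dfd_prefilter(current, predicted, threshold):
    out = []
    for cur_row, pred_row in zip(current, predicted):
        row = []
        for c, p in zip(cur_row, pred_row):
            d = c - p
            row.append(d if abs(d) > threshold else 0)  # zero the noise
        out.append(row)
    return out

current   = [[100, 101], [130, 90]]
predicted = [[100, 100], [120, 91]]
print(dfd_prefilter(current, predicted, threshold=2))  # [[0, 0], [10, 0]]
```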

  12. Virtual Quantum Subsystems

    OpenAIRE

    Zanardi, Paolo

    2001-01-01

    The physical resources available to access and manipulate the degrees of freedom of a quantum system define the set $\cal A$ of operationally relevant observables. The algebraic structure of $\cal A$ selects a preferred tensor product structure, i.e., a partition into subsystems. The notion of compoundness for quantum systems is accordingly relativized. Universal control over virtual subsystems can be achieved by using quantum noncommutative holonomies.

  13. Virtual quantum subsystems.

    Science.gov (United States)

    Zanardi, P

    2001-08-13

    The physical resources available to access and manipulate the degrees of freedom of a quantum system define the set A of operationally relevant observables. The algebraic structure of A selects a preferred tensor product structure, i.e., a partition into subsystems. The notion of compoundness for quantum systems is accordingly relativized. Universal control over virtual subsystems can be achieved by using quantum noncommutative holonomies.

  17. An Operational Foundation for Delimited Continuations in the CPS Hierarchy

    DEFF Research Database (Denmark)

    Biernacka, Malgorzata; Biernacki, Dariusz; Danvy, Olivier

    2004-01-01

    We present an abstract machine and a reduction semantics for the lambda-calculus extended with control operators that give access to delimited continuations in the CPS hierarchy. The abstract machine is derived from an evaluator in continuation-passing style (CPS); the reduction semantics (i.e., a small-step operational semantics with an explicit representation of evaluation contexts) is constructed from the abstract machine; and the control operators are the shift and reset family. We also present new applications of delimited continuations in the CPS hierarchy: finding list prefixes...

  18. Effective Feature Preprocessing for Time Series Forecasting

    DEFF Research Database (Denmark)

    Zhao, Junhua; Dong, Zhaoyang; Xu, Zhao

    2006-01-01

    Time series forecasting is an important area in data mining research. Feature preprocessing techniques have significant influence on forecasting accuracy, therefore are essential in a forecasting model. Although several feature preprocessing techniques have been applied in time series forecasting......, there is so far no systematic research to study and compare their performance. How to select effective techniques of feature preprocessing in a forecasting model remains a problem. In this paper, the authors conduct a comprehensive study of existing feature preprocessing techniques to evaluate their empirical...... performance in time series forecasting. It is demonstrated in our experiment that, effective feature preprocessing can significantly enhance forecasting accuracy. This research can be a useful guidance for researchers on effectively selecting feature preprocessing techniques and integrating them with time...
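
    One representative combination of feature preprocessing with a forecasting model is to build lagged features from the series and rescale them before fitting. The sketch below (made-up data; the paper's study compares many more techniques than this) shows min-max scaling followed by lagged-feature construction:

```python
# Feature preprocessing for forecasting: min-max scale a series, then
# build rows of lagged features X with targets y for a model to fit.

def min_max_scale(series):
    mn, mx = min(series), max(series)
    return [(x - mn) / (mx - mn) for x in series]

def make_lagged(series, n_lags):
    """Rows of [x(t-n_lags), ..., x(t-1)] with target x(t)."""
    X, y = [], []
    for t in range(n_lags, len(series)):
        X.append(series[t - n_lags:t])
        y.append(series[t])
    return X, y

series = [10.0, 12.0, 14.0, 16.0, 18.0, 20.0]
X, y = make_lagged(min_max_scale(series), n_lags=2)
print(X[0], y[0])  # [0.0, 0.2] 0.4
```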

  19. A First-Order One-Pass CPS Transformation

    DEFF Research Database (Denmark)

    Danvy, Olivier; Nielsen, Lasse Reichstein

    2002-01-01

    We present a new transformation of call-by-value lambda-terms into continuation-passing style (CPS). This transformation operates in one pass and is both compositional and first-order. Because it operates in one pass, it directly yields compact CPS programs that are comparable to what one would...... write by hand. Because it is compositional, it allows proofs by structural induction. Because it is first-order, reasoning about it does not require the use of a logical relation. This new CPS transformation connects two separate lines of research. It has already been used to state a new and simpler...... correctness proof of a direct-style transformation, and to develop a new and simpler CPS transformation of control-flow information....

  20. CPS Transformation of Flow Information, Part II: Administrative Reductions

    DEFF Research Database (Denmark)

    Damian, Daniel; Danvy, Olivier

    2001-01-01

    consider the administrative reductions of a Plotkin-style transformation into Continuation-Passing Style (CPS), and how they affect the result of a constraint-based control-flow analysis and, in particular, the least element in the space of solutions. We show that administrative reductions preserve...... the least solution. Preservation of least solutions solves a problem that was left open in Palsberg and Wand's article ‘CPS Transformation of Flow Information.’ Together, Palsberg and Wand's article and the present article show how to map in linear time the least solution of the flow constraints...... of a program into the least solution of the flow constraints of the CPS counterpart of this program, after administrative reductions. Furthermore, we show how to CPS transform control-flow information in one pass....

  1. Multi-Task Collaboration CPS Modeling Based on Immune Feedback

    Directory of Open Access Journals (Sweden)

    Haiying Li

    2013-09-01

    Full Text Available. In this paper, a dynamic multi-task collaboration CPS control model based on self-adaptive immune feedback is proposed and implemented in a smart home environment. First, the internal relations between CPS and the biological immune system are explored via their basic theories. Second, the CPS control mechanism is elaborated through an analysis of the CPS control structure. Finally, a comprehensive support strategy is introduced into multi-task collaboration to improve dynamic cognitive ability. At the same time, the performance of the parameters is correspondingly increased by the operators of antibody concentration and selective pressure. Furthermore, the model has been put into service in a smart home laboratory. The experimental results show that this model can integrate the user's needs into the environment and properly regulate the home environment.

  2. Optimal Planning of Communication System of CPS for Distribution Network

    Directory of Open Access Journals (Sweden)

    Ting Yang

    2017-01-01

Full Text Available IoT is the technical basis for realizing the CPS (Cyber Physical System) for distribution networks, with which the complex system becomes more intelligent and controllable. Because of its multihop and self-organization characteristics, the large-scale heterogeneous CPS network becomes more difficult to plan. Using topological potential theory, one of the typical big data analysis technologies, this paper proposes a novel optimal CPS planning model. Topological potential equalization is taken as the optimization objective function in the heterogeneous CPS network, with constraints on communication requirements, physical infrastructure, and network reliability. An improved binary particle swarm optimization algorithm is proposed to solve this complex optimization problem. Two classic IEEE examples are adopted in the simulation, and the results show that, compared with benchmark algorithms, the proposed method provides an effective topology optimization scheme that improves network reliability and transmission performance.
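
As a rough illustration of the optimization machinery, the sketch below runs a minimal binary particle swarm optimizer on a toy "one-max" objective; the paper's improved algorithm, its topological-potential objective, and its constraints are not reproduced here.

```python
import math
import random

def bpso(fitness, n_bits, n_particles=8, iters=100, seed=1):
    """Minimal binary PSO sketch (the paper's improved variant differs)."""
    rng = random.Random(seed)
    sig = lambda v: 1.0 / (1.0 + math.exp(-v))
    X = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    gbest = max(X, key=fitness)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = rng.random(), rng.random()
                # velocity pulled toward personal and global best positions
                V[i][d] = (0.7 * V[i][d]
                           + 2 * r1 * (pbest[i][d] - X[i][d])
                           + 2 * r2 * (gbest[d] - X[i][d]))
                # sigmoid maps velocity to the probability of setting the bit
                X[i][d] = 1 if rng.random() < sig(V[i][d]) else 0
            if fitness(X[i]) > fitness(pbest[i]):
                pbest[i] = X[i][:]
                if fitness(X[i]) > fitness(gbest):
                    gbest = X[i][:]
    return gbest

# Toy objective standing in for topological potential equalization:
# maximize the number of set bits.
best = bpso(sum, n_bits=12)
```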

  3. Pressure Garment Subsystem Roadmap

    Science.gov (United States)

    Ross, Amy J.

    2010-01-01

    The Constellation program pressure garment subsystem (PGS) team has created a technical roadmap that communicates major technical questions and how and when the questions are being answered in support of major project milestones. The roadmap is a living document that guides the team priorities. The roadmap also communicates technical reactions to changes in project priorities and funding. This paper presents the roadmap and discusses specific roadmap elements in detail as representative examples to provide insight into the meaning and use of the roadmap.

  4. Preprocessing and Morphological Analysis in Text Mining

    Directory of Open Access Journals (Sweden)

Krishna Kumar Mohbey; Sachin Tiwari

    2011-12-01

Full Text Available This paper is based on the preprocessing activities performed by software or language translators before mining algorithms are applied to huge data. Text mining is an important area of data mining and plays a vital role in extracting useful information from huge databases or data warehouses. Before applying text mining or information extraction, preprocessing is essential because the given data or dataset contains noisy, incomplete, inconsistent, dirty, and unformatted data. In this paper we collect the necessary requirements for preprocessing. Once the preprocessing task is complete, useful knowledge can easily be extracted using a mining strategy. This paper also covers the analysis of data, such as tokenization and stemming, and semantic analysis, such as phrase recognition and parsing, and describes how stemming, tokenization, and parsing are applied.
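
For instance, the tokenization and stemming steps mentioned above can be sketched as follows (a crude suffix-stripping stemmer for illustration only, not Porter's algorithm):

```python
import re

def tokenize(text):
    # lowercase and split on any run of non-alphanumeric characters
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

def stem(token):
    # naive suffix stripping, keeping a stem of at least 3 letters
    for suffix in ("ing", "ed", "es", "s"):
        if token.endswith(suffix) and len(token) - len(suffix) >= 3:
            return token[: -len(suffix)]
    return token

tokens = tokenize("Preprocessing removes noisy, unformatted data!")
stems = [stem(t) for t in tokens]
print(stems)  # ['preprocess', 'remov', 'noisy', 'unformatt', 'data']
```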

  5. Environmental Control Subsystem Development

    Science.gov (United States)

    Laidlaw, Jacob; Zelik, Jonathan

    2017-01-01

    Kennedy Space Center's Launch Pad 39B, part of Launch Complex 39, is currently undergoing construction to prepare it for NASA's Space Launch System missions. The Environmental Control Subsystem, which provides the vehicle with an air or nitrogen gas environment, required development of its local and remote display screens. The remote displays, developed by NASA contractors and previous interns, were developed without complete functionality; the remote displays were revised, adding functionality to over 90 displays. For the local displays, multiple test procedures were developed to assess the functionality of the screens, as well as verify requirements. One local display screen was also developed.

  6. HETE Satellite Power Subsystem

    OpenAIRE

    1993-01-01

The HETE (High-Energy Transient Experiment) satellite, a joint project between MIT's Center for Space Research and AeroAstro, is a high-energy gamma-ray burst/X-ray/UV observatory platform. HETE will be launched into a 550 km circular orbit with an inclination of 37.7°, and has a design lifetime of 18 months. This paper presents a description of the spacecraft's power subsystem, which collects, regulates, and distributes power to the experiment payload modules and to the various spacecraft subsystems...

  7. CPS Transformation of Flow Information, Part II: Administrative Reductions

    DEFF Research Database (Denmark)

    Damian, Daniel; Danvy, Olivier

    2001-01-01

the least solution. Preservation of least solutions solves a problem that was left open in Palsberg and Wand's article ‘CPS Transformation of Flow Information.’ Together, Palsberg and Wand's article and the present article show how to map in linear time the least solution of the flow constraints...... of a program into the least solution of the flow constraints of the CPS counterpart of this program, after administrative reductions. Furthermore, we show how to CPS transform control-flow information in one pass.......We characterize the impact of a linear $\beta$-reduction on the result of a control-flow analysis. (By ‘a linear $\beta$-reduction’ we mean the $\beta$-reduction of a linear $\lambda$-abstraction, i.e., of a $\lambda$-abstraction whose parameter occurs exactly once in its body.) As a corollary, we...

  8. Streptococcus agalactiae capsule polymer length and attachment is determined by the proteins CpsABCD.

    Science.gov (United States)

    Toniolo, Chiara; Balducci, Evita; Romano, Maria Rosaria; Proietti, Daniela; Ferlenghi, Ilaria; Grandi, Guido; Berti, Francesco; Ros, Immaculada Margarit Y; Janulczyk, Robert

    2015-04-10

    The production of capsular polysaccharides (CPS) or secreted exopolysaccharides is ubiquitous in bacteria, and the Wzy pathway constitutes a prototypical mechanism to produce these structures. Despite the differences in polysaccharide composition among species, a group of proteins involved in this pathway is well conserved. Streptococcus agalactiae (group B Streptococcus; GBS) produces a CPS that represents the main virulence factor of the bacterium and is a prime target in current vaccine development. We used this human pathogen to investigate the roles and potential interdependencies of the conserved proteins CpsABCD encoded in the cps operon, by developing knock-out and functional mutant strains. The mutant strains were examined for CPS quantity, size, and attachment to the cell surface as well as CpsD phosphorylation. We observed that CpsB, -C, and -D compose a phosphoregulatory system where the CpsD autokinase phosphorylates its C-terminal tyrosines in a CpsC-dependent manner. These Tyr residues are also the target of the cognate CpsB phosphatase. An interaction between CpsD and CpsC was observed, and the phosphorylation state of CpsD influenced the subsequent action of CpsC. The CpsC extracellular domain appeared necessary for the production of high molecular weight polysaccharides by influencing CpsA-mediated attachment of the CPS to the bacterial cell surface. In conclusion, although having no impact on cps transcription or the synthesis of the basal repeating unit, we suggest that these proteins are fine-tuning the last steps of CPS biosynthesis (i.e. the balance between polymerization and attachment to the cell wall).

  9. Facilitating Watermark Insertion by Preprocessing Media

    Directory of Open Access Journals (Sweden)

    Matt L. Miller

    2004-10-01

    Full Text Available There are several watermarking applications that require the deployment of a very large number of watermark embedders. These applications often have severe budgetary constraints that limit the computation resources that are available. Under these circumstances, only simple embedding algorithms can be deployed, which have limited performance. In order to improve performance, we propose preprocessing the original media. It is envisaged that this preprocessing occurs during content creation and has no budgetary or computational constraints. Preprocessing combined with simple embedding creates a watermarked Work, the performance of which exceeds that of simple embedding alone. However, this performance improvement is obtained without any increase in the computational complexity of the embedder. Rather, the additional computational burden is shifted to the preprocessing stage. A simple example of this procedure is described and experimental results confirm our assertions.
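
A toy numerical sketch of the idea using additive spread-spectrum correlation (hypothetical names and numbers; the paper's actual scheme is not reproduced): the preprocessing stage, which runs at content creation without computational constraints, moves the Work toward the detection region so that a cheap fixed-pattern embedder suffices.

```python
import random

rng = random.Random(0)
PATTERN = [rng.choice((-1, 1)) for _ in range(64)]   # shared reference pattern

def correlate(work):
    # linear correlation detector
    return sum(w * p for w, p in zip(work, PATTERN)) / len(work)

def preprocess(work, target=2.0):
    # done once at content creation: lift the correlation to a target value,
    # shifting the computational burden away from the embedder
    c = correlate(work)
    return [w + (target - c) * p for w, p in zip(work, PATTERN)]

def simple_embed(work, strength=0.5):
    # cheap embedder: just add a scaled copy of the pattern
    return [w + strength * p for w, p in zip(work, PATTERN)]

work = [0.0] * 64
marked = simple_embed(preprocess(work))
print(correlate(work), correlate(marked))  # 0.0 2.5: detector margin improved
```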

  10. SecureCPS: Defending a nanosatellite cyber-physical system

    Science.gov (United States)

    Forbes, Lance; Vu, Huy; Udrea, Bogdan; Hagar, Hamilton; Koutsoukos, Xenofon D.; Yampolskiy, Mark

    2014-06-01

Recent inexpensive nanosatellite designs employ maneuvering thrusters, much as large satellites have done for decades. However, because a maneuvering nanosatellite can threaten HVAs on-orbit, it must provide a level of security typically reserved for HVAs. Securing nanosatellites with maneuvering capability is challenging due to extreme cost, size, and power constraints. While still in the design process, our low-cost SecureCPS architecture promises to dramatically improve security, to include preempting unknown binaries and detecting abnormal behavior. SecureCPS also applies to a broad class of cyber-physical systems (CPS), such as aircraft, cars, and trains. This paper focuses on Embry-Riddle's ARAPAIMA nanosatellite architecture, where we assume any off-the-shelf component could be compromised by a supply chain attack. Based on these assumptions, we have used Vanderbilt's Cyber Physical-Attack Description Language (CP-ADL) to represent realistic attacks, analyze how these attacks propagate in the ARAPAIMA architecture, and how to defeat them using the combination of a low-cost Root of Trust (RoT) Module, Global InfoTek's Advanced Malware Analysis System (GAMAS), and Anomaly Detection by Machine Learning (ADML). Our most recent efforts focus on refining and validating the design of SecureCPS.

  11. CPW to CPS transition for feeding UWB antennas

    DEFF Research Database (Denmark)

    Butrym, Alexander; Pivnenko, Sergey

    2004-01-01

The paper considers a transition (balun) from Coplanar Waveguide (CPW) to Coplanar Stripline (CPS) which is non-resonant and suitable for feeding UWB antennas such as Tapered Slot Antennas (Vivaldi antennas in particular), bow-tie antennas, and others. Some numerical and experimental results...

  12. CPW to CPS transition for feeding UWB antennas

    DEFF Research Database (Denmark)

    Butrym, Alexander; Pivnenko, Sergey

    2006-01-01

The paper considers a transition (balun) from Coplanar Waveguide (CPW) to Coplanar Stripline (CPS) which is non-resonant and suitable for feeding UWB antennas such as Tapered Slot Antennas (Vivaldi antennas, in particular), bow-tie antennas, and others. Some numerical and experimental results...

  13. An Operational Foundation for Delimited Continuations in the CPS Hierarchy

    DEFF Research Database (Denmark)

    Biernacka, Malgorzata; Biernacki, Dariusz; Danvy, Olivier

    2005-01-01

    .e., a small-step operational semantics with an explicit representation of evaluation contexts) is constructed from the abstract machine; and the control operators are the shift and reset family. We also present new applications of delimited continuations in the CPS hierarchy: finding list prefixes...
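
The list-prefixes example can be mimicked in direct style with an explicit continuation argument (a hypothetical sketch; the paper's formulation uses the shift and reset operators in the CPS hierarchy):

```python
def prefixes(xs):
    """Return all non-empty prefixes of xs, accumulated via a continuation."""
    acc = []
    def walk(rest, k):
        # k maps the tail collected so far to a complete prefix
        if rest:
            head, tail = rest[0], rest[1:]
            acc.append(k([head]))                 # emit the current prefix
            walk(tail, lambda t: k([head] + t))   # extend the continuation
    walk(xs, lambda t: t)
    return acc

print(prefixes([1, 2, 3]))  # [[1], [1, 2], [1, 2, 3]]
```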

  14. An Operational Foundation for Delimited Continuations in the CPS Hierarchy

    DEFF Research Database (Denmark)

    Biernacka, Malgorzata; Biernacki, Dariusz; Danvy, Olivier

    2004-01-01

    .e., a small-step operational semantics with an explicit representation of evaluation contexts) is constructed from the abstract machine; and the control operators are the shift and reset family. We also present new applications of delimited continuations in the CPS hierarchy: finding list prefixes...

  15. Lambda-lifting and CPS conversion in an imperative language

    CERN Document Server

    Kerneis, Gabriel

    2012-01-01

    This paper is a companion technical report to the article "Continuation-Passing C: from threads to events through continuations". It contains the complete version of the proofs of correctness of lambda-lifting and CPS-conversion presented in the article.
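
For readers unfamiliar with the first transformation, a minimal sketch of lambda-lifting (illustrative only, not the paper's formal development): an inner function's free variables become explicit parameters, letting the function float to the top level.

```python
# Before lambda-lifting: the inner function captures the free variable 'step'.
def counter(step):
    def bump(x):          # 'step' is free in bump
        return x + step
    return bump(0) + bump(1)

# After lambda-lifting: the free variable is passed explicitly,
# so bump_lifted can be defined at top level.
def bump_lifted(step, x):
    return x + step

def counter_lifted(step):
    return bump_lifted(step, 0) + bump_lifted(step, 1)

print(counter(5), counter_lifted(5))  # 11 11
```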

  18. Regional transmission subsystem planning

    Energy Technology Data Exchange (ETDEWEB)

    Costa Bortoni, Edson da [Quadrante Softwares Especializados Ltda., Itajuba, MG (Brazil); Bajay, Sergio Valdir; Barros Correia, Paulo de [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica; Santos, Afonso Henriques Moreira; Haddad, Jamil [Escola Federal de Engenharia de Itajuba, MG (Brazil)

    1994-12-31

This work presents an approach to the planning of transmission systems that employs mixed-integer linear programming to obtain a system optimized for cost and operating characteristics. The voltage loop equations are written in a modified form so that, at the end of the analysis, the model behaves as a DC power flow, with the help of the two Kirchhoff's laws, removing the need for interaction with an external power flow program for analysis of the line loading. The model considers the occurrence of contingencies, so that the final result is a network robust to the most severe contingencies. The whole technique is adapted to regional electric power transmission subsystems. (author) 9 refs., 4 figs.
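
The DC power flow behavior referred to above can be sketched on a toy 3-bus network (hypothetical numbers): Kirchhoff's laws reduce to a linear system B'θ = P in the voltage angles, and line flows follow from angle differences.

```python
def dc_power_flow():
    # three buses, three lines (0,1), (0,2), (1,2); bus 0 is the slack bus
    b01 = b02 = b12 = 10.0          # line susceptances, p.u.
    P1, P2 = 1.0, -0.5              # net injections at buses 1 and 2, p.u.
    # reduced nodal susceptance matrix for buses 1 and 2:
    #   [ b01+b12   -b12   ] [th1]   [P1]
    #   [  -b12    b02+b12 ] [th2] = [P2]
    a, b = b01 + b12, -b12
    c, d = -b12, b02 + b12
    det = a * d - b * c
    th1 = (P1 * d - b * P2) / det
    th2 = (a * P2 - c * P1) / det
    flow_01 = b01 * (0.0 - th1)     # flow on line 0-1 = susceptance * angle diff
    return th1, th2, flow_01

th1, th2, f01 = dc_power_flow()
print(th1, th2, f01)  # 0.05 0.0 -0.5: 0.5 p.u. flows from bus 1 toward bus 0
```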

  19. Operationally Responsive Spacecraft Subsystem Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Saber Astronautics proposes spacecraft subsystem control software which can autonomously reconfigure avionics for best performance during various mission conditions....

  20. Preprocessing of ionospheric echo Doppler spectra

    Institute of Scientific and Technical Information of China (English)

    FANG Liang; ZHAO Zhengyu; WANG Feng; SU Fanfan

    2007-01-01

The real-time information of the distant ionosphere can be acquired using the Wuhan ionospheric oblique backscattering sounding system (WIOBSS), which adopts a discontinuous wave mechanism. After analyzing the characteristics of the ionospheric echo Doppler spectra, this paper develops a signal preprocessing stage aimed at improving the Doppler spectra. The results indicate that the preprocessing not only gives the system a higher target-detection ability but also suppresses radio frequency interference by 6-7 dB.
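
A minimal sketch of this kind of spectral preprocessing (illustrative only; the WIOBSS processing chain is not reproduced): compute a Doppler spectrum and clip bins that stand far above the median level, a crude stand-in for radio-frequency interference suppression.

```python
import cmath

def doppler_spectrum(samples):
    """Magnitude Doppler spectrum via a plain DFT (a real sounder uses an FFT)."""
    n = len(samples)
    return [abs(sum(samples[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n)]

def clip_rfi(spectrum, factor=5.0):
    """Crude interference suppression: clip bins far above the median level."""
    med = sorted(spectrum)[len(spectrum) // 2]
    return [min(s, factor * med) for s in spectrum]

# A target echo appears as a pure tone; its Doppler bin holds the peak.
echo = [cmath.exp(2j * cmath.pi * 3 * t / 16) for t in range(16)]
spec = doppler_spectrum(echo)
print(spec.index(max(spec)))  # 3
```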

  1. Role of Ontologies for CPS Implementation in Manufacturing

    Directory of Open Access Journals (Sweden)

    Garetti Marco

    2015-12-01

Full Text Available Cyber Physical Systems are an evolution of embedded systems featuring a tight combination of collaborating computational elements that control physical entities. CPSs promise great potential for innovation in many areas, including manufacturing and production. This is because we obtain a very powerful, flexible, modular infrastructure allowing easy (re)configurability and fast ramp-up of manufacturing applications, by building a manufacturing system from modular mechatronic components (for machining, transportation, and storage) with embedded intelligence, and by integrating them into a system through a network connection. However, when building such architectures, how to supply the needed domain knowledge to real manufacturing applications arises as a problem to solve. In fact, a CPS-based architecture for manufacturing is made of smart but independent manufacturing components without any knowledge of the role they have to play together in the real world of manufacturing applications. Ontologies can supply this kind of knowledge, playing a very important role in CPS for manufacturing. The paper deals with this intriguing theme, also presenting an implementation of this approach in a research project for the open automation of manufacturing systems, in which the power of CPS is complemented by the support of an ontology of the manufacturing domain.

  2. Functional analysis of the CpsA protein of Streptococcus agalactiae.

    Science.gov (United States)

    Hanson, Brett R; Runft, Donna L; Streeter, Cale; Kumar, Abhin; Carion, Thomas W; Neely, Melody N

    2012-04-01

    Streptococcal pathogens, such as the group B streptococcus (GBS) Streptococcus agalactiae, are an important cause of systemic disease, which is facilitated in part by the presence of a polysaccharide capsule. The CpsA protein is a putative transcriptional regulator of the capsule locus, but its exact contribution to regulation is unknown. To address the role of CpsA in regulation, full-length GBS CpsA and two truncated forms of the protein were purified and analyzed for DNA-binding ability. Assays demonstrated that CpsA is able to bind specifically to two putative promoters within the capsule operon with similar affinity, and full-length protein is required for specificity. Functional characterization of CpsA confirmed that the ΔcpsA strain produced less capsule than did the wild type and demonstrated that the production of full-length CpsA or the DNA-binding region of CpsA resulted in increased capsule levels. In contrast, the production of a truncated form of CpsA lacking the extracellular LytR domain (CpsA-245) in the wild-type background resulted in a dominant-negative decrease in capsule production. GBS expressing CpsA-245, but not the ΔcpsA strain, was attenuated in human whole blood. However, the ΔcpsA strain showed significant attenuation in a zebrafish infection model. Furthermore, chain length was observed to be variable in a CpsA-dependent manner, but could be restored to wild-type levels when grown with lysozyme. Taken together, these results suggest that CpsA is a modular protein influencing multiple regulatory functions that may include not only capsule synthesis but also cell wall associated factors.

  3. Preprocessing Moist Lignocellulosic Biomass for Biorefinery Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Neal Yancey; Christopher T. Wright; Craig Conner; J. Richard Hess

    2009-06-01

    Biomass preprocessing is one of the primary operations in the feedstock assembly system of a lignocellulosic biorefinery. Preprocessing is generally accomplished using industrial grinders to format biomass materials into a suitable biorefinery feedstock for conversion to ethanol and other bioproducts. Many factors affect machine efficiency and the physical characteristics of preprocessed biomass. For example, moisture content of the biomass as received from the point of production has a significant impact on overall system efficiency and can significantly affect the characteristics (particle size distribution, flowability, storability, etc.) of the size-reduced biomass. Many different grinder configurations are available on the market, each with advantages under specific conditions. Ultimately, the capacity and/or efficiency of the grinding process can be enhanced by selecting the grinder configuration that optimizes grinder performance based on moisture content and screen size. This paper discusses the relationships of biomass moisture with respect to preprocessing system performance and product physical characteristics and compares data obtained on corn stover, switchgrass, and wheat straw as model feedstocks during Vermeer HG 200 grinder testing. During the tests, grinder screen configuration and biomass moisture content were varied and tested to provide a better understanding of their relative impact on machine performance and the resulting feedstock physical characteristics and uniformity relative to each crop tested.

  5. Efficient Preprocessing technique using Web log mining

    Science.gov (United States)

    Raiyani, Sheetal A.; jain, Shailendra

    2012-11-01

Web usage mining can be described as the discovery and analysis of user access patterns through mining of log files and associated data from a particular website. Numerous visitors interact daily with web sites around the world; enormous amounts of data are being generated, and this information can be very valuable to a company for understanding customer behavior. This paper presents a complete preprocessing methodology comprising data cleaning and user and session identification activities to improve the quality of the data. User identification, a key issue in the preprocessing phase, is to identify unique web users. Traditional user identification is based on the site structure, supported by some heuristic rules, which reduces the efficiency of user identification. To overcome this difficulty we introduce a proposed technique, DUI (Distinct User Identification), based on IP address, agent, session time, and pages referred to during the desired session time. This can be used in counter-terrorism, fraud detection, and detection of unusual access to secure data, as well as to improve the overall design and performance of future accesses through detection of regular user access behavior.
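
A toy sketch of such user and session identification (hypothetical log entries; the DUI heuristics are only approximated): a distinct user is an (IP, agent) pair, and a new session starts after a 30-minute gap.

```python
from datetime import datetime, timedelta

# hypothetical pre-cleaned log records: (ip, agent, timestamp, page)
LOG = [
    ("10.0.0.1", "Firefox", "2012-11-01 10:00", "/home"),
    ("10.0.0.1", "Firefox", "2012-11-01 10:05", "/news"),
    ("10.0.0.1", "Chrome",  "2012-11-01 10:06", "/home"),
    ("10.0.0.1", "Firefox", "2012-11-01 11:30", "/home"),
]

def sessions(log, gap=timedelta(minutes=30)):
    """Group requests into per-user sessions: user = (ip, agent)."""
    out = {}
    for ip, agent, ts, page in log:
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M")
        runs = out.setdefault((ip, agent), [])
        if runs and t - runs[-1][-1][0] <= gap:
            runs[-1].append((t, page))      # same session continues
        else:
            runs.append([(t, page)])        # gap exceeded: new session
    return out

s = sessions(LOG)
print(len(s))  # 2 distinct users (same IP, different agents)
```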

  6. Spacecraft Design Thermal Control Subsystem

    Science.gov (United States)

    Miyake, Robert N.

    2003-01-01

This slide presentation reviews the functions of thermal control subsystem engineers in the design of spacecraft. The goal of the thermal control subsystem used in a spacecraft is to maintain the temperature of all spacecraft components, subsystems, and flight systems within specified limits for all flight modes, from launch to the end of the mission. For most thermal control subsystems, the mass, power, and control and sensing systems must be kept below 10% of the total flight system resources, which means that the thermal control engineer is involved in the design of all other flight systems. The two concepts of thermal control, passive and active, are reviewed, and the use of thermal modeling tools is explained. The testing of the thermal control is also reviewed.
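
As one concrete piece of the passive-control side, an equilibrium radiator temperature follows from a Stefan-Boltzmann heat balance (illustrative numbers, not from the presentation):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_temp_kelvin(q_watts, area_m2, emissivity=0.85):
    """Equilibrium temperature of a radiator rejecting q_watts to deep space,
    ignoring environmental heat loads: q = emissivity * SIGMA * A * T^4."""
    return (q_watts / (emissivity * SIGMA * area_m2)) ** 0.25

t = radiator_temp_kelvin(500.0, 1.0)   # 500 W rejected over 1 m^2
print(round(t))  # ~319 K
```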

  7. Analysis on Real time Perception Technology of Wireless Sensor Network in CPS

    Directory of Open Access Journals (Sweden)

    Zhou Benhai

    2016-01-01

Full Text Available Cyber Physical Systems (CPS) tightly combine physical and computing systems. Node operating systems (OS) are fundamental units in CPS. Many problems remain unsolved when designing CPS, and CPS node OS in particular, in aspects of predictability, reliability, robustness, etc. To address this problem, this paper proposes an effective shortest-time-priority algorithm and an adaptive shortest-time-priority algorithm. Experimental results show that, compared with the traditional FIFO (first in, first out) and LSF (least slack first) algorithms, the algorithms proposed in this paper effectively reduce the deadline miss ratio; as a result, the real-time performance of the CPS is effectively improved.
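
The deadline miss ratio metric can be illustrated on a toy non-preemptive workload (hypothetical tasks; the paper's adaptive algorithm is not reproduced), comparing FIFO against a shortest-execution-time-first order:

```python
def miss_ratio(tasks, order):
    """Fraction of deadline misses on one processor, non-preemptive.

    tasks: list of (exec_time, deadline), all released at t = 0.
    order: indices giving the service order.
    """
    t, missed = 0, 0
    for i in order:
        exec_time, deadline = tasks[i]
        t += exec_time
        missed += t > deadline
    return missed / len(tasks)

tasks = [(5, 6), (1, 2), (2, 4)]                 # (exec_time, deadline)
fifo = miss_ratio(tasks, [0, 1, 2])              # arrival order
stp = miss_ratio(tasks, sorted(range(len(tasks)), key=lambda i: tasks[i][0]))
print(fifo, stp)  # shortest-time-first misses fewer deadlines on this workload
```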

  8. New Magnets for the Transfer line PSB-CPS

    CERN Document Server

    Barnes, M J; Bossard, P; Clark, G S; Cornuet, D; Otter, Alan J; Reeve, P A; Sassowsky, M; CERN. Geneva. SPS and LEP Division

    1998-01-01

The PS Accelerator Complex is upgraded for operation as LHC Pre-injector. A significant part of the effort is provided by TRIUMF under the CERN-TRIUMF collaboration on LHC. One of the tasks under the responsibility of TRIUMF is the design, construction and measurement of 17 magnets of four different types that replace existing magnets in the transfer line between the Proton Synchrotron Booster (PSB) and the CERN Proton Synchrotron (CPS). This note briefly describes the magnets and gives the results of the magnetic measurements.

  9. A PREPROCESSING LS-CMA IN HIGHLY CORRUPTIVE ENVIRONMENT

    Institute of Scientific and Technical Information of China (English)

    Guo Yan; Fang Dagang; Thomas N.C.Wang; Liang Changhong

    2002-01-01

A fast preprocessing Least Square-Constant Modulus Algorithm (LS-CMA) is proposed for blind adaptive beamforming. This new preprocessing method precludes the noise capture caused by the original LS-CMA, with the preprocessing procedure controlled by the static Constant Modulus Algorithm (CMA). The simulation results show that the proposed fast preprocessing LS-CMA can effectively reject co-channel interference and quickly lock onto the desired constant-modulus signal with only one snapshot in a highly corruptive environment.

  10. The preprocessing of multispectral data. II. [of Landsat satellite

    Science.gov (United States)

    Quiel, F.

    1976-01-01

    It is pointed out that a correction of atmospheric effects is an important requirement for a full utilization of the possibilities provided by preprocessing techniques. The most significant characteristics of original and preprocessed data are considered, taking into account the solution of classification problems by means of the preprocessing procedure. Improvements obtainable with different preprocessing techniques are illustrated with the aid of examples involving Landsat data regarding an area in Colorado.

  11. Reliability of Heat Supply Subsystem

    Directory of Open Access Journals (Sweden)

    Babiarz Bożena

    2015-11-01

Full Text Available The article presents a reliability analysis of a heat supply subsystem, using the example of a city of 47 thousand inhabitants. The analysis was made on the basis of operational data made available by the Municipal Heating Company for the years 2001-2012. Reliability indicators are used to describe the quantitative reliability of the heat supply subsystem. Mean times between failures and unitary failure rates were worked out, including the month of their occurrence, the type and diameter of the heating network, and the thermal power region. Knowing the characteristics of the time to repair for the heating network, the reliability of the heat supply subsystem for different thermal power regions can be determined, considering the district heating system configuration.
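
The indicators named above reduce to simple ratios; a sketch with hypothetical numbers (not the article's operational data):

```python
def unitary_failure_rate(failures, length_km, years):
    """Failures per km of heating network per year."""
    return failures / (length_km * years)

def availability(mtbf_h, mttr_h):
    """Steady-state availability from mean time between failures and
    mean time to repair."""
    return mtbf_h / (mtbf_h + mttr_h)

lam = unitary_failure_rate(failures=36, length_km=120.0, years=12)
A = availability(mtbf_h=8000.0, mttr_h=12.0)
print(lam, round(A, 4))  # 0.025 failures/(km*year), availability 0.9985
```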

  12. Syntactic Accidents in Program Analysis: On the Impact of the CPS Transformation

    DEFF Research Database (Denmark)

    Daniel, Damian; Danvy, Olivier

    2000-01-01

    We show that a non-duplicating transformation into Continuation-Passing Style (CPS) has no effect on control-flow analysis, a positive effect on binding-time analysis for traditional partial evaluation, and no effect on binding-time analysis for continuation-based partial evaluation: a monovariant...... control-flow analysis yields equivalent results on a direct-style program and on its CPS counterpart, a monovariant binding-time analysis yields less precise results on a direct-style program than on its CPS counterpart, and an enhanced monovariant binding-time analysis yields equivalent results...... on a direct-style program and on its CPS counterpart. Our proof technique amounts to constructing the CPS counterpart of flow information and of binding times. Our results formalize and confirm a folklore theorem about traditional binding-time analysis, namely that CPS has a positive effect on binding times...

  13. Syntactic accidents in program analysis: on the impact of the CPS transformation

    DEFF Research Database (Denmark)

    Damian, Daniel; Danvy, Olivier

    2003-01-01

    We show that a non-duplicating transformation into Continuation-Passing Style (CPS) has no effect on control-flow analysis, a positive effect on binding-time analysis for traditional partial evaluation, and no effect on binding-time analysis for continuation-based partial evaluation: a monovariant...... control-flow analysis yields equivalent results on a direct-style program and on its CPS counterpart, a monovariant binding-time analysis yields less precise results on a direct-style program than on its CPS counterpart, and an enhanced monovariant binding-time analysis yields equivalent results...... on a direct-style program and on its CPS counterpart. Our proof technique amounts to constructing the CPS counterpart of flow information and of binding times. Our results formalize and confirm a folklore theorem about traditional binding-time analysis, namely that CPS has a positive effect on binding times...

  14. Three-Phase Cascaded Multilevel Converter Based on a Novel CPS-SPWM

    Institute of Scientific and Technical Information of China (English)

    王立乔; 齐飞

    2011-01-01

In cascaded multilevel converters, the carrier phase-shift SPWM (CPS-SPWM) is the most widely used switching modulation technique. Its large-scale digital realization faces a major obstacle because it requires a large number of PWM generators. Therefore, a novel CPS-SPWM technique is proposed by introducing unipolar SPWM into cascaded multilevel converters. Compared with the traditional modulation technique, it saves half of the PWM generators, which is of significant theoretical and practical value for the digital implementation of cascaded multilevel converters. On the basis of the principles of the novel CPS-SPWM, two digital implementation methods are presented in detail, and experimental results verify the correctness of the theoretical analysis.
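
The carrier arrangement can be sketched numerically (an illustrative toy, not the paper's implementation): each of N cells compares one common sinusoidal reference against its own triangular carrier, phase-shifted by π/N in the unipolar scheme, and the summed gate signals form the multilevel waveform.

```python
import math

def triangle(phase):
    # unit triangle wave, range [-1, 1]
    return 2.0 / math.pi * math.asin(math.sin(phase))

def cps_carriers(n_cells, steps=400):
    # one triangular carrier per cell, phase-shifted by pi/n_cells (unipolar)
    return [[triangle(2 * math.pi * s / steps + math.pi * c / n_cells)
             for s in range(steps)]
            for c in range(n_cells)]

def cell_gate(ref, carrier):
    # unipolar comparison: a single comparator (PWM generator) per cell
    return [1 if r > c else 0 for r, c in zip(ref, carrier)]

n_cells, steps = 3, 400
carriers = cps_carriers(n_cells, steps)
ref = [0.8 * math.sin(2 * math.pi * s / steps) for s in range(steps)]
gates = [cell_gate(ref, carrier) for carrier in carriers]
levels = [sum(g[s] for g in gates) for s in range(steps)]  # 0..n_cells levels
```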

  15. Pre-processing Tasks in Indonesian Twitter Messages

    Science.gov (United States)

    Hidayatullah, A. F.; Ma’arif, M. R.

    2017-01-01

    Twitter text messages are very noisy; moreover, tweet data are unstructured and complicated. The focus of this work is to investigate pre-processing techniques for Twitter messages in Bahasa Indonesia. The main goal of the experiment is to clean the tweet data for further analysis; thus, the objective of the pre-processing task is simply to remove all meaningless characters and keep the valuable words. We divide the proposed pre-processing experiments into two parts: common pre-processing tasks, and pre-processing tasks specific to tweet data. From the experimental results we conclude that employing pre-processing tasks tailored to the characteristics of tweet data yields a more valuable result: the output contains noticeably fewer meaningless words than the output of the common pre-processing tasks alone.
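The two-part split described above can be sketched as follows; the regular expressions and the example tweet are illustrative assumptions, not the authors' exact rules:

```python
import re

def common_preprocess(text):
    # generic cleanup: lowercase, strip digits/punctuation, squeeze spaces
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)
    return re.sub(r"\s+", " ", text).strip()

def tweet_preprocess(text):
    # tweet-specific cleanup applied before the common step:
    # URLs, @mentions, #hashtag markers, and the retweet marker "RT"
    text = re.sub(r"https?://\S+", " ", text)
    text = re.sub(r"[@#]\w+", " ", text)
    text = re.sub(r"\bRT\b", " ", text)
    return common_preprocess(text)

tweet = "RT @budi: Macet parah di Jakarta!! http://t.co/xyz #jakarta"
print(tweet_preprocess(tweet))   # -> "macet parah di jakarta"
```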

  16. Block storage subsystem performance analysis

    CERN Document Server

    CERN. Geneva

    2016-01-01

    You feel that your service is slow because of the storage subsystem? But there are too many abstraction layers between your software and the raw block device for you to debug the whole pile... Let's dive onto the platters and check out how the block storage sees your I/Os! We can even figure out what those I/O patterns mean.

  17. Approximate Distance Oracles with Improved Preprocessing Time

    CERN Document Server

    Wulff-Nilsen, Christian

    2011-01-01

    Given an undirected graph $G$ with $m$ edges, $n$ vertices, and non-negative edge weights, and given an integer $k \geq 1$, we show that for some universal constant $c$, a $(2k-1)$-approximate distance oracle for $G$ of size $O(kn^{1 + 1/k})$ can be constructed in $O(\sqrt{k}\,m + kn^{1 + c/\sqrt{k}})$ time and can answer queries in $O(k)$ time. We also give an oracle which is faster for smaller $k$. Our results break the quadratic preprocessing time bound of Baswana and Kavitha for all $k \geq 6$ and improve the $O(kmn^{1/k})$ time bound of Thorup and Zwick except for very sparse graphs and small $k$. When $m = \Omega(n^{1 + c/\sqrt{k}})$ and $k = O(1)$, our oracle is optimal w.r.t. stretch, size, preprocessing time, and query time, assuming a widely believed girth conjecture of Erdős.

  18. The SPICE Digital Shape Kernel (DSK) Subsystem

    Science.gov (United States)

    Bachman, N. J.

    2017-06-01

    The DSK subsystem is the component of SPICE concerned with detailed shape models. The DSK subsystem enables SPICE-based applications to conveniently and efficiently use detailed shape data in geometry computations performed by SPICE routines.

  19. Holonomic Quantum Computation in Subsystems

    Science.gov (United States)

    Oreshkov, Ognyan

    2009-08-01

    We introduce a generalized method of holonomic quantum computation (HQC) based on encoding in subsystems. As an application, we propose a scheme for applying holonomic gates to unencoded qubits by the use of a noisy ancillary qubit. This scheme does not require initialization in a subspace since all dynamical effects factor out as a transformation on the ancilla. We use this approach to show how fault-tolerant HQC can be realized via 2-local Hamiltonians with perturbative gadgets.

  20. Holonomic quantum computation in subsystems

    OpenAIRE

    Oreshkov, Ognyan

    2009-01-01

    We introduce a generalized method of holonomic quantum computation (HQC) based on encoding in subsystems. As an application, we propose a scheme for applying holonomic gates to unencoded qubits by the use of a noisy ancillary qubit. This scheme does not require initialization in a subspace since all dynamical effects factor out as a transformation on the ancilla. We use this approach to show how fault-tolerant HQC can be realized via 2-local Hamiltonians with perturbative gadgets.

  1. Low intensity beam extracted from the CPS via CT or MTE with various longitudinal parameters

    CERN Document Server

    Bohl, T

    2009-01-01

    In view of extracting fixed target type of beams from the CPS for SPS fixed target physics or CNGS operation with the Multi-Turn-Extraction (MTE) scheme, beams with certain sets of longitudinal parameters were produced in the CPS and their transmission in the SPS was studied.

  2. The Registration of Knee Joint Images with Preprocessing

    Directory of Open Access Journals (Sweden)

    Zhenyan Ji

    2011-06-01

    Full Text Available The registration of CT and MR images is important to analyze the effect of PCL and ACL deficiency on the knee joint. Because CT and MR images have different limitations, we need to register CT and MR images of the knee joint and then build a model to analyze the stress distribution on the knee joint. In our project, we adopt image registration based on mutual information. In knee joint images, information about adipose, muscle and other soft tissue affects the registration accuracy. To eliminate this interference, we propose a combined preprocessing solution, BEBDO, which consists of five steps: image blurring, image enhancement, image blurring, image edge detection and image outline preprocessing. We also designed the algorithm for image outline preprocessing. At the end of the paper, an experiment compares the image registration results with and without the preprocessing. The results prove that the preprocessing can improve the image registration accuracy.
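Mutual-information-based registration, which the project adopts, scores an alignment by the statistical dependence between the intensities of the two images. A minimal sketch of the score itself (the bin count and the synthetic test images are assumptions):

```python
import numpy as np

def mutual_information(a, b, bins=16):
    # joint histogram -> joint and marginal probabilities -> MI in nats
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                       # ignore empty bins
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, 5, axis=1)      # a misaligned copy
# MI is maximal when the images are aligned, which is what a
# registration optimizer exploits
assert mutual_information(img, img) > mutual_information(img, shifted)
```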

  3. An effective preprocessing method for finger vein recognition

    Science.gov (United States)

    Peng, JiaLiang; Li, Qiong; Wang, Ning; Abd El-Latif, Ahmed A.; Niu, Xiamu

    2013-07-01

    Image preprocessing plays an important role in a finger vein recognition system. However, previous preprocessing schemes leave weaknesses to be resolved before high finger vein recognition performance can be achieved. In this paper, we propose a new finger vein preprocessing scheme that includes finger region localization, alignment, finger vein ROI segmentation and enhancement. The experimental results show that the proposed scheme is capable of enhancing the quality of finger vein images effectively and reliably.

  4. User microprogrammable processors for high data rate telemetry preprocessing

    Science.gov (United States)

    Pugsley, J. H.; Ogrady, E. P.

    1973-01-01

    The use of microprogrammable processors for the preprocessing of high data rate satellite telemetry is investigated. The following topics are discussed along with supporting studies: (1) evaluation of commercial microprogrammable minicomputers for telemetry preprocessing tasks; (2) microinstruction sets for telemetry preprocessing; and (3) the use of multiple minicomputers to achieve high data processing rates. The simulation of small microprogrammed processors is discussed along with examples of microprogrammed processors.

  5. Syntactic Accidents in Program Analysis: On the Impact of the CPS Transformation

    DEFF Research Database (Denmark)

    Daniel, Damian; Danvy, Olivier

    2000-01-01

    …control-flow analysis yields equivalent results on a direct-style program and on its CPS counterpart, a monovariant binding-time analysis yields less precise results on a direct-style program than on its CPS counterpart, and an enhanced monovariant binding-time analysis yields equivalent results on a direct-style program and on its CPS counterpart. Our proof technique amounts to constructing the CPS counterpart of flow information and of binding times. Our results formalize and confirm a folklore theorem about traditional binding-time analysis, namely that CPS has a positive effect on binding times. What may be more surprising is that the benefit does not arise from a standard refinement of program analysis, as, for instance, duplicating continuations. The present study is symptomatic of an unsettling property of program analyses: their quality is unpredictably vulnerable to syntactic accidents…

  6. Additional diterpenes from Physcomitrella patens synthesized by copalyl diphosphate/kaurene synthase (PpCPS/KS).

    Science.gov (United States)

    Zhan, Xin; Bach, Søren Spanner; Hansen, Nikolaj Lervad; Lunde, Christina; Simonsen, Henrik Toft

    2015-11-01

    The bifunctional diterpene synthase, copalyl diphosphate/kaurene synthase from the moss Physcomitrella patens (PpCPS/KS), catalyses the formation of at least four diterpenes, including ent-beyerene, ent-sandaracopimaradiene, ent-kaur-16-ene, and 16-hydroxy-ent-kaurene. The enzymatic activity has been confirmed through generation of a targeted PpCPS/KS knock-out mutant in P. patens via homologous recombination, through transient expression of PpCPS/KS in Nicotiana benthamiana, and expression of PpCPS/KS in E. coli. GC-MS analysis of the knock-out mutant shows that it lacks the diterpenoids, supporting that all are products of PpCPS/KS as observed in N. benthamiana and E. coli. These results provide additional knowledge of the mechanism of this bifunctional diterpene synthase, and are in line with proposed reaction mechanisms in kaurene biosynthesis.

  7. Phylogenetic distribution and membrane topology of the LytR-CpsA-Psr protein family

    Directory of Open Access Journals (Sweden)

    Berger-Bächi Brigitte

    2008-12-01

    Full Text Available Abstract Background The bacterial cell wall is the target of many antibiotics, and cell envelope constituents are critical to host-pathogen interactions. To combat resistance development and virulence, a detailed knowledge of the individual factors involved is essential. Members of the LytR-CpsA-Psr family of cell envelope-associated attenuators are relevant for β-lactam resistance, biofilm formation, and stress tolerance, and they are suggested to play a role in cell wall maintenance. However, their precise function is still unknown. This study addresses the occurrence as well as the sequence-based characteristics of the LytR-CpsA-Psr proteins. Results A comprehensive list of LytR-CpsA-Psr proteins was established, and their phylogenetic distribution and clustering into subgroups was determined. LytR-CpsA-Psr proteins were present in all Gram-positive organisms, except for the cell wall-deficient Mollicutes and one strain of the Clostridiales. In contrast, the majority of Gram-negatives did not contain LytR-CpsA-Psr family members. Despite high sequence divergence, the LytR-CpsA-Psr domains of different subclusters shared a highly similar, predicted mixed α/β-structure, and conserved charged residues. PhoA fusion experiments, using MsrR of Staphylococcus aureus, confirmed membrane topology predictions and the extracellular location of its LytR-CpsA-Psr domain. Conclusion The LytR-CpsA-Psr domain is unique to bacteria. The presence of diverse subgroups within the LytR-CpsA-Psr family might indicate functional differences, and could explain variations in the reported phenotypes of the respective mutants. The identified conserved structural elements and amino acids are likely to be important for the function of the domain and will help to guide future studies of the LytR-CpsA-Psr proteins.

  8. Preprocessing and Analysis of Digitized ECGs

    Science.gov (United States)

    Villalpando, L. E. Piña; Kurmyshev, E.; Ramírez, S. Luna; Leal, L. Delgado

    2008-08-01

    In this work we propose a methodology and MATLAB programs that perform the preprocessing and analysis of the D1 lead of ECGs. The program corrects the isoelectric line for each beat, calculates the average cardiac frequency and its standard deviation, and generates a file with the amplitudes of the P, Q and T waves, as well as the important segments and intervals of each beat. The software normalizes beats to a standard rate of 80 beats per minute; the superposition of beats is done by centering the R waves, before and after normalizing the amplitude of each beat. The data and graphics provide relevant information to the doctor for diagnosis. In addition, some results are displayed similar to those presented by a Holter recording.
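The rate computation and the time normalization to 80 beats per minute can be sketched as follows; the R-peak times are hypothetical, not data from the paper:

```python
import numpy as np

# Hypothetical R-peak times in seconds (illustrative, not the paper's data).
r_peaks = np.array([0.0, 0.82, 1.61, 2.45, 3.24, 4.06])

rr = np.diff(r_peaks)               # RR intervals, s
hr = 60.0 / rr                      # instantaneous heart rate, bpm
print(f"mean HR = {hr.mean():.1f} bpm, std = {hr.std(ddof=1):.1f} bpm")

# Time-normalize each beat to the standard 80 bpm (RR = 0.75 s), so that
# beats can be superimposed around their centered R waves.
scale = 0.75 / rr
normalized_rr = rr * scale
assert np.allclose(normalized_rr, 0.75)
```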

  9. FLPP NGL Structural Subsystems Activity

    Science.gov (United States)

    Jaredson, D.; Ramusat, G.; Appel, S.; Cardone, T.; Persson, J.; Baiocco, P.; Lavelle, F.; Bouilly, Th.

    2012-07-01

    The ESA Future Launchers Preparatory Programme (FLPP) is the basis for new paradigms, investigating the key elements, logic and roadmaps to prepare the development of a safe, reliable and low-cost next European launch vehicle (LV) for access to space (dubbed NGL, Next Generation LV), with an initial operational capability in the middle of the next decade. In addition to carrying cargo to conventional GTO or SSO, the European NGL has to be flexible enough to cope with new pioneering institutional missions as well as the evolving commercial payloads market. This goal is approached by studying three main areas relevant to ELVs: system concepts, propulsion, and core technology. During the preliminary design activity, a number of design alternatives concerning the NGL main structural subsystems have been investigated. Technology is one of the ways to meet the NGL challenges, either to improve performance or to reduce cost, or both. The relevant requirements allow a 'top-down' approach to their conception and to the proposal of the most effective technologies. Furthermore, all these technology developments represent a significant 'bottom-up' investment and concern a large range of activities. The structural subsystems portfolio of the FLPP 'Core Technology' activity encompasses major cutting-edge challenges for the maturation of the various subsystems: reducing overall structural mass, increasing structural margins for robustness, metallic and composite containment of cryogenic propellants, significantly reducing fabrication and operations cost, etc., to derive performing upper and booster stages. Application of concurrent engineering methods will allow the development of performing technology demonstrators in terms of need, demonstration objective, size and cost, yielding safe, low-risk technical approaches for a future development. Potential ability of these advanced structural LV technologies to satisfy the system requirements of the NGL and their current

  10. Survey on the Research of Cyber-Physical Systems (CPS)

    Institute of Scientific and Technical Information of China (English)

    黎作鹏; 张天驰; 张菁

    2011-01-01

    Cyber-Physical Systems (CPS) refers to the tight conjoining of, and coordination between, computational and physical resources, and will change the way in which we interact with the physical world. As the evolution of the Internet of Things, CPS has attracted extensive attention from research institutions, government departments and the wider community in China and abroad. This survey introduces and describes the definition, system architecture and features of CPS; it then studies and discusses the theory and technology hierarchy of CPS, the important challenges it poses to computer science and technology, and the current state of research; finally, future research trends are presented.

  11. The Effect of Preprocessing on Arabic Document Categorization

    Directory of Open Access Journals (Sweden)

    Abdullah Ayedh

    2016-04-01

    Full Text Available Preprocessing is one of the main components in a conventional document categorization (DC) framework. This paper aims to highlight the effect of preprocessing tasks on the efficiency of an Arabic DC system. In this study, three classification techniques are used, namely, naive Bayes (NB), k-nearest neighbor (KNN), and support vector machine (SVM). Experimental analysis on Arabic datasets reveals that preprocessing techniques have a significant impact on the classification accuracy, especially with the complicated morphological structure of the Arabic language. Choosing appropriate combinations of preprocessing tasks provides significant improvement in the accuracy of document categorization, depending on the feature size and classification techniques. Findings of this study show that the SVM technique outperformed the KNN and NB techniques. The SVM technique achieved a 96.74% micro-F1 value by using the combination of normalization and stemming as preprocessing tasks.
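Normalization, one of the two preprocessing tasks whose combination gave the best result, can be illustrated as below; the specific rules shown are common Arabic text-cleaning conventions and an assumption, not necessarily the paper's exact set:

```python
import re

def normalize_arabic(text):
    # Illustrative Arabic normalization rules (an assumption):
    text = re.sub(r"[\u064B-\u0652]", "", text)              # strip diacritics (tashkeel)
    text = re.sub(r"[\u0622\u0623\u0625]", "\u0627", text)   # alef variants -> bare alef
    text = text.replace("\u0629", "\u0647")                  # ta marbuta -> ha
    text = text.replace("\u0649", "\u064A")                  # alef maqsura -> ya
    return text

# two surface forms of the same word collapse to one token after
# normalization, which shrinks the feature space before classification
w1 = "\u0623\u064E\u0645\u064E\u0644"   # form with hamza-on-alef and diacritics
w2 = "\u0627\u0645\u0644"               # bare form
assert normalize_arabic(w1) == normalize_arabic(w2) == w2
```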

  12. Forensic considerations for preprocessing effects on clinical MDCT scans.

    Science.gov (United States)

    Wade, Andrew D; Conlogue, Gerald J

    2013-05-01

    Manipulation of digital photographs destined for medico-legal inquiry must be thoroughly documented and presented with explanation of any manipulations. Unlike digital photography, computed tomography (CT) data must pass through an additional step before viewing. Reconstruction of raw data involves reconstruction algorithms to preprocess the raw information into display data. Preprocessing of raw data, although it occurs at the source, alters the images and must be accounted for in the same way as postprocessing. Repeated CT scans of a gunshot wound phantom were made using the Toshiba Aquilion 64-slice multidetector CT scanner. The appearance of fragments, high-density inclusion artifacts, and soft tissue were assessed. Preprocessing with different algorithms results in substantial differences in image output. It is important to appreciate that preprocessing affects the image, that it does so differently in the presence of high-density inclusions, and that preprocessing algorithms and scanning parameters may be used to overcome the resulting artifacts.

  13. Autophosphorylation of the Bacterial Tyrosine-Kinase CpsD Connects Capsule Synthesis with the Cell Cycle in Streptococcus pneumoniae.

    Directory of Open Access Journals (Sweden)

    Julien Nourikyan

    2015-09-01

    Full Text Available Bacterial capsular polysaccharides (CPS) are produced by a multi-protein membrane complex, in which a particular type of tyrosine-autokinases, named BY-kinases, regulates their polymerization and export. However, our understanding of the role of BY-kinases in these processes remains incomplete. In the human pathogen Streptococcus pneumoniae, the BY-kinase CpsD localizes at the division site and participates in the proper assembly of the capsule. In this study, we show that the cytoplasmic C-terminal end of the transmembrane protein CpsC is required for CpsD autophosphorylation and localization at mid-cell. Importantly, we demonstrate that the CpsC/CpsD complex captures the polysaccharide polymerase CpsH at the division site. Together with the finding that capsule is not produced at the division site in cpsD and cpsC mutants, these data show that CPS production occurs exclusively at mid-cell and is tightly dependent on CpsD interaction with CpsC. Next, we analyzed the impact of CpsD phosphorylation on CPS production. We show that dephosphorylation of CpsD induces defective capsule production at the septum, together with aberrant cell elongation and nucleoid defects. We observe that the cell division protein FtsZ assembles and localizes properly, although cell constriction is impaired. DAPI staining, together with localization of the histone-like protein HlpA, further shows that chromosome replication and/or segregation is defective, suggesting that CpsD autophosphorylation interferes with these processes, thus resulting in cell constriction defects and cell elongation. We show that CpsD shares structural homology with ParA-like ATPases and that it interacts with the chromosome partitioning protein ParB. Total internal reflection fluorescence microscopy imaging demonstrates that CpsD phosphorylation modulates the mobility of ParB. These data support a model in which phosphorylation of CpsD acts as a signaling system coordinating CPS synthesis with

  14. Feature detection techniques for preprocessing proteomic data.

    Science.gov (United States)

    Sellers, Kimberly F; Miecznikowski, Jeffrey C

    2010-01-01

    Numerous gel-based and nongel-based technologies are used to detect protein changes potentially associated with disease. The raw data, however, are abundant with technical and structural complexities, making statistical analysis a difficult task. Low-level analysis issues (including normalization, background correction, gel and/or spectral alignment, feature detection, and image registration) are substantial problems that need to be addressed, because any large-level data analyses are contingent on appropriate and statistically sound low-level procedures. Feature detection approaches are particularly interesting due to the increased computational speed associated with subsequent calculations. Such summary data corresponding to image features provide a significant reduction in overall data size and structure while retaining key information. In this paper, we focus on recent advances in feature detection as a tool for preprocessing proteomic data. This work highlights existing and newly developed feature detection algorithms for proteomic datasets, particularly relating to time-of-flight mass spectrometry, and two-dimensional gel electrophoresis. Note, however, that the associated data structures (i.e., spectral data, and images containing spots) used as input for these methods are obtained via all gel-based and nongel-based methods discussed in this manuscript, and thus the discussed methods are likewise applicable.
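A minimal sketch of the feature (peak) detection idea on a synthetic spectrum; thresholded local maxima stand in for the more elaborate algorithms the paper reviews:

```python
import numpy as np

def detect_peaks(spectrum, threshold):
    # a feature (peak) is a strict local maximum above an intensity
    # threshold; real pipelines add baseline correction and alignment first
    left = spectrum[1:-1] > spectrum[:-2]
    right = spectrum[1:-1] > spectrum[2:]
    tall = spectrum[1:-1] > threshold
    return np.flatnonzero(left & right & tall) + 1

x = np.linspace(0, 10, 500)
# synthetic "spectrum": two narrow Gaussian peaks near x = 3 and x = 7
spectrum = np.exp(-(x - 3) ** 2 / 0.01) + 0.6 * np.exp(-(x - 7) ** 2 / 0.01)
peaks = detect_peaks(spectrum, threshold=0.3)
print(x[peaks])   # two detected peaks, near x = 3 and x = 7
```

Keeping only the peak locations and heights is the data reduction the paper describes: downstream statistics run on this small feature list instead of the full raw trace.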

  15. Cassini Mission Sequence Subsystem (MSS)

    Science.gov (United States)

    Alland, Robert

    2011-01-01

    This paper describes my work with the Cassini Mission Sequence Subsystem (MSS) team during the summer of 2011. It gives some background on the motivation for this project and describes the expected benefit to the Cassini program. It then introduces the two tasks that I worked on - an automatic system auditing tool and a series of corrections to the Cassini Sequence Generator (SEQ_GEN) - and the specific objectives these tasks were to accomplish. Next, it details the approach I took to meet these objectives and the results of this approach, followed by a discussion of how the outcome of the project compares with my initial expectations. The paper concludes with a summary of my experience working on this project, lists what the next steps are, and acknowledges the help of my Cassini colleagues.

  16. 76 FR 75869 - Proposed Information Collection; Comment Request; Current Population Survey (CPS) Fertility...

    Science.gov (United States)

    2011-12-05

    ... maternal health care for single parent households, can be estimated using CPS characteristics matched with... Public: Individuals or Households. Estimated Number of Respondents: 30,000. Estimated Time per...

  17. Enhanced bone structural analysis through pQCT image preprocessing.

    Science.gov (United States)

    Cervinka, T; Hyttinen, J; Sievanen, H

    2010-05-01

    Several factors, including preprocessing of the image, can affect the reliability of pQCT-measured bone traits, such as cortical area and trabecular density. Using repeated scans of four different liquid phantoms and repeated in vivo scans of distal tibiae from 25 subjects, the performance of two novel preprocessing methods, based on the down-sampling of the grayscale intensity histogram and the statistical approximation of image data, was compared to 3x3 and 5x5 median filtering. According to phantom measurements, the signal-to-noise ratio in the raw pQCT images (XCT 3000) was low (approximately 20 dB), which posed a challenge for preprocessing. Concerning the cortical analysis, the reliability coefficient (R) was 67% for the raw image and increased to 94-97% after preprocessing, without apparent preference for any method. Concerning the trabecular density, the R-values were already high (approximately 99%) in the raw images, leaving virtually no room for improvement. However, some coarse structural patterns could be seen in the preprocessed images, in contrast to a disperse distribution of density levels in the raw image. In conclusion, preprocessing cannot suppress the high noise level to the extent that the analysis of mean trabecular density is essentially improved, whereas preprocessing can enhance cortical bone analysis and also facilitate coarse structural analyses of the trabecular region.
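The 3x3 median filtering used as the baseline preprocessing can be sketched on a toy low-SNR image; the phantom-like test image is an assumption:

```python
import numpy as np

def median3x3(img):
    # 3x3 median filter, implemented by stacking the 9 shifted
    # neighborhoods and taking the per-pixel median
    padded = np.pad(img, 1, mode="edge")
    stack = [padded[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(3) for j in range(3)]
    return np.median(stack, axis=0)

rng = np.random.default_rng(1)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 100.0                       # toy high-density region
noisy = clean + rng.normal(0, 20, clean.shape)  # low SNR, as in raw pQCT
filtered = median3x3(noisy)
# the filter suppresses noise while largely preserving the edge
assert np.abs(filtered - clean).mean() < np.abs(noisy - clean).mean()
```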

  18. Thermal Hysteresis of MEMS Packaged Capacitive Pressure Sensor (CPS) Based 3C-SiC

    Science.gov (United States)

    Marsi, N.; Majlis, B. Y.; Mohd-Yasin, F.; Hamzah, A. A.; Mohd Rus, A. Z.

    2016-11-01

    Presented herein are thermal hysteresis analyses of the MEMS packaged capacitive pressure sensor (CPS). The MEMS CPS was fabricated on an Si-on-3C-SiC wafer grown in the hot-wall low-pressure chemical vapour deposition (LPCVD) reactor at the Queensland Micro and Nanotechnology Centre (QMNC), Griffith University, and was fabricated using a bulk-micromachining process. The MEMS CPS was operated at extreme temperatures up to 500°C and high external pressures up to 5.0 MPa. The thermal hysteresis phenomenon, which causes deflection, strain and stress on the 3C-SiC diaphragm, directly influences the MEMS CPS performance. Temperature-difference, hysteresis, and repeatability tests are presented to demonstrate the functionality of the MEMS packaged CPS. As expected, the output hysteresis is low (less than 0.05%), since 3C-SiC has a hardness greater than that of traditional silicon. This low hysteresis shows that the MEMS packaged CPS has high repeatability and stability.
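A hysteresis figure like the reported value below 0.05% of full scale is typically computed from up/down pressure sweeps; a sketch with illustrative numbers, not the paper's measurements:

```python
import numpy as np

# Hypothetical capacitance readings (pF) at the same pressure points,
# taken while increasing and then decreasing the applied pressure.
cap_up   = np.array([10.0000, 10.4000, 10.8100, 11.2000, 11.6000])
cap_down = np.array([10.0000, 10.4004, 10.8105, 11.2003, 11.6000])

# hysteresis: worst-case up/down disagreement as a fraction of full scale
full_scale = cap_up.max() - cap_up.min()
hysteresis_pct = 100.0 * np.abs(cap_up - cap_down).max() / full_scale
print(f"hysteresis = {hysteresis_pct:.4f}% of full scale")
assert hysteresis_pct < 0.05   # in the sub-0.05% regime the paper reports
```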

  19. LTE RF subsystem power consumption modeling

    DEFF Research Database (Denmark)

    Musiige, Deogratius; Vincent, Laulagnet; Anton, François;

    2012-01-01

    This paper presents a new power consumption emulation model for all possible scenarios of the RF subsystem when transmitting an LTE signal. The model takes the logical interface parameters (Tx power, carrier frequency and bandwidth) between the baseband and RF subsystems as inputs to compute the p

  20. An adaptive preprocessing algorithm for low bitrate video coding

    Institute of Scientific and Technical Information of China (English)

    LI Mao-quan; XU Zheng-quan

    2006-01-01

    At low bitrates, all block discrete cosine transform (BDCT) based video coding algorithms suffer from visible blocking and ringing artifacts in the reconstructed images, because the quantization is too coarse and high-frequency DCT coefficients are inclined to be quantized to zero. Preprocessing algorithms can enhance coding efficiency, and thus reduce the likelihood of blocking and ringing artifacts generated in the video coding process, by applying a low-pass filter before video encoding to remove some relatively insignificant high-frequency components. In this paper, we introduce a new adaptive preprocessing algorithm, which employs an improved bilateral filter to provide adaptive edge-preserving low-pass filtering that is adjusted according to the quantization parameters. Whether at low or high bitrate, the preprocessing provides proper filtering to make the video encoder more efficient and yield better reconstructed image quality. Experimental results demonstrate that the proposed preprocessing algorithm can significantly improve both subjective and objective quality.
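The edge-preserving low-pass behavior of a bilateral filter can be sketched in one dimension; the kernel parameters are assumptions, and the paper's adaptive link to the quantization parameters is not modeled here:

```python
import numpy as np

def bilateral_1d(signal, radius=3, sigma_s=2.0, sigma_r=0.2):
    # edge-preserving low-pass: each sample is a weighted mean of its
    # neighbors, with weights falling off with spatial distance (sigma_s)
    # AND with intensity difference (sigma_r), so sharp edges survive
    out = np.empty_like(signal)
    for i in range(len(signal)):
        lo, hi = max(0, i - radius), min(len(signal), i + radius + 1)
        window = signal[lo:hi]
        d = np.arange(lo, hi) - i
        w = np.exp(-d**2 / (2 * sigma_s**2)) \
          * np.exp(-(window - signal[i])**2 / (2 * sigma_r**2))
        out[i] = (w * window).sum() / w.sum()
    return out

step = np.concatenate([np.zeros(50), np.ones(50)])   # an "edge"
noisy = step + np.random.default_rng(2).normal(0, 0.05, 100)
smooth = bilateral_1d(noisy)
# noise is reduced while the step edge is preserved, not blurred
assert np.abs(smooth - step).mean() < np.abs(noisy - step).mean()
assert smooth[49] < 0.5 < smooth[50]
```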

  1. Solid Earth ARISTOTELES mission data preprocessing simulation of gravity gradiometer

    Science.gov (United States)

    Avanzi, G.; Stolfa, R.; Versini, B.

    Data preprocessing for the ARISTOTELES mission, which measures the Earth's gravity gradient in a near-polar orbit, was studied. The mission measures the gravity field at sea level through indirect measurements performed in orbit, so the evaluation steps consist of processing data from GRADIO accelerometer measurements. Due to the physical phenomena involved in the data collection experiment, it is possible to isolate, as an initial stage, a preprocessing of the gradiometer data based only on GRADIO measurements that does not need detailed knowledge of the attitude and attitude-rate sensor outputs. This preprocessing produces intermediate quantities used in later stages of the reduction. Software was designed and run to evaluate, for this level of data reduction, the achievable accuracy as a function of the knowledge of instrument and satellite status parameters. The architecture of this element of the preprocessing is described.

  2. Preprocessing Algorithm for Deciphering Historical Inscriptions Using String Metric

    Directory of Open Access Journals (Sweden)

    Lorand Lehel Toth

    2016-07-01

    Full Text Available The article presents improvements in the preprocessing part of a deciphering method (shortly, the preprocessing algorithm) for historical inscriptions of unknown origin. Glyphs used in historical inscriptions changed through time; therefore, various versions of the same script may contain different glyphs for each grapheme. The purpose of the preprocessing algorithm is to reduce the running time of the deciphering process by filtering out the less probable interpretations of the examined inscription. However, the first version of the preprocessing algorithm led to incorrect outcomes, or to no result at all, in certain cases. Therefore, an improved version was developed to find the most similar words in the dictionary by specifying the search conditions more accurately, while remaining computationally efficient. Moreover, a sophisticated similarity metric used to determine the possible meaning of the unknown inscription is introduced. The results of the evaluations are also detailed.
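The dictionary-filtering idea can be sketched with a plain string metric; Levenshtein distance and the sample dictionary stand in for the paper's more sophisticated similarity metric:

```python
def levenshtein(a, b):
    # classic dynamic-programming edit distance
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                   # deletion
                           cur[j - 1] + 1,                # insertion
                           prev[j - 1] + (ca != cb)))     # substitution
        prev = cur
    return prev[-1]

# hypothetical dictionary and noisy transliterated reading of an inscription
dictionary = ["tengri", "tanri", "kagan", "bitig"]
reading = "tengir"
ranked = sorted(dictionary, key=lambda w: levenshtein(reading, w))
print(ranked[0])   # -> "tengri"
```

Ranking the dictionary by distance and discarding everything beyond a cutoff is exactly the "filter out the less probable interpretations" step that shortens the deciphering run.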

  3. A review of statistical methods for preprocessing oligonucleotide microarrays.

    Science.gov (United States)

    Wu, Zhijin

    2009-12-01

    Microarrays have become an indispensable tool in biomedical research. This powerful technology not only makes it possible to quantify a large number of nucleic acid molecules simultaneously, but also produces data with many sources of noise. A number of preprocessing steps are therefore necessary to convert the raw data, usually in the form of hybridisation images, to measures of biological meaning that can be used in further statistical analysis. Preprocessing of oligonucleotide arrays includes image processing, background adjustment, data normalisation/transformation and sometimes summarisation when multiple probes are used to target one genomic unit. In this article, we review the issues encountered in each preprocessing step and introduce the statistical models and methods in preprocessing.
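One standard normalization step from this pipeline, quantile normalization, can be sketched as follows; the toy intensity matrix is an assumption:

```python
import numpy as np

def quantile_normalize(x):
    # classic microarray normalization: force every array (column) to
    # share the same empirical distribution by averaging across ranks
    order = np.argsort(x, axis=0)
    ranks = np.argsort(order, axis=0)          # rank of each value per column
    mean_of_sorted = np.sort(x, axis=0).mean(axis=1)
    return mean_of_sorted[ranks]

# three toy "arrays" (columns) with different overall intensity scales
raw = np.array([[2.0, 4.0, 6.0],
                [1.0, 2.0, 3.0],
                [3.0, 6.0, 9.0]])
norm = quantile_normalize(raw)
s = np.sort(norm, axis=0)
# after normalization, every column shares the same distribution
assert np.allclose(s[:, 0], s[:, 1]) and np.allclose(s[:, 1], s[:, 2])
```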

  4. Efficient chaotic based satellite power supply subsystem

    Energy Technology Data Exchange (ETDEWEB)

    Ramos Turci, Luiz Felipe [Technological Institute of Aeronautics (ITA), Sao Jose dos Campos, SP (Brazil)], E-mail: felipeturci@yahoo.com.br; Macau, Elbert E.N. [National Institute of Space Research (Inpe), Sao Jose dos Campos, SP (Brazil)], E-mail: elbert@lac.inpe.br; Yoneyama, Takashi [Technological Institute of Aeronautics (ITA), Sao Jose dos Campos, SP (Brazil)], E-mail: takashi@ita.br

    2009-10-15

    In this work, we investigate the use of dynamical systems theory to increase the efficiency of satellite power supply subsystems. The core of a satellite power subsystem relies on its DC/DC converter. This is a very nonlinear system that presents a multitude of phenomena ranging from bifurcations, quasi-periodicity, and chaos to the coexistence of attractors, among others. Traditional power subsystem design techniques try to avoid these nonlinear phenomena so that linear system theory can be used in small regions about the equilibrium points. Here, we show that more efficiency can be drawn from a power supply subsystem if the DC/DC converter operates in regions of high nonlinearity. In particular, if it operates in a chaotic regime, it has an intrinsic sensitivity that can be exploited to efficiently drive the power subsystem over wide ranges of power requests by using control-of-chaos techniques.

  5. Preprocessing for classification of thermograms in breast cancer detection

    Science.gov (United States)

    Neumann, Łukasz; Nowak, Robert M.; Okuniewski, Rafał; Oleszkiewicz, Witold; Cichosz, Paweł; Jagodziński, Dariusz; Matysiewicz, Mateusz

    2016-09-01

Performance of binary classification of breast cancer suffers from high imbalance between classes. In this article we present a preprocessing module designed to counter the discrepancy in training examples. The preprocessing module is based on standardization, the Synthetic Minority Oversampling Technique (SMOTE) and undersampling. We show how each algorithm influences classification accuracy. Results indicate that the described module improves the overall Area Under Curve by up to 10% on the tested dataset. Furthermore, we propose other methods of dealing with imbalanced datasets in breast cancer classification.
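As a rough illustration of the oversampling idea used in such a module, the following sketch interpolates synthetic minority samples toward nearest minority neighbours (a simplified SMOTE; the function name and the data are hypothetical, not from the article):

```python
import numpy as np

def smote_like(X_min, n_new, k=3, rng=None):
    """Generate synthetic minority samples by interpolating each seed point
    toward one of its k nearest minority neighbours (the SMOTE idea)."""
    rng = np.random.default_rng(rng)
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)                        # a point is not its own neighbour
    nn = np.argsort(d, axis=1)[:, :k]                  # k nearest neighbours per point
    base = rng.integers(0, len(X_min), size=n_new)     # pick seed points
    neigh = nn[base, rng.integers(0, k, size=n_new)]   # pick one neighbour each
    gap = rng.random((n_new, 1))                       # interpolation fraction
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

# Four minority points at the corners of the unit square.
minority = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
synthetic = smote_like(minority, n_new=8, k=2, rng=0)
```

Because every synthetic point is a convex combination of two minority points, the new samples stay inside the minority region.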

  6. SLAE–CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies

    Science.gov (United States)

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-01-01

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE–CPS), which is based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to design and implement a CPS-based smart Jidoka system. A set of comprehensive architecture and standardized key technologies should be presented to achieve the above-mentioned goal. Therefore, a distributed architecture that joins service-oriented architecture, agent, function block (FB), cloud, and Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE–CPS. Then, several standardized key techniques are proposed under this architecture. The first one is for converting heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second one is a set of Jidoka scene rules, which is abstracted based on the analysis of the operator, machine, material, quality, and other factors in different time dimensions. These Jidoka rules can support executive FBs in performing different Jidoka functions. Finally, supported by the integrated and standardized approach of our proposed engine, a case study is conducted to verify the current research results. The proposed SLAE–CPS can serve as an important reference value for combining the benefits of innovative technology and proper methodology. PMID:28657577

  7. SLAE-CPS: Smart Lean Automation Engine Enabled by Cyber-Physical Systems Technologies.

    Science.gov (United States)

    Ma, Jing; Wang, Qiang; Zhao, Zhibiao

    2017-06-28

    In the context of Industry 4.0, the demand for the mass production of highly customized products will lead to complex products and an increasing demand for production system flexibility. Simply implementing lean production-based human-centered production or high automation to improve system flexibility is insufficient. Currently, lean automation (Jidoka) that utilizes cyber-physical systems (CPS) is considered a cost-efficient and effective approach for improving system flexibility under shrinking global economic conditions. Therefore, a smart lean automation engine enabled by CPS technologies (SLAE-CPS), which is based on an analysis of Jidoka functions and the smart capacity of CPS technologies, is proposed in this study to provide an integrated and standardized approach to design and implement a CPS-based smart Jidoka system. A set of comprehensive architecture and standardized key technologies should be presented to achieve the above-mentioned goal. Therefore, a distributed architecture that joins service-oriented architecture, agent, function block (FB), cloud, and Internet of things is proposed to support the flexible configuration, deployment, and performance of SLAE-CPS. Then, several standardized key techniques are proposed under this architecture. The first one is for converting heterogeneous physical data into uniform services for subsequent abnormality analysis and detection. The second one is a set of Jidoka scene rules, which is abstracted based on the analysis of the operator, machine, material, quality, and other factors in different time dimensions. These Jidoka rules can support executive FBs in performing different Jidoka functions. Finally, supported by the integrated and standardized approach of our proposed engine, a case study is conducted to verify the current research results. The proposed SLAE-CPS can serve as an important reference value for combining the benefits of innovative technology and proper methodology.

  8. RNA interference-mediated repression of SmCPS (copalyldiphosphate synthase) expression in hairy roots of Salvia miltiorrhiza causes a decrease of tanshinones and sheds light on the functional role of SmCPS.

    Science.gov (United States)

    Cheng, Qiqing; Su, Ping; Hu, Yating; He, Yunfei; Gao, Wei; Huang, Luqi

    2014-02-01

    Tanshinones are a group of bioactive abietane-type norditerpenoid quinone compounds in Salvia miltiorrhiza. Copalyldiphosphate synthase of S. miltiorrhiza (SmCPS) is the first key enzyme in tanshinone biosynthesis from the universal diterpene precursor geranylgeranyl diphosphate. Hairy roots of S. miltiorrhiza were transformed with Agrobacterium rhizogenes carrying an RNA interference (RNAi) construct designed to silence SmCPS, and we examined the resulting SmCPS expression and tanshinone accumulation. In SmCPS–RNAi hairy roots, the transcript level of SmCPS was reduced to 26 % while the dihydrotanshinone I and cryptotanshinone levels were decreased by 53 and 38 % compared to those of the vector control hairy roots; tanshinone IIA was not detected. Therefore, the decreased expression of SmCPS caused a decrease in tanshinone levels which verifies that SmCPS is a key enzyme for tanshinone biosynthesis in S. miltiorrhiza.

  9. Impact of CPS1 Gene rs7422339 Polymorphism in Argentine Patients With Hyperhomocysteinemia

    Directory of Open Access Journals (Sweden)

    Silene M. Silvera-Ruiz BSc

    2015-05-01

Full Text Available Carbamoyl phosphate synthetase 1 (CPS1) is a key gene in the first step of the urea cycle and has been correlated with nitric oxide levels and vascular smooth muscle activity. A functional single-nucleotide polymorphism C/A at position 4217 in CPS1 (National Center for Biotechnology Information SNP database no. rs7422339, T1405N) was reported to be associated with high homocysteine (Hcy) plasma values. Although genetic variants of the methylenetetrahydrofolate reductase (MTHFR) gene are known to influence Hcy concentration, other genetic determinants of Hcy remain largely unknown. The association between CPS1 rs7422339 and the risk of hyperhomocysteinemia in Latin American populations is unknown. Here, we study this association in 100 patients having hyperhomocysteinemia without the MTHFR c.677C>T polymorphism and 100 controls. CPS1 rs7422339 was studied using polymerase chain reaction and enzymatic restriction. Comparison of the CPS1 rs7422339 genotype distributions revealed a significant difference between groups (P = 2.3 × 10^-3). Patients carrying the polymorphic allele showed an almost 3 times higher risk (odds ratio [OR] = 2.47) of hyperhomocysteinemia than wild-type allele carriers, suggesting that the rs7422339 SNP is associated with high Hcy levels in the Argentine population.
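Association statistics of this kind come from a 2x2 genotype-by-status table. The sketch below shows the form of the computation only; the counts are purely hypothetical, since the study's actual data are not given here:

```python
def odds_ratio(a, b, c, d):
    """OR for a 2x2 table [[a, b], [c, d]]:
    rows = carriers / non-carriers, columns = patients / controls."""
    return (a * d) / (b * c)

def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic for the same 2x2 table."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 60/100 patients and 38/100 controls carry the A allele.
or_value = odds_ratio(60, 40, 38, 62)    # about 2.45
chi2 = chi2_2x2(60, 40, 38, 62)          # compare against 3.84 (chi^2, 1 df, alpha = 0.05)
```

With one degree of freedom, a statistic above 3.84 corresponds to P < 0.05, which is how a genotype-distribution difference like the one reported would be judged significant.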

  10. Development of technology for fabrication of lithium CPS on basis of CNT-reinforced carboxylic fabric

    Energy Technology Data Exchange (ETDEWEB)

    Tazhibayeva, Irina, E-mail: tazhibayeva@ntsc.kz [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan); Baklanov, Viktor; Ponkratov, Yuriy [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan); Abdullin, Khabibulla [Institute of Experimental and Theoretical Physics of Kazakh National University, Almaty (Kazakhstan); Kulsartov, Timur; Gordienko, Yuriy; Zaurbekova, Zhanna [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan); Lyublinski, Igor [JSC «Red Star», Moscow (Russian Federation); NRNU «MEPhI», Moscow (Russian Federation); Vertkov, Alexey [JSC «Red Star», Moscow (Russian Federation); Skakov, Mazhyn [Institute of Atomic Energy, National Nuclear Center of RK, Kurchatov (Kazakhstan)

    2017-04-15

Highlights: • A preliminary study of carboxylic fabric wettability with liquid lithium is presented. • The preliminary wettability studies consisted of long-duration experiments in vacuum at temperatures of 673, 773 and 873 K. • A scheme of the experimental device for manufacturing lithium CPS and the matrix filling procedure with liquid lithium are presented. • The concept of a lithium limiter with CPS on the basis of CNT-reinforced carboxylic fabric is proposed. - Abstract: The paper describes the analysis of liquid lithium interaction with carbon-based materials and the manufacturing technology of a capillary-porous system (CPS) matrix on the basis of CNT-reinforced carboxylic fabric. A preliminary study of carboxylic fabric wettability with liquid lithium is presented. The development of the technology includes: microstructural studies of the carboxylic fabric before its CNT reinforcement; validation of the CNT-reinforcement technology; mode validation of the CVD method for CNT synthesis; and study of the synthesized carbon structures. The preliminary wettability studies consisted of long-duration experiments in vacuum at temperatures of 673, 773 and 873 K. The scheme of the experimental device for manufacturing lithium CPS and the matrix filling procedure with liquid lithium are presented. The concept of a lithium limiter with CPS on the basis of CNT-reinforced carboxylic fabric is proposed.

  11. Bounds on entanglement in qudit subsystems

    OpenAIRE

    Kendon, Vivien M.; Zyczkowski, Karol; Munro, William J.

    2002-01-01

    The entanglement in a pure state of N qudits (d-dimensional distinguishable quantum particles) can be characterised by specifying how entangled its subsystems are. A generally mixed subsystem of m qudits is obtained by tracing over the other N-m qudits. We examine the entanglement in the space of mixed states of m qudits. We show that for a typical pure state of N qudits, its subsystems smaller than N/3 qudits will have a positive partial transpose and hence are separable or bound entangled. ...

  12. An Anonymous Access Authentication Scheme Based on Proxy Ring Signature for CPS-WMNs

    Directory of Open Access Journals (Sweden)

    Tianhan Gao

    2017-01-01

Full Text Available Access security and privacy have become a bottleneck for the popularization of future Cyber-Physical System (CPS) networks. Furthermore, users' need for privacy-preserving access during movement is increasingly urgent. To address the anonymous access authentication issue for CPS Wireless Mesh Networks (CPS-WMN), a novel anonymous access authentication scheme based on proxy ring signature is proposed. A hierarchical authentication architecture is presented first. The scheme then achieves intergroup and intragroup anonymous mutual authentication through a proxy ring signature mechanism and a certificateless signature mechanism, respectively. We present a formal security proof of the proposed protocol with SVO logic. The simulation and performance analysis demonstrate that the proposed scheme offers higher efficiency and adaptability than typical schemes.

  13. CPS Modeling of CNC Machine Tool Work Processes Using an Instruction-Domain Based Approach

    Directory of Open Access Journals (Sweden)

    Jihong Chen

    2015-06-01

    Full Text Available Building cyber-physical system (CPS models of machine tools is a key technology for intelligent manufacturing. The massive electronic data from a computer numerical control (CNC system during the work processes of a CNC machine tool is the main source of the big data on which a CPS model is established. In this work-process model, a method based on instruction domain is applied to analyze the electronic big data, and a quantitative description of the numerical control (NC processes is built according to the G code of the processes. Utilizing the instruction domain, a work-process CPS model is established on the basis of the accurate, real-time mapping of the manufacturing tasks, resources, and status of the CNC machine tool. Using such models, case studies are conducted on intelligent-machining applications, such as the optimization of NC processing parameters and the health assurance of CNC machine tools.
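The instruction-domain idea can be caricatured in a few lines: monitoring samples are indexed by the NC instruction being executed rather than by wall-clock time, so repeated runs of the same G code become directly comparable. The program, samples and variable names below are hypothetical illustrations, not the paper's implementation:

```python
# A tiny NC program; each line is one instruction.
program = ["G00 X0 Y0", "G01 X10 F200", "G01 Y10", "G02 X0 Y0 I-5 J0"]

# (instruction_index, sensed_value) pairs as they might arrive from the CNC,
# e.g. spindle load sampled while each instruction executes.
samples = [(0, 0.1), (1, 2.3), (1, 2.4), (2, 1.8), (3, 3.0), (3, 3.1)]

domain = {i: [] for i in range(len(program))}
for idx, value in samples:
    domain[idx].append(value)             # group data per instruction, not per second

# Per-instruction summary, comparable across repeated runs of the same program.
summary = {program[i]: sum(v) / len(v) for i, v in domain.items() if v}
```

Keyed this way, a parameter-optimization or health-assessment routine can compare the signal for, say, the same G01 move across thousands of workpieces regardless of feed-rate-induced timing differences.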

  14. Bounds on entanglement in qudit subsystems

    CERN Document Server

    Kendon, V M; Munro, W J; Kendon, Vivien M; Zyczkowski, Karol; Munro, William J

    2002-01-01

    The entanglement in a pure state of N qudits (d-dimensional distinguishable quantum particles) can be characterised by specifying how entangled its subsystems are. A generally mixed subsystem of m qudits is obtained by tracing over the other N-m qudits. We examine the entanglement in this mixed space of m qudits. We show that for a typical pure state of N qudits, its subsystems smaller than N/3 qudits will have a positive partial transpose and hence are separable or bound entangled. Additionally, our numerical results show that the probability of finding entangled subsystems smaller than N/3 falls exponentially in the dimension of the Hilbert space. The bulk of pure state Hilbert space thus consists of highly entangled states with multipartite entanglement encompassing at least a third of the qudits in the pure state.
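The positive-partial-transpose criterion invoked above can be checked directly for small systems. The following two-qubit sketch is our own illustration, not code from the paper:

```python
import numpy as np

def partial_transpose(rho, dA, dB):
    """Transpose subsystem B of a density matrix on C^dA (x) C^dB."""
    r = rho.reshape(dA, dB, dA, dB)
    return r.transpose(0, 3, 2, 1).reshape(dA * dB, dA * dB)

# Maximally entangled Bell state (|00> + |11>)/sqrt(2).
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi)
evals = np.linalg.eigvalsh(partial_transpose(rho, 2, 2))
# A negative eigenvalue (-1/2 here) means the state fails the PPT test,
# so by the Peres-Horodecki criterion it cannot be separable.
```

A state whose partial transpose stays positive is, as the abstract notes, either separable or at most bound entangled.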

  15. Evaluating the impact of image preprocessing on iris segmentation

    Directory of Open Access Journals (Sweden)

    José F. Valencia-Murillo

    2014-08-01

Full Text Available Segmentation is one of the most important stages in iris recognition systems. In this paper, image preprocessing algorithms are applied in order to evaluate their impact on successful iris segmentation. The preprocessing algorithms are based on histogram adjustment, Gaussian filters and suppression of specular reflections in human eye images. The segmentation method introduced by Masek is applied to 199 images acquired under unconstrained conditions, belonging to the CASIA-irisV3 database, before and after applying the preprocessing algorithms. The impact of the image preprocessing algorithms on the percentage of successful iris segmentation is then evaluated by means of visual inspection of the images, to determine whether the circumferences of the iris and pupil were detected correctly. An increase from 59% to 73% in the percentage of successful iris segmentation is obtained with an algorithm that combines elimination of specular reflections with a Gaussian filter having a 5x5 kernel. The results highlight the importance of a preprocessing stage as a previous step to improve performance during edge detection and iris segmentation.
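A minimal sketch of the winning combination (reflection suppression followed by 5x5 Gaussian smoothing) might look as follows. This is our own simplified illustration on synthetic data; the paper's actual in-painting of reflections and Masek's segmentation are not shown:

```python
import numpy as np

def gaussian_kernel_5x5(sigma=1.0):
    """Separable 5x5 Gaussian kernel, normalised to sum to 1."""
    ax = np.arange(-2, 3)
    g = np.exp(-ax**2 / (2 * sigma**2))
    k = np.outer(g, g)
    return k / k.sum()

def convolve2d_same(img, k):
    """Naive 'same' convolution with edge padding (illustrative only)."""
    pad = k.shape[0] // 2
    p = np.pad(img, pad, mode='edge')
    out = np.zeros_like(img, dtype=float)
    H, W = img.shape
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(p[i:i + 5, j:j + 5] * k)
    return out

def preprocess_eye(img, spec_thresh=240):
    """Replace bright specular highlights with a crude estimate, then smooth."""
    img = img.astype(float).copy()
    mask = img > spec_thresh
    img[mask] = np.median(img[~mask])      # crude in-painting of reflections
    return convolve2d_same(img, gaussian_kernel_5x5())

eye = np.random.default_rng(0).integers(0, 200, (64, 64)).astype(float)
eye[30:33, 30:33] = 255.0                  # synthetic specular spot
out = preprocess_eye(eye)
```

Removing the saturated spot before smoothing prevents the bright reflection from bleeding into neighbouring iris texture, which is what would otherwise confuse edge detection.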

  16. Effect of microaerobic fermentation in preprocessing fibrous lignocellulosic materials.

    Science.gov (United States)

    Alattar, Manar Arica; Green, Terrence R; Henry, Jordan; Gulca, Vitalie; Tizazu, Mikias; Bergstrom, Robby; Popa, Radu

    2012-06-01

Amending soil with organic matter is common in agricultural and logging practices. Such amendments have benefits to soil fertility and crop yields. These benefits may be increased if the material is preprocessed before introduction into soil. We analyzed the efficiency of microaerobic fermentation (MF), also referred to as Bokashi, in preprocessing fibrous lignocellulosic (FLC) organic materials using varying produce amendments and leachate treatments. Adding produce amendments increased leachate production and fermentation rates and decreased the biological oxygen demand of the leachate. Continuously draining leachate without returning it to the fermentors led to acidification and decreased concentrations of polysaccharides (PS) in leachates. PS fragmentation and the production of soluble metabolites and gases stabilized in fermentors in about 2-4 weeks. About 2% of the carbon content was lost as CO2. PS degradation rates, upon introduction of processed materials into soil, were similar to unfermented FLC. Our results indicate that MF is insufficient for adequate preprocessing of FLC material.

  17. Exploration, visualization, and preprocessing of high-dimensional data.

    Science.gov (United States)

    Wu, Zhijin; Wu, Zhiqiang

    2010-01-01

    The rapid advances in biotechnology have given rise to a variety of high-dimensional data. Many of these data, including DNA microarray data, mass spectrometry protein data, and high-throughput screening (HTS) assay data, are generated by complex experimental procedures that involve multiple steps such as sample extraction, purification and/or amplification, labeling, fragmentation, and detection. Therefore, the quantity of interest is not directly obtained and a number of preprocessing procedures are necessary to convert the raw data into the format with biological relevance. This also makes exploratory data analysis and visualization essential steps to detect possible defects, anomalies or distortion of the data, to test underlying assumptions and thus ensure data quality. The characteristics of the data structure revealed in exploratory analysis often motivate decisions in preprocessing procedures to produce data suitable for downstream analysis. In this chapter we review the common techniques in exploring and visualizing high-dimensional data and introduce the basic preprocessing procedures.

  18. Data Preprocessing in Cluster Analysis of Gene Expression

    Institute of Scientific and Technical Information of China (English)

    杨春梅; 万柏坤; 高晓峰

    2003-01-01

    Considering that the DNA microarray technology has generated explosive gene expression data and that it is urgent to analyse and to visualize such massive datasets with efficient methods, we investigate the data preprocessing methods used in cluster analysis, normalization or logarithm of the matrix, by using hierarchical clustering, principal component analysis (PCA) and self-organizing maps (SOMs). The results illustrate that when using the Euclidean distance as measuring metrics, logarithm of relative expression level is the best preprocessing method, while data preprocessed by normalization cannot attain the expected results because the data structure is ruined. If there are only a few principal components, the PCA is an effective method to extract the frame structure, while SOMs are more suitable for a specific structure.
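The contrast between the two preprocessing choices can be sketched as follows: log-transforming multiplicative expression data before PCA versus per-gene normalization. The toy matrix is ours, not one of the paper's datasets:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy expression matrix: 100 genes x 6 conditions with multiplicative noise,
# mimicking the skewed distributions typical of raw microarray intensities.
expr = rng.lognormal(mean=3.0, sigma=1.0, size=(100, 6))

log_data = np.log2(expr)                  # the preprocessing the paper recommends
z_data = (expr - expr.mean(axis=1, keepdims=True)) / expr.std(axis=1, keepdims=True)
# z_data is the per-gene normalization that, per the paper, can ruin structure.

def pca_frame(X, k=2):
    """Project rows onto the top-k principal components via SVD."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

frame = pca_frame(log_data)               # frame structure extracted from log data
```

Feeding `pca_frame` the log-transformed matrix rather than the z-scored one follows the paper's conclusion that, under Euclidean distance, the logarithm of relative expression is the safer preprocessing step.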

  19. Fabrication of Ag/CPs composite material, an effective strategy to improve the photocatalytic performance of coordination polymers under visible irradiation.

    Science.gov (United States)

    Xu, Xinxin; Cui, Zhongping; Qi, Ji; Liu, Xiaoxia

    2013-10-01

To enhance the photocatalytic property of coordination polymers (CPs) in the visible light region, Ag-loaded coordination polymer composite materials (Ag/CPs) were synthesized successfully through a photoreduction reaction of Ag+ on the surface of CPs. Photoluminescence (PL) was used to investigate the separation of photogenerated electron-hole pairs, and the results illustrated that Ag/CPs display higher quantum yields than CPs. This can be attributed to the strong interactions between the Ag nanorods and the coordination polymers, which lead to electron-hole pair separation between the Ag nanorods and CPs. The degradation of Rhodamine B (RhB) was investigated to study the photocatalytic activities. Ag/CPs exhibited excellent photocatalytic activity in both the UV and visible light regions, while CPs can only decompose RhB under UV irradiation. Furthermore, Ag/CPs showed outstanding stability during the degradation of RhB.

  20. Micro-Analyzer: automatic preprocessing of Affymetrix microarray data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Cannataro, Mario

    2013-08-01

A current trend in genomics is the investigation of the cell mechanism using different technologies, in order to explain the relationships among genes, molecular processes and diseases. For instance, the combined use of gene-expression arrays and genomic arrays has been demonstrated as an effective instrument in clinical practice. Consequently, in a single experiment different kinds of microarrays may be used, resulting in the production of different types of binary data (images and textual raw data). The analysis of microarray data requires an initial preprocessing phase that makes raw data suitable for use on existing analysis platforms, such as the TIGR M4 (TM4) Suite. An additional challenge to be faced by emerging data analysis platforms is the ability to treat in a combined way those different microarray formats coupled with clinical data. In fact, the resulting integrated data may include both numerical and symbolic data (e.g. gene expression and SNPs regarding molecular data), as well as temporal data (e.g. the response to a drug, time to progression and survival rate) regarding clinical data. Raw data preprocessing is a crucial step in analysis but is often performed in a manual and error-prone way using different software tools. Thus novel, platform-independent, and possibly open source tools enabling the semi-automatic preprocessing and annotation of different microarray data are needed. The paper presents Micro-Analyzer (Microarray Analyzer), a cross-platform tool for the automatic normalization, summarization and annotation of Affymetrix gene expression and SNP binary data. It represents the evolution of the μ-CS tool, extending the preprocessing to SNP arrays that were not supported in μ-CS. Micro-Analyzer is provided as a Java standalone tool and enables users to read, preprocess and analyse binary microarray data (gene expression and SNPs) by invoking the TM4 platform. It avoids: (i) the manual invocation of external tools (e.g. the Affymetrix Power

  1. Automated glycan assembly of a S. pneumoniae serotype 3 CPS antigen

    Science.gov (United States)

    Weishaupt, Markus W; Matthies, Stefan; Hurevich, Mattan; Pereira, Claney L; Hahm, Heung Sik

    2016-01-01

Vaccines against S. pneumoniae, one of the most prevalent bacterial infections causing severe disease, rely on isolated capsular polysaccharides (CPS) that are conjugated to proteins. Such isolates contain a heterogeneous oligosaccharide mixture of different chain lengths and frame shifts. Access to defined synthetic S. pneumoniae CPS structures is desirable. Known syntheses of S. pneumoniae serotype 3 CPS rely on a time-consuming and low-yielding late-stage oxidation step, or use disaccharide building blocks, which limits variability. Herein, we report the first iterative automated glycan assembly (AGA) of a conjugation-ready S. pneumoniae serotype 3 CPS trisaccharide. This oligosaccharide was assembled using a novel glucuronic acid building block to circumvent the need for a late-stage oxidation. The introduction of a washing step with the activator prior to each glycosylation cycle greatly increased the yields by neutralizing any residual base from the deprotection steps in the synthetic cycle. This process improvement is applicable to AGA of many other oligosaccharides. PMID:27559395

  2. 77 FR 58510 - Proposed Information Collection; Comment Request; Current Population Survey (CPS), Annual Social...

    Science.gov (United States)

    2012-09-21

    ... Survey (CPS), Annual Social and Economic Supplement (ASEC) AGENCY: U.S. Census Bureau, Commerce. ACTION... Annual Social and Economic Supplement (ASEC) to be conducted in conjunction with the February, March, and... casual attachment to the labor market. The income data from the ASEC are used by social...

  3. Automated glycan assembly of a S. pneumoniae serotype 3 CPS antigen

    Directory of Open Access Journals (Sweden)

    Markus W. Weishaupt

    2016-07-01

Full Text Available Vaccines against S. pneumoniae, one of the most prevalent bacterial infections causing severe disease, rely on isolated capsular polysaccharides (CPS) that are conjugated to proteins. Such isolates contain a heterogeneous oligosaccharide mixture of different chain lengths and frame shifts. Access to defined synthetic S. pneumoniae CPS structures is desirable. Known syntheses of S. pneumoniae serotype 3 CPS rely on a time-consuming and low-yielding late-stage oxidation step, or use disaccharide building blocks, which limits variability. Herein, we report the first iterative automated glycan assembly (AGA) of a conjugation-ready S. pneumoniae serotype 3 CPS trisaccharide. This oligosaccharide was assembled using a novel glucuronic acid building block to circumvent the need for a late-stage oxidation. The introduction of a washing step with the activator prior to each glycosylation cycle greatly increased the yields by neutralizing any residual base from the deprotection steps in the synthetic cycle. This process improvement is applicable to AGA of many other oligosaccharides.

  4. The CPS Plasma Award at the Intel Science and Engineering Fair

    Science.gov (United States)

    Berry, Lee

    2012-10-01

For the past eight years, the Coalition for Plasma Science (CPS) has presented an award for a plasma project at the Intel International Science and Engineering Fair (ISEF). We reported on the first five years of this award at the 2009 DPP Symposium. Pulsed neutron-producing experiments are a recurring topic, with the efforts now turning to applications. The most recent award, at the Pittsburgh ISEF this past May, was given for analysis of data from Brookhaven's Relativistic Heavy Ion Collider, with the goal of understanding the fluid properties of the quark-gluon plasma. All of the CPS award-winning projects so far have been based on experiments, with four awards going to women students and four to men. In 2009 we noted that the number and quality of projects was improving. Since then, as we predicted (and hoped), that trend has continued. The CPS looks forward to continuing its work with students who are excited about the possibilities of plasma. You too can share this excitement by judging at the 2013 fair in Phoenix on May 12-17. Information may be obtained by emailing cps@plasmacoalition.org.

  5. Working to My Potential: The Postsecondary Experiences of CPS Students in the International Baccalaureate Diploma Programme

    Science.gov (United States)

    Coca, Vanessa; Johnson, David; Kelley-Kemple, Thomas; Roderick, Melissa; Moeller, Eliza; Williams, Nicole; Moragne, Kafi

    2012-01-01

    In 1997, Chicago Public Schools (CPS) announced an ambitious plan to open 13 International Baccalaureate Diploma Programs (IBDP) in neighborhood high schools throughout the city. Hoping to replicate the success achieved in the long-standing IB program at Lincoln Park High School, the scale of the IB experiment was unmatched by any other school…

  6. Wikis for a Collaborative Problem-Solving (CPS) Module for Secondary School Science

    Science.gov (United States)

    DeWitt, Dorothy; Alias, Norlidah; Siraj, Saedah; Spector, Jonathan Michael

    2017-01-01

    Collaborative problem solving (CPS) can support online learning by enabling interactions for social and cognitive processes. Teachers may not have sufficient knowledge to support such interactions, so support needs to be designed into learning modules for this purpose. This study investigates to what extent an online module for teaching nutrition…

7. MPM4CPS: multi-paradigm modelling for cyber-physical systems

    NARCIS (Netherlands)

    Vangeheluwe, Hans; Ameral, Vasco; Giese, Holger; Broenink, Johannes F.; Schätz, Bernhard; Norta, Alexander; Carreira, Paulo; Lukovic, Ivan; Mayerhofer, Tanja; Wimmer, Manuel; Vellecillo, Antonio

    2016-01-01

    The last decades have seen the emergence of truly complex, designed systems, known as Cyber-Physical Systems (CPS). Engineering such systems requires integrating physical, software, and network aspects. To date, neither a unifying theory nor systematic design methods, techniques and tools exist to m

8. MPM4CPS: multi-paradigm modelling for cyber-physical systems

    NARCIS (Netherlands)

    Vangeheluwe, Hans; Ameral, Vasco; Giese, Holger; Broenink, Jan; Schätz, Bernhard; Norta, Alexander; Carreira, Paulo; Lukovic, Ivan; Mayerhofer, Tanja; Wimmer, Manuel; Vellecillo, Antonio

    2016-01-01

    The last decades have seen the emergence of truly complex, designed systems, known as Cyber-Physical Systems (CPS). Engineering such systems requires integrating physical, software, and network aspects. To date, neither a unifying theory nor systematic design methods, techniques and tools exist to m

  9. 78 FR 45910 - Proposed Information Collection; Comment Request; Current Population Survey (CPS) Email Address...

    Science.gov (United States)

    2013-07-30

    ... Census Bureau Proposed Information Collection; Comment Request; Current Population Survey (CPS) Email... concerning the November 2013 Email Address Collection Test Supplement. The Census Bureau and the Bureau of.... We foresee that in the future, we could collect email addresses from our respondents. For those that...

  10. The Effectiveness of CPS-ALM Model in Enhancing Statistical Literacy Ability and Self Concept of Elementary School Student Teacher

    Science.gov (United States)

    Takaria, J.; Rumahlatu, D.

    2016-01-01

The focus of this study is to examine comprehensively the enhancement of statistical literacy and self-concept of elementary school student teachers through the CPS-ALM model, with the enhancement measured through N-gain. The results of the study indicate that the use of the Collaborative Problem Solving model assisted by literacy media (CPS-ALM) model…

  11. On similitude law of sub-systems

    Institute of Scientific and Technical Information of China (English)

    Zach Liang; George C. Lee

    2006-01-01

Based on Buckingham's π-Theorem, dimensional analysis has achieved considerable success over the past near-century. Model testing has long been a powerful tool in both scientific studies and engineering applications. However, prototype objects are becoming more and more complicated nowadays, and many prototype systems can contain several sub-systems. The conventional theories on model-prototype similarity and dimensional analysis have only limited application since the π-Theorem itself does not distinguish between the original system and subsystems. This is particularly true in the field of structural dynamics, where the structure is often modeled as a multi-degree-of-freedom system. In this paper, we attempt to show that, if a system can be decoupled into several nontrivial subsystems, then, in each subsystem, the number of π-terms will be reduced, which simplifies the model testing. On the other hand, if a system cannot be decoupled into subsystems, then using model testing with reduced π-term analysis, both experimentally and theoretically, may introduce severe errors.
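For any (sub)system, the number of π-terms follows from the rank of its dimensional matrix: π-terms = variables minus rank. A small sketch for the classic pendulum-period example (our illustration, not from the paper):

```python
import numpy as np

# Columns: exponents of the base dimensions (M, L, T) for each variable of
# the pendulum-period problem: t (period), l (length), g (gravity), m (mass).
dim = np.array([[0, 0,  0, 1],    # mass exponent
                [0, 1,  1, 0],    # length exponent
                [1, 0, -2, 0]])   # time exponent

n_vars = dim.shape[1]
n_pi = n_vars - np.linalg.matrix_rank(dim)   # Buckingham: number of pi-terms
# n_pi == 1: the single dimensionless group t * sqrt(g / l)
```

Decoupling a system into subsystems shrinks each subsystem's variable list (and hence its π-term count), which is exactly why the paper argues model testing becomes simpler per subsystem.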

  12. Image preprocessing study on KPCA-based face recognition

    Science.gov (United States)

    Li, Xuan; Li, Dehua

    2015-12-01

Face recognition, as an important biometric identification method with friendly, natural and convenient advantages, has attracted more and more attention. This paper investigates a face recognition system comprising face detection, feature extraction and face recognition, mainly by studying the theory and key techniques of various preprocessing methods in the face detection process and, using the KPCA method, focusing on the recognition results obtained with different preprocessing methods. We choose the YCbCr color space for skin segmentation and integral projection for face location. We use the opening and closing operations of erosion and dilation together with an illumination compensation method to preprocess the face images, and then apply face recognition based on kernel principal component analysis, with experiments carried out on a typical face database. The algorithms were implemented on the MATLAB platform. Experimental results show that the kernel extension of the PCA algorithm, as a nonlinear feature extraction method, can under certain conditions make the extracted features represent the original image information better and thus achieve a higher recognition rate. In the image preprocessing stage, we found that different operations on the images may yield different results, and thus different recognition rates in the recognition stage. At the same time, in kernel principal component analysis, the power of the polynomial kernel function can affect the recognition result.
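A minimal polynomial-kernel KPCA can be written in a few lines; the sketch below (our own, with arbitrary parameters and random data) shows where the polynomial degree mentioned in the abstract enters:

```python
import numpy as np

def kernel_pca_poly(X, degree=2, c=1.0, k=2):
    """Kernel PCA with the polynomial kernel (x_i . x_j + c)^degree."""
    n = len(X)
    K = (X @ X.T + c) ** degree                   # degree controls nonlinearity
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                                # centre in feature space
    evals, evecs = np.linalg.eigh(Kc)
    idx = np.argsort(evals)[::-1][:k]             # top-k components
    alphas = evecs[:, idx] / np.sqrt(np.maximum(evals[idx], 1e-12))
    return Kc @ alphas                            # projections of training points

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))                      # stand-in for face feature vectors
features = kernel_pca_poly(X, degree=3, k=2)
```

Varying `degree` changes the implicit feature space, which is consistent with the paper's observation that the power of the polynomial kernel affects the recognition rate.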

  13. Pre-Processing Rules for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den

    2003-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of a network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.
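One classic rule of this kind (shown here as an illustration; the abstract does not reproduce the paper's full rule set) is the elimination of simplicial vertices, i.e. vertices whose neighbourhood already forms a clique: they can be removed up front while recording the clique size they induce.

```python
def is_simplicial(adj, v):
    """A vertex is simplicial if its neighbourhood induces a clique."""
    nbrs = list(adj[v])
    return all(b in adj[a] for i, a in enumerate(nbrs) for b in nbrs[i + 1:])

def eliminate_simplicial(adj):
    """Repeatedly remove simplicial vertices; return the removal order,
    the largest clique size seen, and the remaining (reduced) graph."""
    adj = {v: set(ns) for v, ns in adj.items()}
    order, max_clique = [], 0
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if is_simplicial(adj, v):
                max_clique = max(max_clique, len(adj[v]) + 1)
                for u in adj[v]:
                    adj[u].discard(v)
                del adj[v]
                order.append(v)
                changed = True
    return order, max_clique, adj

# A triangle with a pendant vertex: every vertex eventually becomes
# simplicial, so pre-processing solves this instance completely.
g = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
order, max_clique, rest = eliminate_simplicial(g)
print(max_clique, len(rest))  # 3 0
```

When the rule empties the graph, the recorded elimination order is itself a perfect elimination order, so no further triangulation search is needed.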

  14. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of a network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.

  15. The minimal preprocessing pipelines for the Human Connectome Project.

    Science.gov (United States)

    Glasser, Matthew F; Sotiropoulos, Stamatios N; Wilson, J Anthony; Coalson, Timothy S; Fischl, Bruce; Andersson, Jesper L; Xu, Junqian; Jbabdi, Saad; Webster, Matthew; Polimeni, Jonathan R; Van Essen, David C; Jenkinson, Mark

    2013-10-15

    The Human Connectome Project (HCP) faces the challenging task of bringing multiple magnetic resonance imaging (MRI) modalities together in a common automated preprocessing framework across a large cohort of subjects. The MRI data acquired by the HCP differ in many ways from data acquired on conventional 3 Tesla scanners and often require newly developed preprocessing methods. We describe the minimal preprocessing pipelines for structural, functional, and diffusion MRI that were developed by the HCP to accomplish many low level tasks, including spatial artifact/distortion removal, surface generation, cross-modal registration, and alignment to standard space. These pipelines are specially designed to capitalize on the high quality data offered by the HCP. The final standard space makes use of a recently introduced CIFTI file format and the associated grayordinate spatial coordinate system. This allows for combined cortical surface and subcortical volume analyses while reducing the storage and processing requirements for high spatial and temporal resolution data. Here, we provide the minimum image acquisition requirements for the HCP minimal preprocessing pipelines and additional advice for investigators interested in replicating the HCP's acquisition protocols or using these pipelines. Finally, we discuss some potential future improvements to the pipelines.

  16. OPSN: The IMS COMSYS 1 and 2 Data Preprocessing System.

    Science.gov (United States)

    Yu, John

    The Instructional Management System (IMS) developed by the Southwest Regional Laboratory (SWRL) processes student and teacher-generated data through the use of an optical scanner that produces a magnetic tape (Scan Tape) for input to IMS. A series of computer routines, OPSN, preprocesses the Scan Tape and prepares the data for transmission to the…

  17. An effective measured data preprocessing method in electrical impedance tomography.

    Science.gov (United States)

    Yu, Chenglong; Yue, Shihong; Wang, Jianpei; Wang, Huaxiang

    2014-01-01

    As an advanced process detection technology, electrical impedance tomography (EIT) has received wide attention and study in industrial fields, but EIT techniques are greatly limited by low spatial resolution. This problem may result from incorrect preprocessing of the measured data and the lack of a general criterion for evaluating different preprocessing procedures. In this paper, an EIT data preprocessing method based on taking roots of all measured data is proposed, and it is evaluated by two indexes constructed on the root-transformed measurements. By finding the optima of the two indexes, the proposed method can be applied to improve EIT imaging spatial resolution. For a theoretical model, the optimal rooting exponents of the two indexes range over [0.23, 0.33] and [0.22, 0.35], respectively. The factors that affect the correctness of the proposed method are also analyzed. Since preprocessing of measured data is necessary and helpful for any imaging process, the proposed method can be used generally and widely. Experimental results validate the two proposed indexes.

  18. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of a network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum clique size.

  19. Embedded Thermal Control for Spacecraft Subsystems Miniaturization

    Science.gov (United States)

    Didion, Jeffrey R.

    2014-01-01

    Optimization of spacecraft size, weight, and power (SWaP) resources is an explicit technical priority at Goddard Space Flight Center. Embedded thermal control subsystems are a promising technology with many cross-cutting NASA, DoD, and commercial applications: 1) CubeSat/SmallSat spacecraft architectures, 2) high-performance computing, 3) on-board spacecraft electronics, and 4) power electronics and RF arrays. The embedded thermal control subsystem technology development efforts focus on component-, board-, and enclosure-level devices that will ultimately include intelligent capabilities. The presentation discusses electric, capillary, and hybrid hardware research and development efforts at Goddard Space Flight Center. The Embedded Thermal Control Subsystem development program consists of interrelated sub-initiatives, e.g., chip/component-level thermal control devices, self-sensing thermal management, and advanced manufactured structures. This presentation includes technical status and progress on each of these investigations. Future sub-initiatives, technical milestones, and program goals are also presented.

  20. On the SOR method with overlapping subsystems

    Science.gov (United States)

    Maleev, A. A.

    2006-06-01

    A description is given of the iterative Jacobi method with overlapping subsystems and the corresponding Gauss-Seidel method. As in the classical case, a generalized SOR method with overlapping subsystems is constructed by introducing a relaxation parameter. The concept of an ω-consistent matrix is defined. It is shown that, with the optimal choice of the parameter, the theory developed by Young remains valid for ω-consistent matrices. This implies certain results for ω-consistent H-matrices. The theoretical conclusions obtained in the paper are supported by numerical results.
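For reference, the classical (non-overlapping) SOR update that the paper generalizes can be sketched as follows; the overlapping-subsystem variant replaces this per-equation sweep with sweeps over possibly shared blocks of equations.

```python
def sor(A, b, omega, tol=1e-10, max_iter=10_000):
    """Successive over-relaxation for A x = b (A with nonzero diagonal)."""
    n = len(b)
    x = [0.0] * n
    for _ in range(max_iter):
        max_dx = 0.0
        for i in range(n):
            # Gauss-Seidel value, then relax toward it with factor omega.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_new = (1 - omega) * x[i] + omega * (b[i] - s) / A[i][i]
            max_dx = max(max_dx, abs(x_new - x[i]))
            x[i] = x_new
        if max_dx < tol:
            break
    return x

# Diagonally dominant tridiagonal system; exact solution is x = (1, 2, -1).
A = [[4.0, -1.0, 0.0],
     [-1.0, 4.0, -1.0],
     [0.0, -1.0, 4.0]]
b = [2.0, 8.0, -6.0]
x = sor(A, b, omega=1.1)
print([round(v, 6) for v in x])  # [1.0, 2.0, -1.0]
```

Setting omega = 1 recovers plain Gauss-Seidel; Young's theory concerns the choice of omega that minimizes the spectral radius of the iteration matrix.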

  1. STRATEGI PEMBELAJARAN CREATIVE PROBLEM SOLVING (CPS) BERBASIS EKSPERIMEN UNTUK MENINGKATKAN KEMAMPUAN KOGNITIF DAN KETERAMPILAN BERPIKIR KREATIF

    Directory of Open Access Journals (Sweden)

    Ahmad Busyairi

    2015-09-01

    Full Text Available This study aimed to describe the development of students' cognitive abilities and creative thinking skills in problem solving after treatment with experiment-based CPS (Creative Problem Solving) learning versus conventional learning. The method used is quasi-experimental with a randomized pretest-posttest control group design. The sample consisted of 58 high school students divided into two classes (29 in the experimental class and 29 in the control class). The collected data were analyzed using N-gain calculations, t-tests, and effect size calculations. The results show that the cognitive abilities of both classes increased in the moderate category. For creative thinking skills in problem solving, the experimental class improved in the moderate category while the control class improved in the low category. Hypothesis testing shows that experiment-based CPS learning significantly improves students' cognitive abilities and creative problem-solving skills compared with conventional learning. In addition, the effect size calculations indicate that experiment-based CPS learning is effective in improving these abilities, with a moderate effect.

  2. The charged particle accelerators subsystems modeling

    Science.gov (United States)

    Averyanov, G. P.; Kobylyatskiy, A. V.

    2017-01-01

    We present a web-based resource for information support of engineering, science, and education in electrophysics, containing web-based tools for simulating charged particle accelerator subsystems. We formulate the motivation for developing a web environment for virtual electrophysical laboratories and analyze trends in the design of dynamic web environments for supporting scientific research and e-learning within the framework of the Open Education concept.

  3. Report on the ICES subsystem FLOWS

    NARCIS (Netherlands)

    Booij, N.

    1980-01-01

    FLOWS is a recently developed ICES subsystem for applications in the field of hydraulic engineering. It can compute steady as well as unsteady flows in hydraulic networks of given dimensions; that is, it does not optimize any parameters describing the network. The network may be composed of…

  4. Integrating the autonomous subsystems management process

    Science.gov (United States)

    Ashworth, Barry R.

    1992-01-01

    Ways in which the ranking of the Space Station Module Power Management and Distribution testbed may be achieved and an individual subsystem's internal priorities may be managed within the complete system are examined. The application of these results in the integration and performance leveling of the autonomously managed system is discussed.

  5. Spectroscopic subsystems in nearby wide binaries

    CERN Document Server

    Tokovinin, Andrei

    2015-01-01

    Radial velocity (RV) monitoring of solar-type visual binaries has been conducted at the CTIO/SMARTS 1.5-m telescope to study short-period systems. Data reduction is described, mean and individual RVs of 163 observed objects are given. New spectroscopic binaries are discovered or suspected in 17 objects, for some of them orbital periods could be determined. Subsystems are efficiently detected even in a single observation by double lines and/or by the RV difference between the components of visual binaries. The potential of this detection technique is quantified by simulation and used for statistical assessment of 96 wide binaries within 67 pc. It is found that 43 binaries contain at least one subsystem and the occurrence of subsystems is equally probable in either primary or secondary components. The frequency of subsystems and their periods match the simple prescription proposed by the author (2014, AJ, 147, 87). The remaining 53 simple wide binaries with a median projected separation of 1300 AU have the distri...

  6. Debris measure subsystem of the nanosatellite IRECIN

    Science.gov (United States)

    Ferrante, M.; di Ciolo, L.; Ortenzi, A.; Petrozzi, M.; del Re, V.

    2003-09-01

    On-board resources for performing mission tasks are very limited in nanosatellites. This paper proposes a real-time electronic system that acquires space debris measurements using a piezoelectric sensor; it measures debris and micrometeoroid mass and velocity. The described device is a subsystem on board the IRECIN nanosatellite, composed mainly of a RISC microprocessor, an electronic front end that interfaces to the debris sensor to provide a low-noise signal in a range suitable for a 12-bit ADC, and a memory for storing the data. The microprocessor manages the debris measurement system, recording the number of impacts, their intensity, and their waveforms. The subsystem communicates with the other IRECIN subsystems through the I2C bus, principally with the main microprocessor subsystem, allowing data download directly to the ground station; it also relieves the main microprocessor board of the management of debris data. All electronic components use SMD technology to reduce weight and size. The electronic boards were developed, built, and tested at Vitrociset S.p.A. under the supervision of its Research and Development Group. The proposed system is implemented on IRECIN, a modular nanosatellite weighing less than 1.5 kg, consisting of sixteen external sides with surface-mounted solar cells and three internal Al plates held together by four steel bars. Lithium-ion batteries are added for eclipse operations. Attitude is determined by two three-axis magnetometers and the solar panel data, and control is provided by an active magnetic control system. The spacecraft will be spin-stabilized with the spin axis normal to the orbit.

  7. The Digital Electronic Subsystem of Marsis

    Science.gov (United States)

    Maltecca, L.; Pecora, M.; Scandelli, L.

    MARSIS (Mars Advanced Radar for Subsurface and Ionospheric Sounding) is one of the instruments of the ESA Mars Express mission, to be launched in June 2003 on a Soyuz/Fregat. Its primary objective is to map the distribution of water, both liquid and solid, in the upper portions of the crust of Mars. Secondary objectives are subsurface geologic probing, surface characterisation, and ionosphere sounding. The MARSIS instrument is a low-frequency, nadir-looking, pulse-limited radar sounder and altimeter with ground penetration capabilities, which uses synthetic aperture techniques and a secondary receiving antenna to isolate subsurface reflections. Functionally, and also from the standpoint of the responsibilities of each organisation involved in MARSIS, the instrument can be split into three subsystems: the antenna (ANT), the radio frequency subsystem (RFS, TX+RX), and the digital electronics subsystem (DES). MARSIS is an international co-operation between the Italian Space Agency (ASI) and the National Aeronautics and Space Administration (NASA). The experiment has an Italian Principal Investigator (from the Infocom Dept. of the University of Rome "La Sapienza"), a U.S. Co-PI (from the Jet Propulsion Laboratory), and Co-Is from Italy, the U.S., and other countries. Italy leads the experiment definition with the participation of the U.S. In particular, Alenia Spazio/Rome is the prime contractor of the industrial team and also the supplier of part of the RF subsystem. Laben (a Finmeccanica company) is the supplier of the Digital Electronics Subsystem (DES), including its basic and application software, as a subcontractor of Alenia Spazio. The purpose of this paper is to describe the DES from the hardware and software points of view, including the test equipment and the special simulator developed for DES validation.

  8. Total synthesis of a Streptococcus pneumoniae serotype 12F CPS repeating unit hexasaccharide

    Directory of Open Access Journals (Sweden)

    Peter H. Seeberger

    2017-01-01

    Full Text Available The Gram-positive bacterium Streptococcus pneumoniae causes severe disease globally. Vaccines that prevent S. pneumoniae infections induce antibodies against epitopes within the bacterial capsular polysaccharide (CPS). A better immunological understanding of the epitopes that protect from bacterial infection requires defined oligosaccharides obtained by total synthesis. The key to the synthesis of the S. pneumoniae serotype 12F CPS hexasaccharide repeating unit that is not contained in currently used glycoconjugate vaccines is the assembly of the trisaccharide β-D-GalpNAc-(1→4)-[α-D-Glcp-(1→3)]-β-D-ManpNAcA, in which the branching points are equipped with orthogonal protecting groups. A linear approach relying on the sequential assembly of monosaccharide building blocks proved superior to a convergent [3 + 3] strategy that was not successful due to steric constraints. The synthetic hexasaccharide is the starting point for further immunological investigations.

  9. Total synthesis of a Streptococcus pneumoniae serotype 12F CPS repeating unit hexasaccharide

    Science.gov (United States)

    Pereira, Claney L; Govindan, Subramanian

    2017-01-01

    The Gram-positive bacterium Streptococcus pneumoniae causes severe disease globally. Vaccines that prevent S. pneumoniae infections induce antibodies against epitopes within the bacterial capsular polysaccharide (CPS). A better immunological understanding of the epitopes that protect from bacterial infection requires defined oligosaccharides obtained by total synthesis. The key to the synthesis of the S. pneumoniae serotype 12F CPS hexasaccharide repeating unit that is not contained in currently used glycoconjugate vaccines is the assembly of the trisaccharide β-D-GalpNAc-(1→4)-[α-D-Glcp-(1→3)]-β-D-ManpNAcA, in which the branching points are equipped with orthogonal protecting groups. A linear approach relying on the sequential assembly of monosaccharide building blocks proved superior to a convergent [3 + 3] strategy that was not successful due to steric constraints. The synthetic hexasaccharide is the starting point for further immunological investigations.

  10. Mismatch between the PSB and CPS due to the present vertical recombination scheme

    CERN Document Server

    Jansson, A

    1997-01-01

    The production of the nominal LHC beam will demand optimum emittance preservation between individual machines in the injection chain. The edge effects at the entry and exit of the bending magnets used for the vertical recombination of the four PS Booster rings to the level of the CPS result in a small uncompensated mismatch, different for each ring. We present recent measurements of the mismatch done in the PSB measurement line.

  11. Effect of eccentric location of the RBMK CPS displacer graphite block in the shielding sheath

    Energy Technology Data Exchange (ETDEWEB)

    Dostov, A.I. [Russian Research Centre ' ' Kurchatov Institute' ' (Russian Federation)

    2001-07-01

    Temperature conditions and the accumulation of Wigner energy in the graphite block of the RBMK reactor CPS (control and protection system) displacer are examined. It is shown that, with an eccentric location of the block in the shielding sheath, the average temperature of the block drops sharply. Due to this design shortcoming, the quantity of energy stored in the block may be so great that its release would result in melting of the displacer tube. (author)

  12. Research on pre-processing of QR Code

    Science.gov (United States)

    Sun, Haixing; Xia, Haojie; Dong, Ning

    2013-10-01

    QR code encodes many kinds of information thanks to its advantages: large storage capacity, high reliability, omnidirectional high-speed reading, small printing size, highly efficient representation of Chinese characters, etc. In order to obtain a clearer binarized image from a complex background and improve the recognition rate of QR code, this paper researches preprocessing methods for QR (Quick Response) codes and presents algorithms and results of image preprocessing for QR code recognition. The conventional method is improved by modifying Sauvola's adaptive text binarization method. Additionally, a QR code extraction step that adapts to different image sizes and a flexible image correction approach are introduced, improving the efficiency and accuracy of QR code image processing.
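Sauvola's adaptive threshold, the method the paper modifies, computes a local threshold T = m * (1 + k * (s/R - 1)) from the local mean m and standard deviation s. A minimal pure-Python sketch (the window size, k, and R below are common illustrative defaults, not the paper's tuned values):

```python
def sauvola_binarize(img, window=3, k=0.2, R=128.0):
    """Sauvola adaptive thresholding: T = m * (1 + k * (s/R - 1)),
    where m and s are the local mean and standard deviation."""
    h, w = len(img), len(img[0])
    half = window // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Gather the (clipped) local window around pixel (x, y).
            vals = [img[yy][xx]
                    for yy in range(max(0, y - half), min(h, y + half + 1))
                    for xx in range(max(0, x - half), min(w, x + half + 1))]
            m = sum(vals) / len(vals)
            s = (sum((v - m) ** 2 for v in vals) / len(vals)) ** 0.5
            t = m * (1 + k * (s / R - 1))
            out[y][x] = 255 if img[y][x] > t else 0
    return out

# Tiny grayscale patch: dark "module" pixels on a bright background.
patch = [
    [200, 200, 200, 200],
    [200,  30,  40, 200],
    [200,  35,  25, 200],
    [200, 200, 200, 200],
]
binary = sauvola_binarize(patch)
print(binary[1][1], binary[0][0])  # 0 255
```

Because the threshold adapts to local statistics, dark modules stay black and bright background stays white even when illumination varies across the image.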

  13. An Efficient and Configurable Preprocessing Algorithm to Improve Stability Analysis.

    Science.gov (United States)

    Sesia, Ilaria; Cantoni, Elena; Cernigliaro, Alice; Signorile, Giovanna; Fantino, Gianluca; Tavella, Patrizia

    2016-04-01

    The Allan variance (AVAR) is widely used to measure the stability of experimental time series. Specifically, AVAR is commonly used in space applications such as monitoring the clocks of the global navigation satellite systems (GNSSs). In these applications, the experimental data present some peculiar aspects which are not generally encountered when the measurements are carried out in a laboratory. Space clocks' data can in fact present outliers, jumps, and missing values, which corrupt the clock characterization. Therefore, an efficient preprocessing is fundamental to ensure a proper data analysis and improve the stability estimation performed with the AVAR or other similar variances. In this work, we propose a preprocessing algorithm and its implementation in a robust software code (in MATLAB language) able to deal with time series of experimental data affected by nonstationarities and missing data; our method is properly detecting and removing anomalous behaviors, hence making the subsequent stability analysis more reliable.
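For context, the overlapping Allan variance that such preprocessing feeds can be computed from phase data in a few lines (a generic sketch, not the authors' MATLAB code):

```python
def allan_variance(phase, tau0, m):
    """Overlapping Allan variance of phase data x (in seconds) for the
    averaging time tau = m * tau0, using second differences of x."""
    n = len(phase)
    if n < 2 * m + 1:
        raise ValueError("not enough data for this averaging factor")
    tau = m * tau0
    d2 = [phase[i + 2 * m] - 2 * phase[i + m] + phase[i]
          for i in range(n - 2 * m)]
    return sum(v * v for v in d2) / (2 * tau ** 2 * len(d2))

# A clock with a pure frequency offset: phase grows linearly, so the
# second differences cancel and the Allan variance is essentially zero.
drift = [1e-9 * i for i in range(100)]
print(allan_variance(drift, tau0=1.0, m=10))  # ~0

# Alternating phase jitter gives a strictly positive variance.
jitter = [(-1) ** i * 1e-9 for i in range(100)]
print(allan_variance(jitter, tau0=1.0, m=1) > 0)  # True
```

Outliers, jumps, and missing samples corrupt exactly these second differences, which is why the preprocessing step described in the abstract matters for the stability estimate.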

  14. Adaptive fingerprint image enhancement with emphasis on preprocessing of data.

    Science.gov (United States)

    Bartůnek, Josef Ström; Nilsson, Mikael; Sällberg, Benny; Claesson, Ingvar

    2013-02-01

    This article proposes several improvements to an adaptive fingerprint enhancement method that is based on contextual filtering. The term adaptive implies that parameters of the method are automatically adjusted based on the input fingerprint image. Five processing blocks comprise the adaptive fingerprint enhancement method, four of which are updated in our proposed system; hence, the overall system is novel. The four updated processing blocks are: 1) preprocessing; 2) global analysis; 3) local analysis; and 4) matched filtering. In the preprocessing and local analysis blocks, a nonlinear dynamic range adjustment method is used. In the global analysis and matched filtering blocks, different forms of order-statistical filters are applied. Together, these processing blocks yield a new and improved adaptive fingerprint image enhancement method. The performance of the updated processing blocks is presented in the evaluation part of this paper. The algorithm is evaluated against the NIST-developed NBIS software for fingerprint recognition on the FVC databases.

  15. Keefektifan Model Kooperatif Tipe Make A Match dan Model CPS Terhadap Kemampuan Pemecahan Masalah dan Motivasi Belajar

    Directory of Open Access Journals (Sweden)

    Nur Fitri Amalia

    2013-12-01

    Full Text Available The purpose of this study was to determine the effectiveness of the cooperative Make a Match model and the CPS model with respect to the problem-solving ability and learning motivation of grade X students on the topic of quadratic equations and functions. The population was the grade X students of SMA N 1 Subah in the 2013/2014 academic year. The sample was taken by random sampling. Class X8 was selected as experimental class I, taught with the cooperative Make a Match model, and class X7 was selected as experimental class II, taught with the CPS model. Data were obtained through a test and a questionnaire and then analyzed using a proportion test and a t-test. The results are: (1) the cooperative Make a Match model is effective for problem-solving ability; (2) the CPS model is effective for problem-solving ability; (3) the cooperative Make a Match model is better than the CPS model for problem-solving ability; (4) the CPS model is better than the cooperative Make a Match model for learning motivation. Keywords: Make a Match; CPS; problem solving; motivation

  16. Two novel CPs with double helical chains based rigid tripodal ligands: Syntheses, crystal structures, magnetic susceptibility and fluorescence properties

    Science.gov (United States)

    Wang, Xiao; Hou, Xiang-Yang; Zhai, Quan-Guo; Hu, Man-Cheng

    2016-11-01

    Two three-dimensional coordination polymers (CPs), namely [Cd(bpydb)(H2bpydb)]n·0.5nH2O (1) and [Cu2(bpydb)2]n (2) (H2bpydb = 2,6-di(p-carboxyphenyl)-4,4'-bipyridine), containing novel double-helical chains, were solvothermally synthesized, characterized, and structurally determined. CPs 1-2 exhibit new (3,5)-net and (3,6)-net (alb) topologies, respectively. The fluorescence properties of CPs 1-2 were investigated, and magnetic susceptibility measurements indicate that compound 1 has dominant antiferromagnetic coupling between the metal ions.

  17. Linguistic Preprocessing and Tagging for Problem Report Trend Analysis

    Science.gov (United States)

    Beil, Robert J.; Malin, Jane T.

    2012-01-01

    Mr. Robert Beil, Systems Engineer at Kennedy Space Center (KSC), requested the NASA Engineering and Safety Center (NESC) develop a prototype tool suite that combines complementary software technology used at Johnson Space Center (JSC) and KSC for problem report preprocessing and semantic tag extraction, to improve input to data mining and trend analysis. This document contains the outcome of the assessment and the Findings, Observations and NESC Recommendations.

  18. Research on Digital Watermark Using Pre-Processing Technology

    Institute of Scientific and Technical Information of China (English)

    Ru Guo-bao; Niu Hui-fang; Yang Rui; Sun Hong; Shi Hong-ling; Huang Tian-xi

    2003-01-01

    We have realized a watermark embedding system based on audio perceptual masking and propose a watermark detection system using pre-processing technology. With this method, the watermark can be detected from watermarked audio without the original audio. The results indicate that this embedding and detection method is robust: without affecting hearing quality, it can resist attacks such as MPEG compression, filtering, and added white noise.

  19. Biosignal data preprocessing: a voice pathology detection application

    Directory of Open Access Journals (Sweden)

    Genaro Daza Santacoloma

    2010-05-01

    Full Text Available A methodology for biosignal data preprocessing is presented. Experiments were mainly carried out with voice signals for automatically detecting pathologies. The proposed methodology is structured around three elements: outlier detection, normality verification, and distribution transformation. It improved classification performance when basic assumptions about the data structure were met. This entailed more accurate detection of voice pathologies and reduced the computational complexity of the classification algorithms. Classification performance improved by 15%.
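The three elements named above can be sketched generically (an illustration with assumed thresholds, not the paper's exact procedure): remove z-score outliers, check a normality indicator such as skewness, and apply a variance-stabilizing transform when the check fails.

```python
import math

def zscore_outliers(xs, z=2.0):
    """Keep only points within z standard deviations of the mean."""
    m = sum(xs) / len(xs)
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return [x for x in xs if s == 0 or abs(x - m) <= z * s]

def skewness(xs):
    """Sample skewness; values far from 0 suggest non-normal data."""
    m = sum(xs) / len(xs)
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / len(xs))
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

def preprocess(xs, skew_limit=1.0):
    """Outlier removal, then a log transform if the data are too skewed."""
    xs = zscore_outliers(xs)
    if abs(skewness(xs)) > skew_limit and min(xs) > 0:
        xs = [math.log(x) for x in xs]
    return xs

# Skewed feature values with one gross outlier.
raw = [1.1, 1.3, 0.9, 1.2, 25.0, 1.0, 1.4, 1.2, 55.0, 1.1]
clean = preprocess(raw)
print(len(clean) < len(raw))  # True: the outlier step removed a point
```

The thresholds (z = 2, skewness limit 1.0) are illustrative; in practice they would be tuned on the signal population at hand.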

  20. Integration of geometric modeling and advanced finite element preprocessing

    Science.gov (United States)

    Shephard, Mark S.; Finnigan, Peter M.

    1987-01-01

    The structure of a geometry-based finite element preprocessing system is presented. The key features of the system are the use of geometric operators to support all geometric calculations required for analysis model generation, and the use of a hierarchic, boundary-based data structure for the major data sets within the system. The approach presented can support the finite element modeling procedures used today as well as the fully automated procedures under development.

  1. Balance Analysis of Microstrip-to-CPS Baluns and Its Effects on Broadband Antenna Performance

    Directory of Open Access Journals (Sweden)

    Dong Sik Woo

    2013-01-01

    Full Text Available Amplitude and phase balances of two types of microstrip (MS)-to-coplanar stripline (CPS) baluns have been analyzed through simulations and measurements, and their effects on broadband antenna performance are investigated. The impedance bandwidth of a balun determined from a back-to-back configuration can sometimes overestimate the balun's operating bandwidth. With the conventional balun using a 180° phase delay line, the balun balance over the operating frequencies improves markedly as the CPS length increases beyond 0.1 λg. Compared with the conventional balun, the proposed MS-to-CPS balun demonstrated very wideband performance from 5 to over 20 GHz, with amplitude and phase imbalances within 1 dB and ±5°, respectively. Effects of balun imbalance on overall broadband antenna performance are also discussed for a quasi-Yagi antenna and a narrow-beamwidth tapered slot antenna (TSA).

  2. Medium-Chain Chlorinated Paraffins (CPs) Dominate in Australian Sewage Sludge.

    Science.gov (United States)

    Brandsma, Sicco H; van Mourik, Louise; O'Brien, Jake W; Eaglesham, Geoff; Leonards, Pim E G; de Boer, Jacob; Gallen, Christie; Mueller, Jochen; Gaus, Caroline; Bogdal, Christian

    2017-03-02

    To simultaneously quantify and profile the complex mixture of short-, medium-, and long-chain CPs (SCCPs, MCCPs, and LCCPs) in Australian sewage sludge, we applied and further validated a recently developed instrumental technique using quadrupole time-of-flight high-resolution mass spectrometry operated in negative atmospheric pressure chemical ionization mode (APCI-qTOF-HRMS). Without an analytical column, the cleaned extracts were injected directly into the qTOF-HRMS, and the CPs were quantified by a mathematical algorithm. The recoveries of the four SCCP-, MCCP-, and LCCP-spiked sewage sludge samples ranged from 86 to 123%. This APCI-qTOF-HRMS method is a fast and promising technique for routinely measuring SCCPs, MCCPs, and LCCPs in sewage sludge. Australian sewage sludge was dominated by MCCPs, with concentrations ranging from 542 to 3645 ng/g dry weight (dw). Lower SCCP concentrations (<57-1421 ng/g dw) were detected, comparable with the LCCP concentrations (116-960 ng/g dw). This is the first time that CPs have been reported in Australian sewage sludge. The results of this study give a first impression of the distribution of SCCPs, MCCPs, and LCCPs in Australian wastewater treatment plants (WWTPs).

  3. Development of the Childbirth Perception Scale (CPS): perception of delivery and the first postpartum week.

    Science.gov (United States)

    Truijens, Sophie E M; Wijnen, Hennie A; Pommer, Antoinette M; Oei, S Guid; Pop, Victor J M

    2014-10-01

    Some caregivers suggest a more positive experience of childbirth when giving birth at home. Since properly developed instruments that assess women's perception of delivery and the early postpartum are missing, the aim of the current study is to develop a Childbirth Perception Scale (CPS). Three focus groups with caregivers, pregnant women, and women who recently gave birth were conducted. Psychometric properties of 23 candidate items derived from the interviews were tested with explorative factor analysis (EFA) (N = 495). Confirmatory factor analysis (CFA) was performed in another sample of women (N = 483) and confirmed a 12-item CPS. The EFA in sample I suggested a two-component solution: a subscale 'perception of delivery' (six items) and a subscale 'perception of the first postpartum week' (six items). The CFA in sample II confirmed an adequate model fit and a good internal consistency (α = .82). Multivariate linear regression showed a positive effect of home delivery on perception of delivery in multiparous but not in primiparous women. The 12-item CPS with two dimensions (perception of delivery and perception of first postpartum week) has adequate psychometric properties. In multiparous women, home delivery showed to be independently related to more positive perception of delivery.

  4. Study on perception and control layer of mine CPS with mixed logic dynamic approach

    Science.gov (United States)

    Li, Jingzhao; Ren, Ping; Yang, Dayu

    2017-01-01

    The mine inclined-roadway transportation system of a mine cyber-physical system is a hybrid system consisting of a continuous-time system and a discrete-time system, which can be divided into an inclined-roadway signal subsystem, error-proofing channel subsystems, anti-car subsystems, and frequency-control subsystems. First, to ensure stable operation and improve efficiency and production safety, this hybrid system model with n inputs and m outputs is constructed and analyzed in detail, and its steady scheduling state is solved. Second, on the basis of formal modeling for real-time systems, we use a hybrid toolbox for system security verification. Third, the practical application to a mine cyber-physical system shows that the method for real-time simulation of the mine cyber-physical system is effective.

  5. Review of feed forward neural network classification preprocessing techniques

    Science.gov (United States)

    Asadi, Roya; Kareem, Sameem Abdul

    2014-06-01

    The best feature of artificial intelligent Feed Forward Neural Network (FFNN) classification models is the learning of input data through their weights. Data preprocessing and pre-training are the contributing factors in developing efficient techniques for low training time and high classification accuracy. In this study, we investigate and review the powerful preprocessing functions of FFNN models. Currently, weights are initialized at random, which is the main source of problems. Multilayer auto-encoder networks, the most recent technique, are, like other related techniques, unable to solve these problems. Weight Linear Analysis (WLA) is a combination of data preprocessing and pre-training that generates real weights through the use of normalized input values. Using WLA, the FFNN model increases classification accuracy and reduces training time, converging in a single epoch without iterative training cycles, gradient computations of the mean-square-error function, or weight updates. The results of comparison and evaluation show that WLA is a powerful technique in the FFNN classification area.
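    The abstract does not spell out WLA's formulas, so the following is only an illustrative sketch of the general idea it describes: derive non-random initial weights from normalized input values instead of random initialization. The min-max normalization and the class-mean prototype rule are assumptions for illustration, not the published algorithm:

```python
import numpy as np

def weight_linear_analysis(X, y, n_classes):
    """Illustrative sketch (not the published WLA): compute initial
    weights deterministically from normalized input statistics."""
    # Min-max normalize each feature to [0, 1]
    Xn = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0) + 1e-12)
    # One weight vector per class: the mean normalized profile of that class
    W = np.vstack([Xn[y == c].mean(axis=0) for c in range(n_classes)])
    return Xn, W

def classify(Xn, W):
    # Nearest-prototype decision: pick the class whose weight vector is closest
    d = ((Xn[:, None, :] - W[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# Two well-separated synthetic classes, classified with no training loop at all
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(3, 1, (50, 4))])
y = np.repeat([0, 1], 50)
Xn, W = weight_linear_analysis(X, y, 2)
pred = classify(Xn, W)
print((pred == y).mean())
```

The point of the sketch is the "single epoch" property: the weights are computed in one pass, with no gradient descent.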

  6. A Survey on Preprocessing Methods for Web Usage Data

    Directory of Open Access Journals (Sweden)

    V.Chitraa

    2010-03-01

    The World Wide Web is a huge repository of web pages and links. It provides an abundance of information for Internet users. The growth of the web is tremendous, as approximately one million pages are added daily. Users' accesses are recorded in web logs. Because of the tremendous usage of the web, log files are growing at a fast rate and their size is becoming huge. Web data mining is the application of data mining techniques to web data. Web usage mining applies mining techniques to log data to extract the behavior of users, which is used in various applications like personalized services, adaptive web sites, customer profiling, prefetching, and creating attractive web sites. Web usage mining consists of three phases: preprocessing, pattern discovery, and pattern analysis. Web log data is usually noisy and ambiguous, and preprocessing is an important step before mining. For discovering patterns, sessions have to be constructed efficiently. This paper reviews existing work on the preprocessing stage. A brief overview of various data mining techniques for pattern discovery and pattern analysis is given. Finally, a glimpse of various applications of web usage mining is also presented.
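    The cleaning and session-construction steps surveyed here can be sketched in a few lines. This is a generic illustration, not any specific surveyed method; the log fields, the 30-minute timeout heuristic, and the one-user-per-IP simplification are all assumptions:

```python
from datetime import datetime, timedelta

# Hypothetical minimal log records: (ip, timestamp, url, status)
raw_log = [
    ("10.0.0.1", "2010-03-01 10:00:00", "/index.html", 200),
    ("10.0.0.1", "2010-03-01 10:00:01", "/style.css", 200),      # resource request: removed
    ("10.0.0.1", "2010-03-01 10:05:00", "/products.html", 200),
    ("10.0.0.1", "2010-03-01 11:00:00", "/index.html", 200),     # new session (> 30 min gap)
    ("10.0.0.2", "2010-03-01 10:00:30", "/missing.html", 404),   # failed request: removed
]

def clean(entries):
    """Data cleaning: drop embedded resources and unsuccessful requests."""
    return [e for e in entries
            if e[3] == 200 and not e[2].endswith((".css", ".js", ".gif", ".png"))]

def sessionize(entries, timeout=timedelta(minutes=30)):
    """Timeout-based session construction, one user per IP (a simplification)."""
    sessions, last_seen, current = [], {}, {}
    # ISO timestamps sort correctly as strings
    for ip, ts, url, _ in sorted(entries, key=lambda e: (e[0], e[1])):
        t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
        if ip not in current or t - last_seen[ip] > timeout:
            current[ip] = []
            sessions.append(current[ip])
        current[ip].append(url)
        last_seen[ip] = t
    return sessions

sessions = sessionize(clean(raw_log))
print(sessions)
```

Real preprocessing must also handle proxies, caching, and robot detection, which is precisely why the surveyed literature exists.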

  7. Optimization of miRNA-seq data preprocessing.

    Science.gov (United States)

    Tam, Shirley; Tsao, Ming-Sound; McPherson, John D

    2015-11-01

    The past two decades of microRNA (miRNA) research have solidified the role of these small non-coding RNAs as key regulators of many biological processes and as promising biomarkers for disease. The concurrent development in high-throughput profiling technology has further advanced our understanding of the impact of their dysregulation on a global scale. Currently, next-generation sequencing is the platform of choice for the discovery and quantification of miRNAs. Despite this, there is no clear consensus on how the data should be preprocessed before conducting downstream analyses. Often overlooked, data preprocessing is an essential step in data analysis: the presence of unreliable features and noise can affect the conclusions drawn from downstream analyses. Using a spike-in dilution study, we evaluated the effects of several general-purpose aligners (BWA, Bowtie, Bowtie 2, and Novoalign) and normalization methods (counts-per-million, total count scaling, upper-quartile scaling, trimmed mean of M-values (TMM), DESeq, linear regression, cyclic loess, and quantile) with respect to the final miRNA count data distribution, variance, bias, and accuracy of differential expression analysis. We make practical recommendations on the optimal preprocessing methods for the extraction and interpretation of miRNA count data from small RNA-sequencing experiments.
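    Two of the normalization methods evaluated, counts-per-million and upper-quartile scaling, are simple to state. A minimal sketch (toy counts, not the paper's spike-in data):

```python
import numpy as np

def cpm(counts):
    """Counts-per-million: scale each sample (column) by its library size."""
    lib = counts.sum(axis=0, keepdims=True)
    return counts / lib * 1e6

def upper_quartile(counts):
    """Upper-quartile scaling: divide each sample by its 75th percentile of
    nonzero counts, rescaled so the factors average to 1."""
    uq = np.array([np.percentile(c[c > 0], 75) for c in counts.T])
    factors = uq / uq.mean()
    return counts / factors

# Two samples with identical composition but 2x different sequencing depth
counts = np.array([[10, 20],
                   [100, 200],
                   [0, 0],
                   [890, 1780]], dtype=float)
print(cpm(counts))
```

Both methods make the two columns identical here because the samples differ only in depth; real data with composition differences is where the methods diverge.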

  8. A Stereo Music Preprocessing Scheme for Cochlear Implant Users.

    Science.gov (United States)

    Buyens, Wim; van Dijk, Bas; Wouters, Jan; Moonen, Marc

    2015-10-01

    Listening to music is still one of the more challenging aspects of using a cochlear implant (CI) for most users. Simple musical structures, a clear rhythm/beat, and lyrics that are easy to follow are among the top factors contributing to music appreciation for CI users. Modifying the audio mix of complex music potentially improves music enjoyment in CI users. A stereo music preprocessing scheme is described in which vocals, drums, and bass are emphasized based on the representation of the harmonic and the percussive components in the input spectrogram, combined with the spatial allocation of instruments in typical stereo recordings. The scheme is assessed with postlingually deafened CI subjects (N = 7) using pop/rock music excerpts with different complexity levels. The scheme is capable of modifying relative instrument level settings, with the aim of improving music appreciation in CI users, and allows individual preference adjustments. The assessment with CI subjects confirms the preference for more emphasis on vocals, drums, and bass as offered by the preprocessing scheme, especially for songs with higher complexity. The stereo music preprocessing scheme has the potential to improve music enjoyment in CI users by modifying the audio mix in widespread (stereo) music recordings. Since music enjoyment in CI users is generally poor, this scheme can assist the music listening experience of CI users as a training or rehabilitation tool.
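    The scheme emphasizes instruments based on the harmonic and percussive components of the spectrogram. A common way to obtain such a decomposition is median filtering along time (harmonic) and along frequency (percussive); the sketch below follows that generic approach, which may differ from the paper's exact scheme:

```python
import numpy as np
from scipy.ndimage import median_filter

def hpss_masks(S, kernel=17):
    """Median-filtering harmonic/percussive separation (a generic sketch,
    not the paper's algorithm) on a magnitude spectrogram S with shape
    (n_freq_bins, n_frames)."""
    H = median_filter(S, size=(1, kernel))  # smooth along time -> harmonic
    P = median_filter(S, size=(kernel, 1))  # smooth along frequency -> percussive
    mask_h = H / (H + P + 1e-12)            # soft Wiener-style masks
    mask_p = P / (H + P + 1e-12)
    return S * mask_h, S * mask_p

# Toy spectrogram: a sustained tone (horizontal ridge) + a drum hit (vertical ridge)
S = np.zeros((64, 100))
S[20, :] += 1.0   # harmonic: constant frequency over time
S[:, 50] += 1.0   # percussive: broadband at one instant
harm, perc = hpss_masks(S)
print(harm[20, 10], perc[30, 50])
```

The masks route the sustained tone to the harmonic output and the broadband hit to the percussive output; the two components can then be re-mixed with user-adjustable gains, which is the "individual preference adjustment" idea.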

  9. Panel summary of cyber-physical systems (CPS) and Internet of Things (IoT) opportunities with information fusion

    Science.gov (United States)

    Blasch, Erik; Kadar, Ivan; Grewe, Lynne L.; Brooks, Richard; Yu, Wei; Kwasinski, Andres; Thomopoulos, Stelios; Salerno, John; Qi, Hairong

    2017-05-01

    During the 2016 SPIE DSS conference, nine panelists were invited to highlight the trends and opportunities in cyber-physical systems (CPS) and Internet of Things (IoT) with information fusion. The world will be ubiquitously outfitted with many sensors to support our daily living thorough the Internet of Things (IoT), manage infrastructure developments with cyber-physical systems (CPS), as well as provide communication through networked information fusion technology over the internet (NIFTI). This paper summarizes the panel discussions on opportunities of information fusion to the growing trends in CPS and IoT. The summary includes the concepts and areas where information supports these CPS/IoT which includes situation awareness, transportation, and smart grids.

  10. Partitioning a macroscopic system into independent subsystems

    Science.gov (United States)

    Delle Site, Luigi; Ciccotti, Giovanni; Hartmann, Carsten

    2017-08-01

    We discuss the problem of partitioning a macroscopic system into a collection of independent subsystems. The partitioning of a system into replica-like subsystems is nowadays a subject of major interest in several fields of theoretical and applied physics. The thermodynamic approach currently favoured by practitioners is based on a phenomenological definition of an interface energy associated with the partition, due to a lack of easily computable expressions for a microscopic (i.e. particle-based) interface energy. In this article, we outline a general approach to derive sharp and computable bounds for the interface free energy in terms of microscopic statistical quantities. We discuss potential applications in nanothermodynamics and outline possible future directions.

  11. Controls Interfaces for Two ALICE Subsystems

    Science.gov (United States)

    Thomen, Robert

    2007-10-01

    Software for the control of a laser alignment system for the Inner Tracking System (ITS) and for the Electromagnetic Calorimeter (EMC) was developed for ALICE (A Large Ion Collider Experiment) at CERN. The interfaces for both subsystems use the CERN-standard hardware controls system PVSS (Prozessvisualisierungs- und Steuerungs-System). Software for the ITS was created to measure the relative alignment of the ITS with the Time Projection Chamber (TPC) so as to ensure accurate particle tracking. The ITS alignment system locates laser images in four cameras. The EMC requires several subsystems to be running in order to operate properly. Software has been created and tested for the detector's high- and low-voltage systems and temperature-monitoring hardware. The ITS and EMC software specifications and design requirements are presented and their performance is analyzed.

  12. Analysis of the human operator subsystems

    Science.gov (United States)

    Jones, Lynette A.; Hunter, Ian W.

    1991-01-01

    Except in low-bandwidth systems, knowledge of the human operator transfer function is essential for high-performance telerobotic systems. This information has usually been derived from detailed analyses of tracking performance, in which the human operator is considered as a complete system rather than as a summation of a number of subsystems, each of which influences the operator's output. Studies of one of these subsystems, the limb mechanics system, demonstrate that large parameter variations can occur that can have a profound effect on the stability of force-reflecting telerobot systems. An objective of this research was to decompose the performance of the human operator system in order to establish how the dynamics of each of the elements influence the operator's responses.

  13. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea

    Energy Technology Data Exchange (ETDEWEB)

    Palit, Mousumi [Department of Electronics and Telecommunication Engineering, Central Calcutta Polytechnic, Kolkata 700014 (India); Tudu, Bipan, E-mail: bt@iee.jusl.ac.in [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Bhattacharyya, Nabarun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Dutta, Ankur; Dutta, Pallab Kumar [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Jana, Arun [Centre for Development of Advanced Computing, Kolkata 700091 (India); Bandyopadhyay, Rajib [Department of Instrumentation and Electronics Engineering, Jadavpur University, Kolkata 700098 (India); Chatterjee, Anutosh [Department of Electronics and Communication Engineering, Heritage Institute of Technology, Kolkata 700107 (India)

    2010-08-18

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.

  14. Comparison of multivariate preprocessing techniques as applied to electronic tongue based pattern classification for black tea.

    Science.gov (United States)

    Palit, Mousumi; Tudu, Bipan; Bhattacharyya, Nabarun; Dutta, Ankur; Dutta, Pallab Kumar; Jana, Arun; Bandyopadhyay, Rajib; Chatterjee, Anutosh

    2010-08-18

    In an electronic tongue, preprocessing of raw data precedes pattern analysis, and the choice of the appropriate preprocessing technique is crucial for the performance of the pattern classifier. While attempting to classify different grades of black tea using a voltammetric electronic tongue, different preprocessing techniques have been explored and a comparison of their performances is presented in this paper. The preprocessing techniques are compared first by a quantitative measurement of separability followed by principal component analysis; then two different supervised pattern recognition models based on neural networks are used to evaluate the performance of the preprocessing techniques.
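    The comparison pipeline described (preprocessing, a quantitative separability measure, then principal component analysis) can be sketched as follows. The scatter-ratio metric and the two preprocessing choices shown are illustrative assumptions; the paper's exact measure and candidate techniques are not given in the abstract:

```python
import numpy as np

def mean_center(X):
    return X - X.mean(axis=0)

def autoscale(X):
    """Mean-center then divide by column std (unit variance per sensor channel)."""
    return mean_center(X) / X.std(axis=0, ddof=1)

def pca_scores(X, n=2):
    """PCA scores via SVD of the preprocessed data matrix."""
    U, s, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :n] * s[:n]

def separability(scores, y):
    """Between-class / within-class scatter ratio (one possible metric)."""
    grand = scores.mean(axis=0)
    sb = sum((scores[y == c].mean(axis=0) - grand) @ (scores[y == c].mean(axis=0) - grand)
             for c in np.unique(y))
    sw = sum(((scores[y == c] - scores[y == c].mean(axis=0)) ** 2).sum()
             for c in np.unique(y))
    return sb / sw

rng = np.random.default_rng(2)
# Two "tea grades", five electrode channels; the last channel has a huge scale
X = np.vstack([rng.normal([0, 0, 0, 0, 0], [1, 1, 1, 1, 1000], (30, 5)),
               rng.normal([2, 2, 2, 2, 10], [1, 1, 1, 1, 1000], (30, 5))])
y = np.repeat([0, 1], 30)
for name, prep in [("mean-centering", mean_center), ("autoscaling", autoscale)]:
    print(name, separability(pca_scores(prep(X)), y))
```

The toy data illustrates why the choice matters: with mean-centering alone, PCA is dominated by the noisy large-scale channel, while autoscaling recovers the class structure.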

  15. Optical Subsystems for Next Generation Access Networks

    DEFF Research Database (Denmark)

    Lazaro, J.A; Polo, V.; Schrenk, B.

    2011-01-01

    Recent optical technologies are providing higher flexibility to next-generation access networks: on the one hand, enabling progressive FTTx and specifically FTTH deployment, progressively shortening the copper access network; on the other hand, opening up fixed-mobile convergence solutions in next-generation PON architectures. An overview is provided of the optical subsystems developed for the implementation of the proposed NG access networks.

  16. Liquid radioactive waste subsystem design description

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1986-06-01

    The Liquid Radioactive Waste Subsystem provides a reliable system to safely control liquid waste radiation and to collect, process, and dispose of all radioactive liquid waste without impairing plant operation. Liquid waste is stored in radwaste receiver tanks and is processed through demineralizers and temporarily stored in test tanks prior to sampling and discharge. Radwastes unsuitable for discharge are transferred to the Solid Radwaste System.

  17. Optical Subsystems for Next Generation Access Networks

    DEFF Research Database (Denmark)

    Lazaro, J.A; Polo, V.; Schrenk, B.

    2011-01-01

    Recent optical technologies are providing higher flexibility to next-generation access networks: on the one hand, enabling progressive FTTx and specifically FTTH deployment, progressively shortening the copper access network; on the other hand, opening up fixed-mobile convergence solutions in next-generation PON architectures. An overview is provided of the optical subsystems developed for the implementation of the proposed NG access networks.

  18. Subsystem codes with spatially local generators

    CERN Document Server

    Bravyi, Sergey

    2010-01-01

    We study subsystem codes whose gauge group has local generators in the 2D geometry. It is shown that there exists a family of such codes defined on lattices of size LxL with the number of logical qubits k and the minimum distance d both proportional to L. The gauge group of these codes involves only two-qubit generators of type XX and ZZ coupling nearest neighbor qubits (and some auxiliary one-qubit generators). Our proof is not constructive as it relies on a certain version of the Gilbert-Varshamov bound for classical codes. Along the way we introduce and study properties of generalized Bacon-Shor codes which might be of independent interest. Secondly, we prove that any 2D subsystem [n,k,d] code with spatially local generators obeys upper bounds kd=O(n) and d^2=O(n). The analogous upper bound proved recently for 2D stabilizer codes is kd^2=O(n). Our results thus demonstrate that subsystem codes can be more powerful than stabilizer codes under the spatial locality constraint.

  19. Singularly perturbed bifurcation subsystem and its application in power systems

    Institute of Scientific and Technical Information of China (English)

    An Yichun; Zhang Qingling; Zhu Yukun; Zhang Yan

    2008-01-01

    The singularly perturbed bifurcation subsystem is described, and the test conditions for subsystem persistence are deduced. By use of the fast and slow reduced subsystem models, the result does not require performing a nonlinear transformation. Moreover, the persistence of the periodic orbits of the Hopf bifurcation in the reduced model is shown and proved through the center manifold. A Van der Pol oscillator circuit is given to illustrate the persistence of bifurcation subsystems within the full dynamic system.

  20. Preprocessing Techniques for High-Efficiency Data Compression in Wireless Multimedia Sensor Networks

    Directory of Open Access Journals (Sweden)

    Junho Park

    2015-01-01

    We have proposed preprocessing techniques for high-efficiency data compression in wireless multimedia sensor networks. To do this, we analyzed the characteristics of multimedia data in the environment of wireless multimedia sensor networks. The proposed preprocessing techniques consider the characteristics of the sensed multimedia data and perform a first-stage preprocessing by deleting the low-priority bits that do not affect image quality. A second-stage preprocessing is then performed on the remaining high-priority bits. By performing this two-stage preprocessing, it is possible to reduce the multimedia data size substantially. To show the superiority of our techniques, we simulated an existing multimedia data compression scheme with and without our preprocessing techniques. Our experimental results show that the proposed techniques increase the compression ratio while reducing compression operations compared to the existing compression scheme without preprocessing.
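    The first-stage idea of deleting low-priority bits can be illustrated with bit masking followed by a general-purpose compressor. The 5-bit retention level and the use of zlib are assumptions for illustration, not the paper's scheme:

```python
import zlib
import numpy as np

def drop_low_priority_bits(pixels, keep_bits=5):
    """First-stage preprocessing sketch: zero the (8 - keep_bits) least
    significant bits of each 8-bit sample. This barely affects perceived
    quality but makes the stream far more compressible."""
    mask = 0xFF & ~((1 << (8 - keep_bits)) - 1)
    return pixels & mask

rng = np.random.default_rng(3)
# Toy "image": a smooth gradient plus low-amplitude sensor noise in the LSBs
base = np.tile(np.arange(256, dtype=np.uint8), 64)
img = (base + rng.integers(0, 8, base.shape)).astype(np.uint8)

raw = zlib.compress(img.tobytes())
pre = zlib.compress(drop_low_priority_bits(img).tobytes())
print(len(raw), len(pre))
```

Masking removes the incompressible sensor noise carried in the low bits, so the preprocessed stream compresses to a noticeably smaller size.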

  1. Thermal energy storage subsystems (a collection of quarterly reports)

    Energy Technology Data Exchange (ETDEWEB)

    1978-01-01

    Five quarterly reports are presented, covering the progress made in the development, fabrication, and delivery of three Thermal Energy Storage Subsystems. The design, development, and progress toward the delivery of three subsystems are discussed. The subsystem uses a salt hydrate mixture for thermal energy storage. Included are the program schedules, technical data, and other program activities from October 1, 1976 through December 31, 1977.

  2. Preprocessing and parameterizing bioimpedance spectroscopy measurements by singular value decomposition.

    Science.gov (United States)

    Nejadgholi, Isar; Caytak, Herschel; Bolic, Miodrag; Batkin, Izmail; Shirmohammadi, Shervin

    2015-05-01

    In several applications of bioimpedance spectroscopy, the measured spectrum is parameterized by being fitted into the Cole equation. However, the extracted Cole parameters seem to be inconsistent from one measurement session to another, which leads to a high standard deviation of extracted parameters. This inconsistency is modeled with a source of random variations added to the voltage measurement carried out in the time domain. These random variations may originate from biological variations that are irrelevant to the evidence that we are investigating. Yet, they affect the voltage measured by using a bioimpedance device, based on which the magnitude and phase of impedance are calculated. By means of simulated data, we showed that Cole parameters are highly affected by this type of variation. We further showed that singular value decomposition (SVD) is an effective tool for parameterizing bioimpedance measurements, which results in more consistent parameters than Cole parameters. We propose to apply SVD as a preprocessing method to reconstruct denoised bioimpedance measurements. In order to evaluate the method, we calculated the relative difference between parameters extracted from noisy and clean simulated bioimpedance spectra. Both mean and standard deviation of this relative difference are shown to effectively decrease when Cole parameters are extracted from preprocessed data in comparison to being extracted from raw measurements. We evaluated the performance of the proposed method in distinguishing three arm positions, for a set of experiments including eight subjects. It is shown that Cole parameters of different positions are not distinguishable when extracted from raw measurements. However, one arm position can be distinguished based on SVD scores. Moreover, all three positions are shown to be distinguished by two parameters, R0/R∞ and Fc, when Cole parameters are extracted from preprocessed measurements. These results suggest that SVD could be considered as an
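    The proposed SVD preprocessing, reconstructing measurements from the leading singular components, can be sketched on synthetic repeated spectra. The Cole-like toy profile and the rank-1 truncation below are illustrative assumptions:

```python
import numpy as np

def svd_denoise(X, rank):
    """Reconstruct the measurement matrix from its leading singular
    components; the discarded components mostly carry measurement noise.
    Returns the denoised matrix and the SVD scores."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank] * s[:rank] @ Vt[:rank], U[:, :rank] * s[:rank]

rng = np.random.default_rng(4)
# Toy data: one true frequency profile, scaled slightly per session, plus noise
freqs = np.linspace(0, 1, 50)
profile = 1.0 / (1.0 + (5 * freqs) ** 2)                 # Cole-like magnitude curve
sessions = np.outer(rng.uniform(0.9, 1.1, 20), profile)  # 20 repeated measurements
noisy = sessions + 0.05 * rng.normal(size=sessions.shape)

denoised, scores = svd_denoise(noisy, rank=1)
err_raw = np.abs(noisy - sessions).mean()
err_den = np.abs(denoised - sessions).mean()
print(err_raw, err_den)
```

Because the true data here is rank 1, truncating to the first singular component removes most of the added noise, mirroring the reduction in parameter variance reported in the abstract.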

  3. Contour extraction of echocardiographic images based on pre-processing

    Energy Technology Data Exchange (ETDEWEB)

    Hussein, Zinah Rajab; Rahmat, Rahmita Wirza; Abdullah, Lili Nurliyana [Department of Multimedia, Faculty of Computer Science and Information Technology, Department of Computer and Communication Systems Engineering, Faculty of Engineering University Putra Malaysia 43400 Serdang, Selangor (Malaysia); Zamrin, D M [Department of Surgery, Faculty of Medicine, National University of Malaysia, 56000 Cheras, Kuala Lumpur (Malaysia); Saripan, M Iqbal

    2011-02-15

    In this work we present a technique to extract the heart contours from noisy echocardiograph images. Our technique is based on improving the image before applying contours detection to reduce heavy noise and get better image quality. To perform that, we combine many pre-processing techniques (filtering, morphological operations, and contrast adjustment) to avoid unclear edges and enhance low contrast of echocardiograph images, after implementing these techniques we can get legible detection for heart boundaries and valves movement by traditional edge detection methods.
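    The combined pipeline (filtering, morphological operations, contrast adjustment, then traditional edge detection) can be sketched with standard image operations; the specific filter sizes and thresholds below are assumptions, since the abstract does not give them:

```python
import numpy as np
from scipy import ndimage

def preprocess(img):
    """Sketch of the combined pipeline on a grayscale image in [0, 1]:
    speckle filtering, morphological smoothing, and contrast stretching."""
    f = ndimage.median_filter(img, size=3)      # reduce heavy speckle noise
    f = ndimage.grey_opening(f, size=(3, 3))    # remove small bright artifacts
    f = ndimage.grey_closing(f, size=(3, 3))    # fill small dark gaps
    lo, hi = np.percentile(f, (2, 98))          # contrast adjustment
    return np.clip((f - lo) / (hi - lo + 1e-12), 0, 1)

def edges(img, thresh=0.3):
    """Traditional gradient-magnitude edge detection (Sobel)."""
    gx, gy = ndimage.sobel(img, 0), ndimage.sobel(img, 1)
    return np.hypot(gx, gy) > thresh

rng = np.random.default_rng(5)
frame = np.zeros((64, 64)) + 0.2
frame[16:48, 16:48] = 0.6                       # a bright "chamber" region
noisy = frame + 0.15 * rng.normal(size=frame.shape)
mask = edges(preprocess(noisy))
print(mask.sum(), "edge pixels")
```

Without the preprocessing stage, the same edge detector fires all over the speckle; with it, the detections concentrate on the region boundary, which is the motivation the paper describes.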

  4. Preprocessing and Analysis of LC-MS-Based Proteomic Data.

    Science.gov (United States)

    Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W

    2016-01-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS based proteomics. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from LC-MS based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed.

  5. Effects of preprocessing Landsat MSS data on derived features

    Science.gov (United States)

    Parris, T. M.; Cicone, R. C.

    1983-01-01

    Important to the use of multitemporal Landsat MSS data for earth resources monitoring, such as agricultural inventories, is the ability to minimize the effects of varying atmospheric and satellite viewing conditions while extracting physically meaningful features from the data. In general, the approaches to the preprocessing problem have been derived from either physical or statistical models. This paper compares three proposed algorithms: XSTAR haze correction, Color Normalization, and Multiple Acquisition Mean Level Adjustment. These techniques represent physical, statistical, and hybrid physical-statistical models, respectively. The comparisons are made in the context of three feature extraction techniques: the Tasseled Cap, the Cate Color Cube, and the Normalized Difference.

  6. Space-reactor electric systems: subsystem technology assessment

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, R.V.; Bost, D.; Determan, W.R.

    1983-03-29

    This report documents the subsystem technology assessment. For the purpose of this report, five subsystems were defined for a space reactor electric system, and the report is organized around these subsystems: reactor; shielding; primary heat transport; power conversion and processing; and heat rejection. The purpose of the assessment was to determine the current technology status and the technology potentials for different types of the five subsystems. The cost and schedule needed to develop these potentials were estimated, and sets of development-compatible subsystems were identified.

  7. Ultra-Wideband Fermi Antenna Using Microstrip-to-CPS Balun

    Science.gov (United States)

    Woo, Dong-Sik; Kim, Young-Gon; Cho, Young-Ki; Kim, Kang Wook

    A new design and experimental results of a microstrip-fed ultra-wideband Fermi antenna at millimeter-wave frequencies are presented. By utilizing a new microstrip-to-CPS balun (or transition), which provides wider bandwidth than a conventional planar balun, the design of the microstrip-fed Fermi antenna is greatly simplified. The proposed Fermi antenna demonstrates ultra-wideband performance for the frequency range of 23 to over 58 GHz with an antenna gain of 12 to 14 dBi and low sidelobe levels. This design yields highly effective solutions for various millimeter-wave phased arrays and imaging systems.

  8. Design of an SRIO Bus Switch Module Based on CPS1848

    Institute of Scientific and Technical Information of China (English)

    马友科

    2014-01-01

    As processor computation capability continues to improve, data transfer and switching between processors has become one of the key factors restricting system performance, so the SRIO bus switch module is very important in digital signal parallel-processing systems that use the SRIO bus. To meet the demand for high-speed data exchange, an SRIO bus switch module was designed based on the CPS1848 chip, using a high-performance power module and clock module. Data transfer experiments between a DSP and an FPGA validated the data transfer performance of the SRIO switch module, and the differences between measured and theoretical values were analyzed. The results provide technical support for the development of high-speed signal-processing platforms and a reference for the design of signal-processing schemes.

  9. Optical fiber telecommunications components and subsystems

    CERN Document Server

    Kaminow, Ivan; Willner, Alan E

    2013-01-01

    Optical Fiber Telecommunications VI (A&B) is the sixth in a series that has chronicled the progress in the R&D of lightwave communications since the early 1970s. Written by active authorities from academia and industry, this edition brings a fresh look to many essential topics, including devices, subsystems, systems and networks. A central theme is the enabling of high-bandwidth communications in a cost-effective manner for the development of customer applications. These volumes are an ideal reference for R&D engineers and managers, optical systems implementers, university researchers and s

  10. Global GEO survey subsystem of the ISON

    Science.gov (United States)

    Molotov, Igor; Agapov, Vladimir; Rumyantsev, Vasiliy; Biryukov, Vadim; Kornienko, Gennadiy; Litvinenko, Elena; Vikhristenko, Alexander; Zalles, Rodolfo; Guseva, Irina; Inasaridze, Raguli

    A dedicated subsystem of the International Scientific Optical Network (ISON) was created in order to provide regular monitoring of objects brighter than 15th magnitude in the GEO region. For GEO longitudes between 31.5W and 90E, a survey mode is implemented in a zone of ±16° width with respect to the equator; for longitudes between 90E and 210W, selected areas are surveyed regularly in a zone of ±10° width; and for longitudes between 135W and 1 E, periodic observations are arranged in a zone of ±1° width. Initially, existing astrographs of 23 cm and 40 cm aperture with a FOV of 30'x30' were involved in the work. Then a dedicated 22-cm aperture telescope installed on an automated mount was developed. This new telescope, with a FOV of 4°x4°, can provide up to 5000 measurements for around 400 GEO and HEO (mainly GTO) objects per night. Currently, six similar 22-cm aperture telescopes at different stages of operation are working at the Tiraspol (Pridnestrovie), Nauchniy (Crimea, Ukraine), Pulkovo (St.-Petersburg, Russia), Kitab (Uzbekistan) and Ussuriysk (Far East, Russia) observation facilities. During 2008, similar telescopes will be installed in Abastumani (Georgia), Milkovo (Kamchatka, Russia), Tarija (Bolivia), Blagoveshchensk (Far East, Russia) and Gissar (Tadjikistan). From 2009 the subsystem will provide surveying capability for the GEO region on a global scale, both in longitude (0°-360°) and inclination (0°-20°). Trial operations of the first fully automated 22-cm telescope during 2007 (the part of the GEO arc between 31.5W and 90E was surveyed about 60 times) revealed a large gap in our knowledge of the space debris population in the GEO region. Several hundred uncatalogued, relatively bright objects were detected in GEO and GTO orbits. Special processing of the short-arc tracks obtained for non-correlated objects made it possible to correlate some of them and thus to discover around 40 new objects. During 2007 slightly less than 200000 measurements

  11. Subsystem's dynamics under random Hamiltonian evolution

    CERN Document Server

    Vinayak,

    2011-01-01

    We study time evolution of a subsystem's density matrix under a unitary evolution, generated by a sufficiently complex, say quantum chaotic, Hamiltonian. We exactly calculate all coherences, purity and fluctuations. The reduced density matrix is described in terms of a noncentral correlated Wishart ensemble. Our description accounts for a transition from an arbitrary initial state towards a random state at large times, enabling us to determine the convergence time after which random states are reached. We identify and describe a number of other interesting features, like a series of collisions between the largest eigenvalue and the bulk, accompanied by a phase transition in its distribution function.

  12. Network Analysis of Metabolite GWAS Hits: Implication of CPS1 and the Urea Cycle in Weight Maintenance.

    Directory of Open Access Journals (Sweden)

    Alice Matone

    Weight loss success is dependent on the ability to refrain from regaining the lost weight over time. This feature was shown to be largely variable among individuals, and these differences, with their underlying molecular processes, are diverse and not completely elucidated. Altered plasma metabolite concentrations could partly explain weight loss maintenance mechanisms. In the present work, a systems biology approach has been applied to investigate the potential mechanisms involved in weight loss maintenance within the Diogenes weight-loss intervention study. A genome-wide association study identified SNPs associated with plasma glycine levels within the CPS1 (Carbamoyl-Phosphate Synthase 1) gene (rs10206976, p-value = 4.709e-11, and rs12613336, p-value = 1.368e-08). Furthermore, gene expression in the adipose tissue showed that CPS1 expression levels were associated with successful weight maintenance and with several SNPs within CPS1 (cis-eQTL). In order to contextualize these results, a gene-metabolite interaction network of CPS1 and glycine has been built and analyzed, showing functional enrichment in genes involved in lipid metabolism and the one-carbon pool by folate pathways. CPS1 is the rate-limiting enzyme of the urea cycle, catalyzing the synthesis of carbamoyl phosphate from ammonia and bicarbonate in the mitochondria. Glycine and CPS1 are connected through the one-carbon pool by folate pathway and the urea cycle. Furthermore, glycine could be linked to metabolic health and insulin sensitivity through the betaine osmolyte. These considerations, and the results of the present study, highlight a possible role of CPS1 and related pathways in weight loss maintenance, suggesting that it might be partly genetically determined in humans.

  13. Network Analysis of Metabolite GWAS Hits: Implication of CPS1 and the Urea Cycle in Weight Maintenance

    Science.gov (United States)

    Carayol, Jerome; Fazelzadeh, Parastoo; Lefebvre, Gregory; Valsesia, Armand; Charon, Celine; Vervoort, Jacques; Astrup, Arne; Saris, Wim H. M.; Morine, Melissa; Hager, Jörg

    2016-01-01

    Background and Scope Weight loss success depends on the ability to refrain from regaining the lost weight over time. This ability was shown to be largely variable among individuals, and these differences, with their underlying molecular processes, are diverse and not completely elucidated. Altered plasma metabolite concentrations could partly explain weight loss maintenance mechanisms. In the present work, a systems biology approach has been applied to investigate the potential mechanisms involved in weight loss maintenance within the Diogenes weight-loss intervention study. Methods and Results A genome-wide association study identified SNPs associated with plasma glycine levels within the CPS1 (Carbamoyl-Phosphate Synthase 1) gene (rs10206976, p-value = 4.709e-11, and rs12613336, p-value = 1.368e-08). Furthermore, gene expression in the adipose tissue showed that CPS1 expression levels were associated with successful weight maintenance and with several SNPs within CPS1 (cis-eQTL). In order to contextualize these results, a gene-metabolite interaction network of CPS1 and glycine has been built and analyzed, showing functional enrichment in genes involved in lipid metabolism and in the one-carbon pool by folate pathway. Conclusions CPS1 is the rate-limiting enzyme of the urea cycle, catalyzing the synthesis of carbamoyl phosphate from ammonia and bicarbonate in the mitochondria. Glycine and CPS1 are connected through the one-carbon pool by folate pathway and the urea cycle. Furthermore, glycine could be linked to metabolic health and insulin sensitivity through the betaine osmolyte. These considerations, and the results from the present study, highlight a possible role of CPS1 and related pathways in weight loss maintenance, suggesting that it might be partly genetically determined in humans. PMID:26938218

  14. Preprocessing of GPR data for syntactic landmine detection and classification

    Science.gov (United States)

    Nasif, Ahmed O.; Hintz, Kenneth J.; Peixoto, Nathalia

    2010-04-01

    Syntactic pattern recognition is being used to detect and classify non-metallic landmines in terms of their range impedance discontinuity profile. This profile, extracted from the ground penetrating radar's return signal, constitutes a high-range-resolution and unique description of the inner structure of a landmine. In this paper, we discuss two preprocessing steps necessary to extract such a profile, namely, inverse filtering (deconvolving) and binarization. We validate the use of an inverse filter to effectively decompose the observed composite signal resulting from the different layers of dielectric materials of a landmine. It is demonstrated that the transmitted radar waveform undergoing multiple reflections with different materials does not change appreciably, and mainly depends on the transmit and receive processing chains of the particular radar being used. Then, a new inversion approach for the inverse filter is presented based on the cumulative contribution of the different frequency components to the original Fourier spectrum. We discuss the tradeoffs and challenges involved in such a filter design. The purpose of the binarization scheme is to localize the impedance discontinuities in range, by assigning a '1' to the peaks of the inverse filtered output, and '0' to all other values. The paper is concluded with simulation results showing the effectiveness of the proposed preprocessing technique.
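The two preprocessing steps described above can be sketched as follows; the cumulative-energy criterion and the peak threshold are illustrative assumptions, not the paper's parameter choices:

```python
import numpy as np
from scipy.signal import find_peaks

def inverse_filter(received, transmitted, energy_fraction=0.95):
    """Deconvolve the transmitted waveform from a received GPR trace.

    Frequency bins are kept in order of decreasing transmit-spectrum
    magnitude until they account for `energy_fraction` of the total
    (a stand-in for the paper's cumulative-contribution criterion);
    the remaining bins are zeroed to avoid amplifying noise.
    """
    n = len(received)
    R = np.fft.rfft(received, n)
    T = np.fft.rfft(transmitted, n)          # zero-padded to n samples
    mag = np.abs(T)
    order = np.argsort(mag)[::-1]
    cum = np.cumsum(mag[order]) / mag.sum()
    keep = order[: np.searchsorted(cum, energy_fraction) + 1]
    H = np.zeros_like(T)
    H[keep] = 1.0 / T[keep]                  # invert only well-supported bins
    return np.fft.irfft(R * H, n)

def binarize(profile, rel_height=0.5):
    """Localize impedance discontinuities in range: '1' at peaks of
    the inverse-filtered output, '0' at all other values."""
    peaks, _ = find_peaks(np.abs(profile),
                          height=rel_height * np.abs(profile).max())
    out = np.zeros(len(profile), dtype=int)
    out[peaks] = 1
    return out
```

Bins where the transmit spectrum carries little energy are never inverted, which is the usual guard against noise blow-up in deconvolution-style inverse filters.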

  15. Dynamics Analysis of a Novel 3-CPS/RPPS Mechanism

    Institute of Scientific and Technical Information of China (English)

    李徽; 杨德华

    2013-01-01

    Multi-degree-of-freedom parallel mechanisms have broad application prospects in many fields of engineering. Building on a completed velocity and acceleration analysis for the kinematics of the novel 3-CPS/RPPS mechanism, the inverse dynamics problem is given a basic theoretical treatment using the Newton-Euler equations as the main tool, and the forward dynamics problem is solved with the aid of the Pro/E dynamics analysis tools. In the course of the analysis, the traditional Stewart platform is compared with this novel serial-parallel mechanism. The analysis method and workflow provide a reference for research on multi-degree-of-freedom mechanisms. Finally, through an error-sensitivity analysis of the active panels of a large radio telescope, the advantages of the mechanism's partially decoupled motion and its application prospects are pointed out.

  16. A highly facile and selective Chemo-Paper-Sensor (CPS) for detection of strontium.

    Science.gov (United States)

    Kang, Sung-Min; Jang, Sung-Chan; Huh, Yun Suk; Lee, Chang-Soo; Roh, Changhyun

    2016-06-01

    Chemosensors have attracted increasing attention for their usefulness in on-site detection and monitoring. In this study, we present a novel, facile, and highly selective Chemo-Paper-Sensor (CPS) for the detection and monitoring of strontium (Sr(2+)) ions: a colorimetric sensor based on a Chrysoidine G (CG)-coated paper strip. Detection with the CPS was quantified simply by reading red-green-blue (RGB) values with portable devices such as a desktop digital scanner or a mobile-phone camera. An orange to dark-orange color transition was observed when the sensor was exposed to Sr(2+) ions, both in aqueous solution and on the solid paper strip. The signal was demonstrated to vary linearly with strontium concentration over the 100 ppb to 500 ppm range, with a detection limit of 200 ppb. We believe that the newly developed Chemo-Paper-Sensor will be useful in a wide range of sensing applications.
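Quantitation with such a strip reduces to a linear calibration of a color reading against known standards; a minimal sketch (the slope, intercept, and concentration values below are invented for illustration, not the paper's data):

```python
import numpy as np

def fit_calibration(conc, signal):
    """Least-squares line signal = a * conc + b over the linear range."""
    a, b = np.polyfit(conc, signal, 1)
    return a, b

def to_concentration(signal, a, b):
    """Invert the calibration line for an unknown sample."""
    return (signal - b) / a

# Hypothetical standards (ppb) and RGB-channel readings -- illustrative only
standards = np.array([100.0, 1000.0, 10000.0, 100000.0, 500000.0])
readings = 2e-4 * standards + 35.0
a, b = fit_calibration(standards, readings)
```

In practice the reading would be one RGB channel (or a ratio of channels) extracted from the scanned strip, averaged over the sensing zone.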

  17. Bridging a Survey Redesign Using Multiple Imputation: An Application to the 2014 CPS ASEC

    Directory of Open Access Journals (Sweden)

    Rothbaum Jonathan

    2017-03-01

    Full Text Available The Current Population Survey Annual Social and Economic Supplement (CPS ASEC) serves as the data source for official income, poverty, and inequality statistics in the United States. In 2014, the CPS ASEC questionnaire was redesigned to improve data quality and to reduce misreporting, item nonresponse, and errors resulting from respondent fatigue. The sample was split into two groups, with nearly 70% receiving the traditional instrument and 30% receiving the redesigned instrument. Due to the relatively small redesign sample, analyses of changes in income and poverty between this and future years may lack sufficient power, especially for subgroups. The traditional sample is treated as if the responses were missing for income sources targeted by the redesign, and multiple imputation is used to generate plausible responses. A flexible imputation technique is used to place individuals into strata along two dimensions: (1) their probability of income recipiency and (2) their expected income conditional on recipiency for each income source. By matching on these two dimensions, this approach combines the ideas of propensity score matching and predictive mean matching. In this article, this approach is implemented, the matching models are evaluated using diagnostics, and the results are analyzed.
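A toy version of the stratified hot deck can be sketched as follows (a simplified single-implicate illustration, not the Census Bureau's implementation; the two matching dimensions are binned by donor-sample quantiles):

```python
import numpy as np

def impute_by_strata(don_p, don_yhat, don_y, rec_p, rec_yhat,
                     n_strata=4, seed=0):
    """Hot-deck imputation on a 2-D stratification.

    Donors are binned jointly by predicted recipiency probability
    (don_p) and predicted income conditional on recipiency (don_yhat);
    each recipient draws an observed income from a donor in the same
    cell, falling back to the nearest donor when a cell is empty.
    """
    rng = np.random.default_rng(seed)
    pq = np.quantile(don_p, np.linspace(0, 1, n_strata + 1))
    yq = np.quantile(don_yhat, np.linspace(0, 1, n_strata + 1))

    def cell(p, yhat):
        i = int(np.clip(np.searchsorted(pq, p) - 1, 0, n_strata - 1))
        j = int(np.clip(np.searchsorted(yq, yhat) - 1, 0, n_strata - 1))
        return i, j

    don_cells = [cell(p, yh) for p, yh in zip(don_p, don_yhat)]
    imputed = []
    for p, yh in zip(rec_p, rec_yhat):
        i, j = cell(p, yh)
        pool = [y for c, y in zip(don_cells, don_y) if c == (i, j)]
        if not pool:  # nearest-cell fallback (Manhattan distance)
            d = [abs(ci - i) + abs(cj - j) for ci, cj in don_cells]
            pool = [don_y[int(np.argmin(d))]]
        imputed.append(rng.choice(pool))
    return np.asarray(imputed)
```

Running this several times with different seeds yields multiple implicates, whose between-implicate variance feeds the usual multiple-imputation variance formulas.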

  18. UGV: security analysis of subsystem control network

    Science.gov (United States)

    Abbott-McCune, Sam; Kobezak, Philip; Tront, Joseph; Marchany, Randy; Wicks, Al

    2013-05-01

    Unmanned ground vehicles (UGVs) are becoming prolific in the heterogeneous superset of robotic platforms. The sensors which provide odometry, localization, perception, and vehicle diagnostics are fused to give the robotic platform a sense of the environment it is traversing. The automotive CAN bus has dominated the industry due to its fault tolerance and a message structure that allows high-priority messages to reach the desired node in a real-time environment. UGVs are being researched and produced at an accelerated rate to perform the arduous, repetitive, and dangerous missions that are associated with military action in a protracted conflict. The technology and applications of this research will inevitably be turned into dual-use platforms to aid civil agencies in the performance of their various operations. Our motivation is the security of the holistic system; however, as subsystems are outsourced in the design, the overall security of the system may be diminished. We focus on the CAN bus topology, the vulnerabilities introduced in UGVs, and recognizable security vulnerabilities that are inherent in the communications architecture. We show how data can be extracted from an add-on CAN bus that can be customized to monitor subsystems. The information can be altered or spoofed to force the vehicle to exhibit unwanted actions or render the UGV unusable for the designed mission. The military relies heavily on technology to maintain information dominance, and the security of the information introduced onto the network by UGVs must be safeguarded from vulnerabilities that can be exploited.

  19. Force protection demining system (FPDS) detection subsystem

    Science.gov (United States)

    Zachery, Karen N.; Schultz, Gregory M.; Collins, Leslie M.

    2005-06-01

    This study describes the U.S. Army Force Protection Demining System (FPDS), a remotely operated, multisensor platform developed for reliable detection and neutralization of both anti-tank and anti-personnel landmines. The ongoing development of the prototype multisensor detection subsystem is presented, which integrates an advanced electromagnetic pulsed-induction array and a ground penetrating synthetic aperture radar array on a single standoff platform. The FPDS detection subsystem is mounted on a robotic rubber-tracked vehicle and incorporates an accurate and precise navigation/positioning module, making it well suited for operation in varied and irregular terrain. Detection sensors are optimally configured to minimize interference without loss in sensitivity or performance. Mine lane test data acquired from the prototype sensors are processed to extract signal- and image-based features for automatic target recognition. Preliminary results using optimal feature and classifier selection indicate the potential of the system to achieve high probabilities of detection while minimizing false alarms. The FPDS detection software system also exploits modern multi-sensor data fusion algorithms to provide real-time detection and discrimination information to the user.

  20. A gene cluster for amylovoran synthesis in Erwinia amylovora: characterization and relationship to cps genes in Erwinia stewartii.

    Science.gov (United States)

    Bernhard, F; Coplin, D L; Geider, K

    1993-05-01

    A large ams gene cluster required for production of the acidic extracellular polysaccharide (EPS) amylovoran by the fire blight pathogen Erwinia amylovora was cloned. Tn5 mutagenesis and gene replacement were used to construct chromosomal ams mutants. Five complementation groups, essential for amylovoran synthesis and virulence in E. amylovora, were identified and designated amsA-E. The ams gene cluster is about 7 kb in size and functionally equivalent to the cps gene cluster involved in EPS synthesis by the related pathogen Erwinia stewartii. Mucoidy and virulence were restored to E. stewartii mutants in four cps complementation groups by the cloned E. amylovora ams genes. Conversely, the E. stewartii cps gene cluster was able to complement mutations in E. amylovora ams genes. Correspondence was found between the amsA-E complementation groups and the cpsB-D region, but the arrangement of the genes appears to be different. EPS production and virulence were also restored to E. amylovora amsE and E. stewartii cpsD mutants by clones containing the Rhizobium meliloti exoA gene.

  1. Simple and Effective Way for Data Preprocessing Selection Based on Design of Experiments.

    Science.gov (United States)

    Gerretzen, Jan; Szymańska, Ewa; Jansen, Jeroen J; Bart, Jacob; van Manen, Henk-Jan; van den Heuvel, Edwin R; Buydens, Lutgarde M C

    2015-12-15

    The selection of optimal preprocessing is among the main bottlenecks in chemometric data analysis. Preprocessing currently is a burden, since a multitude of different preprocessing methods is available for, e.g., baseline correction, smoothing, and alignment, but it is not clear beforehand which method(s) should be used for which data set. The process of preprocessing selection is often limited to trial-and-error and is therefore considered somewhat subjective. In this paper, we present a novel, simple, and effective approach for preprocessing selection. The defining feature of this approach is a design of experiments. On the basis of the design, model performance of a few well-chosen preprocessing methods, and combinations thereof (called strategies) is evaluated. Interpretation of the main effects and interactions subsequently enables the selection of an optimal preprocessing strategy. The presented approach is applied to eight different spectroscopic data sets, covering both calibration and classification challenges. We show that the approach is able to select a preprocessing strategy which improves model performance by at least 50% compared to the raw data; in most cases, it leads to a strategy very close to the true optimum. Our approach makes preprocessing selection fast, insightful, and objective.
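The core of the approach, scoring every on/off combination of candidate steps in a 2^k full factorial design and reading off main effects, can be sketched generically (the step names and score function in the usage are placeholders, not the paper's choices):

```python
import itertools
import numpy as np

def factorial_preprocessing(X, y, steps, score_fn):
    """Run a 2^k full factorial design over on/off preprocessing steps.

    `steps` maps a step name to a function X -> X; `score_fn(Xp, y)`
    returns model performance for the preprocessed data. Returns every
    run plus each step's main effect (mean score with the step on
    minus mean score with it off)."""
    names = list(steps)
    runs = []
    for combo in itertools.product([0, 1], repeat=len(names)):
        Xp = X.copy()
        for on, name in zip(combo, names):
            if on:
                Xp = steps[name](Xp)   # apply the enabled steps in order
        runs.append((dict(zip(names, combo)), score_fn(Xp, y)))
    effects = {
        name: np.mean([s for c, s in runs if c[name] == 1])
            - np.mean([s for c, s in runs if c[name] == 0])
        for name in names
    }
    return runs, effects
```

A positive main effect means switching the step on improves the score on average across all other settings; interactions can be read off the runs table in the same way, which is what makes the selection insightful rather than trial-and-error.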

  2. The Pre-Processing of Images Technique for the Materia

    Directory of Open Access Journals (Sweden)

    Yevgeniy P. Putyatin

    2016-08-01

    Full Text Available Image processing analysis is one of the most powerful tools in various research fields, especially in material/polymer science. In the present article, an attempt has therefore been made to study pre-processing techniques for images of material samples acquired by scanning electron microscopy (SEM). We first prepared material samples with coir fibre (natural) and its polymer composite; image analysis was then performed on the SEM micrographs, and the studies described here were conducted. The results presented were found satisfactory and are in good agreement with our earlier work and with that of other workers in the same field.

  3. A Gender Recognition Approach with an Embedded Preprocessing

    Directory of Open Access Journals (Sweden)

    Md. Mostafijur Rahman

    2015-05-01

    Full Text Available Gender recognition from facial images has become an important practical task. It is one of the central problems of computer vision, and research on it is ongoing. Though several techniques have been proposed, most focus on facial images taken under controlled conditions. Problems arise when classification is performed under uncontrolled conditions such as high noise, poor illumination, etc. To overcome these problems, we propose a new gender recognition framework which first preprocesses and enhances the input images using Adaptive Gamma Correction with Weighting Distribution. We used the Labeled Faces in the Wild (LFW) database for our experiments, which contains real-life images taken under uncontrolled conditions. For measuring the performance of our proposed method, we used the confusion matrix, precision, recall, F-measure, true positive rate (TPR), and false positive rate (FPR). In every case, our proposed framework performs better than other existing state-of-the-art techniques.
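The enhancement step named above, adaptive gamma correction with a weighting distribution, can be sketched as follows; the exponent alpha and the normalization follow common descriptions of AGCWD and are assumptions rather than this paper's exact formulation:

```python
import numpy as np

def agcwd(img, alpha=0.5):
    """Adaptive gamma correction with a weighting distribution (sketch).

    The intensity histogram is reshaped by a weighting distribution,
    and each gray level l is mapped through gamma(l) = 1 - cdf_w(l),
    so the amount of correction adapts to the image's own histogram.
    `img` is a uint8 grayscale array."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    pdf = hist / hist.sum()
    lo, hi = pdf.min(), pdf.max()
    pdf_w = hi * ((pdf - lo) / (hi - lo + 1e-12)) ** alpha  # weighting distribution
    cdf_w = np.cumsum(pdf_w) / (pdf_w.sum() + 1e-12)
    gamma = 1.0 - cdf_w
    levels = np.arange(256) / 255.0
    lut = np.round(255.0 * levels ** gamma).astype(np.uint8)  # per-level mapping
    return lut[img]
```

Because every gamma(l) lies in (0, 1], no pixel is darkened, which is why the method is well suited to the poorly lit images the abstract mentions.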

  4. Constant-overhead secure computation of Boolean circuits using preprocessing

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Zakarias, S.

    2013-01-01

    We present a protocol for securely computing a Boolean circuit C in presence of a dishonest and malicious majority. The protocol is unconditionally secure, assuming a preprocessing functionality that is not given the inputs. For a large number of players the work for each player is the same...... as computing the circuit in the clear, up to a constant factor. Our protocol is the first to obtain these properties for Boolean circuits. On the technical side, we develop new homomorphic authentication schemes based on asymptotically good codes with an additional multiplication property. We also show a new...... algorithm for verifying the product of Boolean matrices in quadratic time with exponentially small error probability, where previous methods only achieved constant error....
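The division of labor described above, heavy input-independent preprocessing followed by a cheap online phase, can be illustrated with the classic Beaver-triple trick for a single AND gate (a two-party, passively secure toy with a trusted dealer, not the paper's maliciously secure, constant-overhead protocol):

```python
import secrets

def share(bit):
    """Additive (XOR) 2-party sharing of a bit."""
    r = secrets.randbelow(2)
    return r, bit ^ r

def preprocess_triple():
    """Input-independent preprocessing: shares of random a, b and of
    c = a AND b (produced here by a trusted dealer for illustration;
    realizing this functionality securely is the hard part)."""
    a, b = secrets.randbelow(2), secrets.randbelow(2)
    return share(a), share(b), share(a & b)

def beaver_and(x_sh, y_sh, triple):
    """Online AND of two shared bits, consuming one triple."""
    (a0, a1), (b0, b1), (c0, c1) = triple
    # both parties open the masked values d = x^a and e = y^b
    d = x_sh[0] ^ a0 ^ x_sh[1] ^ a1
    e = y_sh[0] ^ b0 ^ y_sh[1] ^ b1
    # local recombination; party 0 alone adds the public d&e term
    z0 = c0 ^ (d & b0) ^ (e & a0) ^ (d & e)
    z1 = c1 ^ (d & b1) ^ (e & a1)
    return z0, z1

def reconstruct(sh):
    return sh[0] ^ sh[1]
```

Opening d and e leaks nothing because a and b are uniform masks; all the expensive AND-gate work moves into preprocessing, which is the cost the paper drives down to a constant factor per player for Boolean circuits.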

  5. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

    Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data is not relevant and will slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. The existing processing models do not take this aspect into consideration and focus on achieving high calculation performance, which will increase the computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  6. Constant-Overhead Secure Computation of Boolean Circuits using Preprocessing

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Zakarias, Sarah Nouhad Haddad

    We present a protocol for securely computing a Boolean circuit $C$ in presence of a dishonest and malicious majority. The protocol is unconditionally secure, assuming access to a preprocessing functionality that is not given the inputs to compute on. For a large number of players the work done...... by each player is the same as the work needed to compute the circuit in the clear, up to a constant factor. Our protocol is the first to obtain these properties for Boolean circuits. On the technical side, we develop new homomorphic authentication schemes based on asymptotically good codes...... with an additional multiplication property. We also show a new algorithm for verifying the product of Boolean matrices in quadratic time with exponentially small error probability, where previous methods would only give a constant error....

  7. Pre-processing in AI based Prediction of QSARs

    CERN Document Server

    Patri, Om Prasad

    2009-01-01

    Machine learning, data mining and artificial intelligence (AI) based methods have been used to determine the relations between chemical structure and biological activity, called quantitative structure activity relationships (QSARs) for the compounds. Pre-processing of the dataset, which includes the mapping from a large number of molecular descriptors in the original high dimensional space to a small number of components in the lower dimensional space while retaining the features of the original data, is the first step in this process. A common practice is to use a mapping method for a dataset without prior analysis. This pre-analysis has been stressed in our work by applying it to two important classes of QSAR prediction problems: drug design (predicting anti-HIV-1 activity) and predictive toxicology (estimating hepatocarcinogenicity of chemicals). We apply one linear and two nonlinear mapping methods on each of the datasets. Based on this analysis, we conclude the nature of the inherent relationships betwee...

  8. Digital soil mapping: strategy for data pre-processing

    Directory of Open Access Journals (Sweden)

    Alexandre ten Caten

    2012-08-01

    Full Text Available The region of greatest variability on soil maps is along the edge of their polygons, causing disagreement among pedologists about the appropriate description of soil classes at these locations. The objective of this work was to propose a strategy for data pre-processing applied to digital soil mapping (DSM. Soil polygons on a training map were shrunk by 100 and 160 m. This strategy prevented the use of covariates located near the edge of the soil classes for the Decision Tree (DT models. Three DT models derived from eight predictive covariates, related to relief and organism factors sampled on the original polygons of a soil map and on polygons shrunk by 100 and 160 m were used to predict soil classes. The DT model derived from observations 160 m away from the edge of the polygons on the original map is less complex and has a better predictive performance.
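On a rasterized soil map the shrinking strategy amounts to eroding each class region inward; a sketch (the paper works with map polygons, so raster-mask erosion is an assumed stand-in, and the cell size implied by n_cells is likewise an assumption):

```python
import numpy as np
from scipy.ndimage import binary_erosion

def shrink_classes(label_map, n_cells, nodata=0):
    """Erode every soil-class region inward by n_cells pixels so that
    training samples near class edges are excluded. For a 100 m
    shrink on an (assumed) 20 m grid, n_cells would be 5."""
    out = np.full_like(label_map, nodata)
    structure = np.ones((3, 3), bool)       # 8-connected erosion
    for cls in np.unique(label_map):
        if cls == nodata:
            continue
        mask = binary_erosion(label_map == cls, structure, iterations=n_cells)
        out[mask] = cls
    return out
```

Cells eroded away fall back to `nodata` and are simply never sampled when training the decision tree, mirroring the 100 m and 160 m shrinks in the study.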

  9. Real-Time Rendering of Teeth with No Preprocessing

    DEFF Research Database (Denmark)

    Larsen, Christian Thode; Frisvad, Jeppe Revall; Jensen, Peter Dahl Ejby

    2012-01-01

    We present a technique for real-time rendering of teeth with no need for computational or artistic preprocessing. Teeth constitute a translucent material consisting of several layers; a highly scattering material (dentine) beneath a semitransparent layer (enamel) with a transparent coating (saliva......). In this study we examine how light interacts with this multilayered structure. In the past, rendering of teeth has mostly been done using image-based texturing or volumetric scans. We work with surface scans and have therefore developed a simple way of estimating layer thicknesses. We use scattering properties...... based on measurements reported in the optics literature, and we compare rendered results qualitatively to images of ceramic teeth created by denturists....

  10. Sparse and Unique Nonnegative Matrix Factorization Through Data Preprocessing

    CERN Document Server

    Gillis, Nicolas

    2012-01-01

    Nonnegative matrix factorization (NMF) has become a very popular technique in machine learning because it automatically extracts meaningful features through a sparse and part-based representation. However, NMF has the drawback of being highly ill-posed, that is, there typically exist many different but equivalent factorizations. In this paper, we introduce a completely new way of obtaining more well-posed NMF problems whose solutions are sparser. Our technique is based on the preprocessing of the nonnegative input data matrix, and relies on the theory of M-matrices and the geometric interpretation of NMF. This approach provably leads to optimal and sparse solutions under the separability assumption of Donoho and Stodden (NIPS, 2003), and, for rank-three matrices, makes the number of exact factorizations finite. We illustrate the effectiveness of our technique on several image datasets.

  11. Statistics in experimental design, preprocessing, and analysis of proteomics data.

    Science.gov (United States)

    Jung, Klaus

    2011-01-01

    High-throughput experiments in proteomics, such as 2-dimensional gel electrophoresis (2-DE) and mass spectrometry (MS), usually yield high-dimensional data sets of expression values for hundreds or thousands of proteins which are, however, observed on only a relatively small number of biological samples. Statistical methods for the planning and analysis of experiments are important to avoid false conclusions and to obtain tenable results. In this chapter, the most frequent experimental designs for proteomics experiments are illustrated. In particular, focus is put on studies for the detection of differentially regulated proteins. Furthermore, issues of sample size planning, statistical analysis of expression levels as well as methods for data preprocessing are covered.

  12. Preprocessing in a Tiered Sensor Network for Habitat Monitoring

    Directory of Open Access Journals (Sweden)

    Hanbiao Wang

    2003-03-01

    Full Text Available We investigate task decomposition and collaboration in a two-tiered sensor network for habitat monitoring. The system recognizes and localizes a specified type of birdcalls. The system has a few powerful macronodes in the first tier, and many less powerful micronodes in the second tier. Each macronode combines data collected by multiple micronodes for target classification and localization. We describe two types of lightweight preprocessing which significantly reduce data transmission from micronodes to macronodes. Micronodes classify events according to their cross-zero rates and discard irrelevant events. Data about events of interest is reduced and compressed before being transmitted to macronodes for target localization. Preliminary experiments illustrate the effectiveness of event filtering and data reduction at micronodes.
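The micronode-side event filter described above can be sketched in a few lines; the zero-crossing band used here is illustrative, not the deployed thresholds:

```python
import numpy as np

def cross_zero_rate(frame):
    """Fraction of adjacent sample pairs whose signs differ."""
    signs = np.signbit(frame)
    return float(np.mean(signs[1:] != signs[:-1]))

def is_candidate_event(frame, lo=0.05, hi=0.35):
    """Micronode-side filter: keep frames whose cross-zero rate falls
    in a band typical of tonal birdcalls (thresholds illustrative);
    everything else is discarded before any data is transmitted."""
    return lo <= cross_zero_rate(frame) <= hi
```

A 440 Hz tone sampled at 8 kHz crosses zero about 880 times per second (rate ~0.11) and passes, while broadband noise changes sign on roughly half of all sample pairs and is rejected, which is exactly the kind of cheap pre-filter that cuts micronode-to-macronode traffic.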

  13. Data acquisition and preprocessing techniques for remote sensing field research

    Science.gov (United States)

    Biehl, L. L.; Robinson, B. F.

    1983-01-01

    A crops and soils data base has been developed at Purdue University's Laboratory for Applications of Remote Sensing using spectral and agronomic measurements made by several government and university researchers. The data are being used to (1) quantitatively determine the relationships of spectral and agronomic characteristics of crops and soils, (2) define future sensor systems, and (3) develop advanced data analysis techniques. Researchers follow defined data acquisition and preprocessing techniques to provide fully annotated and calibrated sets of spectral, agronomic, and meteorological data. These procedures enable the researcher to combine his data with that acquired by other researchers for remote sensing research. The key elements or requirements for developing a field research data base of spectral data that can be transported across sites and years are appropriate experiment design, accurate spectral data calibration, defined field procedures, and thorough experiment documentation.

  14. Radar image preprocessing. [of SEASAT-A SAR data

    Science.gov (United States)

    Frost, V. S.; Stiles, J. A.; Holtzman, J. C.; Held, D. N.

    1980-01-01

    Standard image processing techniques are not applicable to radar images because of the coherent nature of the sensor. Therefore there is a need to develop preprocessing techniques for radar images which will then allow these standard methods to be applied. A random field model for radar image data is developed. This model describes the image data as the result of a multiplicative-convolved process. Standard techniques, those based on additive noise and homomorphic processing are not directly applicable to this class of sensor data. Therefore, a minimum mean square error (MMSE) filter was designed to treat this class of sensor data. The resulting filter was implemented in an adaptive format to account for changes in local statistics and edges. A radar image processing technique which provides the MMSE estimate inside homogeneous areas and tends to preserve edge structure was the result of this study. Digitally correlated Seasat-A synthetic aperture radar (SAR) imagery was used to test the technique.
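A filter in the spirit described, a locally adaptive MMSE estimate for multiplicative noise that reverts to the observation at edges, can be sketched as a Lee-style filter (the window size and number-of-looks parameter are assumptions, not the paper's exact design):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_mmse(img, size=7, looks=1):
    """Adaptive MMSE despeckling in the Lee style.

    Each pixel becomes a blend of its value and the local mean; the
    weight grows with estimated scene variance, so homogeneous areas
    get full MMSE smoothing while edge structure stays close to the
    observation."""
    mean = uniform_filter(img, size)
    sq_mean = uniform_filter(img * img, size)
    var = np.maximum(sq_mean - mean * mean, 0.0)
    noise_var = (mean * mean) / looks            # multiplicative speckle model
    sig_var = np.maximum(var - noise_var, 0.0)   # estimated scene variance
    w = sig_var / (sig_var + noise_var + 1e-12)  # ~0 in flat areas, ->1 at edges
    return mean + w * (img - mean)
```

Because the weight is recomputed in every window, the filter adapts to local statistics and edges in the way the abstract describes for the MMSE estimator.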

  15. Multiple Criteria Decision-Making Preprocessing Using Data Mining Tools

    CERN Document Server

    Mosavi, A

    2010-01-01

    Real-life engineering optimization problems need Multiobjective Optimization (MOO) tools. These problems are highly nonlinear. As the process of Multiple Criteria Decision-Making (MCDM) is much expanded most MOO problems in different disciplines can be classified on the basis of it. Thus MCDM methods have gained wide popularity in different sciences and applications. Meanwhile the increasing number of involved components, variables, parameters, constraints and objectives in the process, has made the process very complicated. However the new generation of MOO tools has made the optimization process more automated, but still initializing the process and setting the initial value of simulation tools and also identifying the effective input variables and objectives in order to reach the smaller design space are still complicated. In this situation adding a preprocessing step into the MCDM procedure could make a huge difference in terms of organizing the input variables according to their effects on the optimizati...

  16. Preprocessing Solar Images while Preserving their Latent Structure

    CERN Document Server

    Stein, Nathan M; Kashyap, Vinay L

    2015-01-01

    Telescopes such as the Atmospheric Imaging Assembly aboard the Solar Dynamics Observatory, a NASA satellite, collect massive streams of high resolution images of the Sun through multiple wavelength filters. Reconstructing pixel-by-pixel thermal properties based on these images can be framed as an ill-posed inverse problem with Poisson noise, but this reconstruction is computationally expensive and there is disagreement among researchers about what regularization or prior assumptions are most appropriate. This article presents an image segmentation framework for preprocessing such images in order to reduce the data volume while preserving as much thermal information as possible for later downstream analyses. The resulting segmented images reflect thermal properties but do not depend on solving the ill-posed inverse problem. This allows users to avoid the Poisson inverse problem altogether or to tackle it on each of $\\sim$10 segments rather than on each of $\\sim$10$^7$ pixels, reducing computing time by a facto...

  17. Prediction of speech intelligibility based on an auditory preprocessing model

    DEFF Research Database (Denmark)

    Christiansen, Claus Forup Corlin; Pedersen, Michael Syskind; Dau, Torsten

    2010-01-01

    Classical speech intelligibility models, such as the speech transmission index (STI) and the speech intelligibility index (SII) are based on calculations on the physical acoustic signals. The present study predicts speech intelligibility by combining a psychoacoustically validated model of auditory...... preprocessing [Dau et al., 1997. J. Acoust. Soc. Am. 102, 2892-2905] with a simple central stage that describes the similarity of the test signal with the corresponding reference signal at a level of the internal representation of the signals. The model was compared with previous approaches, whereby a speech...... in noise experiment was used for training and an ideal binary mask experiment was used for evaluation. All three models were able to capture the trends in the speech in noise training data well, but the proposed model provides a better prediction of the binary mask test data, particularly when the binary...

  18. HYBRID FUEL CELL-SOLAR CELL SPACE POWER SUBSYSTEM CAPABILITY.

    Science.gov (United States)

    This report outlines the capabilities and limitations of a hybrid solar cell-fuel cell space power subsystem by comparing the proposed hybrid system to conventional power subsystem devices. The comparisons are based on projected 1968 capability in the areas of primary and secondary battery, fuel cell, solar cell, and chemical dynamic power subsystems. The purpose of the investigation was to determine the relative merits of a hybrid power

  19. Local subsystems in gauge theory and gravity

    CERN Document Server

    Donnelly, William

    2016-01-01

    We consider the problem of defining localized subsystems in gauge theory and gravity. Such systems are associated to spacelike hypersurfaces with boundaries and provide the natural setting for studying entanglement entropy of regions of space. We present a general formalism to associate a gauge-invariant classical phase space to a spatial slice with boundary by introducing new degrees of freedom on the boundary. In Yang-Mills theory the new degrees of freedom are a choice of gauge on the boundary, transformations of which are generated by the normal component of the nonabelian electric field. In general relativity the new degrees of freedom are the location of a codimension-2 surface and a choice of conformal normal frame. These degrees of freedom transform under a group of surface symmetries, consisting of diffeomorphisms of the codimension-2 boundary, and position-dependent linear deformations of its normal plane. We find the observables which generate these symmetries, consisting of the conformal normal me...

  20. Development of light metals automotive structural subsystems

    Energy Technology Data Exchange (ETDEWEB)

    Luo, A.A.; Sachdev, A.K. [General Motors Research and Development Center, Warren, MI (United States)

    2007-07-01

    Key technological developments in aluminum and magnesium alloys were reviewed in relation to the manufacturing processes that enable lightweight automotive structural subsystems. Examples included the materials and processes evolution of lightweight body structures, chassis systems, and instrument panel beams. New aluminum and magnesium alloys and manufacturing technologies used to reduce mass and improve performance in vehicle cradle structures were discussed. Hydroforming processes used to enable the use of lightweight aluminum alloy tubes in automotive body structures were also reviewed, in addition to body architectures enabled by different materials and manufacturing processes. The review noted that magnesium instrument panels are now being designed to provide significant performance improvement, reduced vibration, and enhanced crashworthiness in new automobiles. It was concluded that vehicles will incorporate more lightweight materials such as nanocomposites and aluminum and magnesium sheets. 9 refs., 10 figs.

  1. Plant development, auxin, and the subsystem incompleteness theorem.

    Science.gov (United States)

    Niklas, Karl J; Kutschera, Ulrich

    2012-01-01

    Plant morphogenesis (the process whereby form develops) requires signal cross-talking among all levels of organization to coordinate the operation of metabolic and genomic subsystems operating in a larger network of subsystems. Each subsystem can be rendered as a logic circuit supervising the operation of one or more signal-activated systems. This approach simplifies complex morphogenetic phenomena and allows for their aggregation into diagrams of progressively larger networks. This technique is illustrated here by rendering two logic circuits and signal-activated subsystems, one for auxin (IAA) polar/lateral intercellular transport and another for IAA-mediated cell wall loosening. For each of these phenomena, a circuit/subsystem diagram highlights missing components (either in the logic circuit or in the subsystem it supervises) that must be identified experimentally if each of these basic plant phenomena is to be fully understood. We also illustrate the "subsystem incompleteness theorem," which states that no subsystem is operationally self-sufficient. Indeed, a whole-organism perspective is required to understand even the most simple morphogenetic process, because, when isolated, every biological signal-activated subsystem is morphogenetically ineffective.

  2. The American Cancer Society's Cancer Prevention Study 3 (CPS-3): Recruitment, study design, and baseline characteristics.

    Science.gov (United States)

    Patel, Alpa V; Jacobs, Eric J; Dudas, Daniela M; Briggs, Peter J; Lichtman, Cari J; Bain, Elizabeth B; Stevens, Victoria L; McCullough, Marjorie L; Teras, Lauren R; Campbell, Peter T; Gaudet, Mia M; Kirkland, Elizabeth G; Rittase, Melissa H; Joiner, Nance; Diver, W Ryan; Hildebrand, Janet S; Yaw, Nancy C; Gapstur, Susan M

    2017-06-01

    Prospective cohort studies contribute importantly to understanding the role of lifestyle, genetic, and other factors in chronic disease etiology. The American Cancer Society (ACS) recruited a new prospective cohort study, Cancer Prevention Study 3 (CPS-3), between 2006 and 2013 from 35 states and Puerto Rico. Enrollment took place primarily at ACS community events and at community enrollment "drives." At enrollment sites, participants completed a brief survey that included an informed consent, identifying information necessary for follow-up, and key exposure information. They also provided a waist measure and a nonfasting blood sample. Most participants also completed a more comprehensive baseline survey at home that included extensive medical, lifestyle, and other information. Participants will be followed for incident cancers through linkage with state cancer registries and for cause-specific mortality through linkage with the National Death Index. In total, 303,682 participants were enrolled. Of these, 254,650 completed the baseline survey and are considered "fully" enrolled; they will be sent repeat surveys periodically for at least the next 20 years to update exposure information. The remaining participants (n = 49,032) will not be asked to update exposure information but will be followed for outcomes. Twenty-three percent of participants were men, 17.3% reported a race or ethnicity other than "white," and the median age at enrollment was 47 years. CPS-3 will be a valuable resource for studies of cancer and other outcomes because of its size; its diversity with respect to age, ethnicity, and geography; and the availability of blood samples and detailed questionnaire information collected over time. Cancer 2017;123:2014-2024. © 2017 American Cancer Society.

  3. Geostatistics: application of kriging and conditional simulation techniques in the reservoirs CPS-2 of the Carmopolis Field, Sergipe State, Brazil; Geoestatistica: Aplicacao das tecnicas de krigagem e simulacao condicional nos reservatorios CPS-2 do Campo de Carmopolis, Sergipe, Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Souza, M.J. de [PETROBRAS, Rio de Janeiro, RJ (Brazil); Marcotte, D. [Montreal Univ., PQ (Canada). Dept. de Genie Mineral

    1992-07-01

    The parameters used to evaluate well oil production are described. A global production estimate for the CPS-2 reservoirs (PETROBRAS) of the Carmopolis Field, obtained using geostatistical techniques, and a comparative study of oil production simulation using kriging techniques are also presented. 16 figs., 2 tabs., 30 refs.

  4. Performance of Pre-processing Schemes with Imperfect Channel State Information

    DEFF Research Database (Denmark)

    Christensen, Søren Skovgaard; Kyritsi, Persa; De Carvalho, Elisabeth

    2006-01-01

    Pre-processing techniques have several benefits when the CSI is perfect. In this work we investigate three linear pre-processing filters, assuming imperfect CSI caused by noise degradation and channel temporal variation. Results indicate that the LMMSE filter achieves the lowest BER and the high...

  5. A New Indicator for Optimal Preprocessing and Wavelengths Selection of Near-Infrared Spectra

    NARCIS (Netherlands)

    Skibsted, E.; Boelens, H.F.M.; Westerhuis, J.A.; Witte, D.T.; Smilde, A.K.

    2004-01-01

    Preprocessing of near-infrared spectra to remove unwanted, i.e., non-related spectral variation and selection of informative wavelengths is considered to be a crucial step prior to the construction of a quantitative calibration model. The standard methodology when comparing various preprocessing

  7. Ensemble preprocessing of near-infrared (NIR) spectra for multivariate calibration.

    Science.gov (United States)

    Xu, Lu; Zhou, Yan-Ping; Tang, Li-Juan; Wu, Hai-Long; Jiang, Jian-Hui; Shen, Guo-Li; Yu, Ru-Qin

    2008-06-01

    Preprocessing of raw near-infrared (NIR) spectral data is indispensable in multivariate calibration when the measured spectra are subject to significant noise, baselines and other undesirable factors. However, due to the lack of sufficient prior information and an incomplete knowledge of the raw data, NIR spectra preprocessing in multivariate calibration is still trial and error. How to select a proper method depends largely on both the nature of the data and the expertise and experience of the practitioners. This might limit the applications of multivariate calibration in many fields, where researchers are not very familiar with the characteristics of many preprocessing methods unique to chemometrics and have difficulty selecting the most suitable methods. Another problem is that many preprocessing methods, when used alone, might degrade the data in certain aspects or lose some useful information while improving certain qualities of the data. In order to tackle these problems, this paper proposes a new concept of data preprocessing, the ensemble preprocessing method, where partial least squares (PLS) models built on differently preprocessed data are combined by Monte Carlo cross validation (MCCV) stacked regression. Little or no prior information about the data and little expertise are required. Moreover, fusion of the complementary information obtained by different preprocessing methods often leads to a more stable and accurate calibration model. The investigation of two real data sets has demonstrated the advantages of the proposed method.
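
    The stacking idea described above can be sketched in a few lines: base models are fitted to differently preprocessed copies of the data, and their holdout predictions are combined with stacking weights. This is a minimal illustration, not the paper's implementation; ordinary least squares stands in for PLS, the derivative/SNV preprocessors and the synthetic spectra are assumptions, and a single holdout split replaces Monte Carlo cross validation for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 60 samples x 16 wavelengths. The analyte level y adds a
# Gaussian peak; a random per-sample baseline is the nuisance to be removed.
n, p = 60, 16
y = rng.uniform(0.0, 1.0, n)
x_axis = np.linspace(0.0, 1.0, p)
peak = np.exp(-((x_axis - 0.5) ** 2) / 0.005)
X = y[:, None] * peak[None, :] + rng.uniform(0.0, 1.0, (n, 1)) \
    + rng.normal(0.0, 0.01, (n, p))

def first_derivative(X):
    # Removes any constant (per-sample) baseline.
    return np.diff(X, axis=1)

def snv(X):
    # Standard normal variate: per-spectrum centering and scaling.
    return (X - X.mean(axis=1, keepdims=True)) / X.std(axis=1, keepdims=True)

def fit_ls(X, y):
    # Ordinary least squares with intercept (a stand-in for PLS).
    A = np.hstack([np.ones((len(X), 1)), X])
    return np.linalg.lstsq(A, y, rcond=None)[0]

def predict_ls(coef, X):
    return np.hstack([np.ones((len(X), 1)), X]) @ coef

# Fit one base model per preprocessing method on the training part ...
tr, ho = np.arange(40), np.arange(40, 60)
preprocessors = [first_derivative, snv]
base = [fit_ls(f(X[tr]), y[tr]) for f in preprocessors]

# ... then learn stacking weights from the base models' holdout predictions.
P = np.column_stack([predict_ls(c, f(X[ho])) for c, f in zip(base, preprocessors)])
w = np.linalg.lstsq(P, y[ho], rcond=None)[0]
combined = P @ w

rmse = float(np.sqrt(np.mean((combined - y[ho]) ** 2)))
print("stacking weights:", np.round(w, 2), "holdout RMSE:", round(rmse, 3))
```

    Because SNV normalizes away the peak amplitude in this toy data, its base model contributes little, and the stacking step learns to lean on the derivative model; that is the kind of automatic weighting the ensemble approach relies on.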

  9. Automatic selection of preprocessing methods for improving predictions on mass spectrometry protein profiles.

    Science.gov (United States)

    Pelikan, Richard C; Hauskrecht, Milos

    2010-11-13

    Mass spectrometry proteomic profiling has potential to be a useful clinical screening tool. One obstacle is providing a standardized method for preprocessing the noisy raw data. We have developed a system for automatically determining a set of preprocessing methods among several candidates. Our system's automated nature relieves the analyst of the need to be knowledgeable about which methods to use on any given dataset. Each stage of preprocessing is approached with many competing methods. We introduce metrics which are used to balance each method's attempts to correct noise versus preserving valuable discriminative information. We demonstrate the benefit of our preprocessing system on several SELDI and MALDI mass spectrometry datasets. Downstream classification is improved when using our system to preprocess the data.
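
    The core of such an automated selection system can be illustrated with a toy sketch: score each candidate preprocessing method by cross-validated accuracy of a downstream classifier and keep the winner. The candidates, the nearest-centroid classifier, and the synthetic data below are all illustrative assumptions, not the authors' system.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "spectra": class information sits in a small peak, while a large
# per-sample baseline offset is the noise a preprocessing step should correct.
n, p = 80, 30
labels = np.repeat([0, 1], n // 2)
X = rng.normal(0.0, 0.3, (n, p)) + rng.uniform(-2.0, 2.0, (n, 1))
X[labels == 1, 10:15] += 1.0  # class-discriminative peak

def identity(X):
    return X

def baseline_correct(X):
    # Subtract each sample's mean intensity (removes the baseline offset).
    return X - X.mean(axis=1, keepdims=True)

def cv_accuracy(X, y, k=5):
    """k-fold cross-validated accuracy of a nearest-centroid classifier."""
    folds = np.array_split(rng.permutation(len(y)), k)
    correct = 0
    for f in folds:
        tr = np.setdiff1d(np.arange(len(y)), f)
        c0 = X[tr][y[tr] == 0].mean(axis=0)
        c1 = X[tr][y[tr] == 1].mean(axis=0)
        pred = (np.linalg.norm(X[f] - c1, axis=1)
                < np.linalg.norm(X[f] - c0, axis=1)).astype(int)
        correct += int((pred == y[f]).sum())
    return correct / len(y)

# Score every candidate and keep the one with the best downstream accuracy.
scores = {fn.__name__: cv_accuracy(fn(X), labels)
          for fn in (identity, baseline_correct)}
best = max(scores, key=scores.get)
print(scores, "->", best)
```

    The real system additionally balances noise correction against preservation of discriminative information with dedicated metrics; the cross-validated score here is only the simplest possible stand-in for that trade-off.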

  10. Giada improved calibration of measurement subsystems

    Science.gov (United States)

    Della Corte, V.; Rotundi, A.; Sordini, R.; Accolla, M.; Ferrari, M.; Ivanovski, S.; Lucarelli, F.; Mazzotta Epifani, E.; Palumbo, P.

    2014-12-01

    GIADA (Grain Impact Analyzer and Dust Accumulator) is an in-situ instrument devoted to measuring the dynamical properties of the dust grains emitted by the comet. An extended calibration activity using the GIADA Flight Spare Model has been carried out, taking into account the knowledge gained through the analyses of IDPs and cometary samples returned from comet 81P/Wild 2. GIADA consists of three measurement subsystems: the Grain Detection System, an optical device measuring the optical cross-section of individual dust grains; the Impact Sensor, an aluminum plate connected to 5 piezo-sensors measuring the momentum of single impacting dust grains; and the Micro Balance System, measuring the cumulative deposition in time of dust grains smaller than 10 μm. The results of the analyses on data acquired with the GIADA PFM and the comparison with calibration data acquired during the pre-launch campaign allowed us to improve GIADA's performance and capabilities. We will report the results of the following main activities: a) definition of a correlation between the 2 GIADA models (the PFM housed in the laboratory and the In-Flight Model on board ROSETTA); b) characterization of the sub-systems' performance (signal elaboration, sensitivities, space environment effects); c) new calibration measurements and related curves obtained by means of the PFM model using realistic cometary dust analogues. Acknowledgements: GIADA was built by a consortium led by the Univ. Napoli "Parthenope" & INAF-Oss. Astr. Capodimonte, IT, in collaboration with the Inst. de Astrofisica de Andalucia, ES, Selex-ES s.p.a. and SENER. GIADA is presently managed & operated by Ist. di Astrofisica e Planetologia Spaziali-INAF, IT. GIADA was funded and managed by the Agenzia Spaziale Italiana, IT, with the support of the Spanish Ministry of Education and Science MEC, ES. GIADA was developed from a University of Kent, UK, PI proposal; sci. & tech. contribution given by CISAS, IT, Lab. d'Astr. Spat., FR, and Institutions from UK, IT, FR, DE and USA. We thank

  11. Development of computer simulation models for pedestrian subsystem impact tests

    NARCIS (Netherlands)

    Kant, R.; Konosu, A.; Ishikawa, H.

    2000-01-01

    The European Enhanced Vehicle-safety Committee (EEVC/WG10 and WG17) proposed three component subsystem tests for cars to assess pedestrian protection. The objective of this study is to develop computer simulation models of the EEVC pedestrian subsystem tests. These models are available to develop a

  12. Observations in the SPS of beam with various longitudinal parameters and extracted from the CPS using CT

    CERN Document Server

    Bohl, T

    2009-01-01

    In view of extracting fixed-target-type beams for SPS fixed target physics or CNGS operation with the Multi-Turn-Extraction (MTE) scheme, beams with certain sets of longitudinal parameters were produced in the CPS and their transmission in the SPS was studied.

  13. Autophosphorylation of the Bacterial Tyrosine-Kinase CpsD Connects Capsule Synthesis with the Cell Cycle in Streptococcus pneumoniae

    NARCIS (Netherlands)

    Nourikyan, Julien; Kjos, Morten; Mercy, Chryslène; Cluzel, Caroline; Morlot, Cécile; Noirot-Gros, Marie-Francoise; Guiral, Sébastien; Lavergne, Jean-Pierre; Veening, Jan-Willem; Grangeasse, Christophe

    2015-01-01

    Bacterial capsular polysaccharides (CPS) are produced by a multi-protein membrane complex, in which a particular type of tyrosine-autokinases named BY-kinases, regulate their polymerization and export. However, our understanding of the role of BY-kinases in these processes remains incomplete. In the

  14. Effects of Combined Elicitors on Tanshinone Metabolic Profiling and SmCPS Expression in Salvia miltiorrhiza Hairy Root Cultures

    Directory of Open Access Journals (Sweden)

    Yujia Liu

    2013-06-01

    Full Text Available Tanshinones are abietane-type norditerpenoid quinone natural products found in a well-known traditional Chinese medicinal herb, Salvia miltiorrhiza Bunge. The copalyl diphosphate synthase of S. miltiorrhiza (SmCPS) is the key enzyme in the first step of the transformation of geranylgeranyl diphosphate (GGPP) into miltiradiene, which has recently been identified as the precursor of tanshinones. Based on a previous gene-to-metabolite network, this study examined the influence of various combined elicitors on the expression of SmCPS and the production of tanshinones in S. miltiorrhiza hairy root cultures. Combined elicitors were composed of three classes of elicitors: a heavy metal ion (Ag+), a polysaccharide (yeast extract, YE), and a plant response-signalling compound (methyl jasmonate, MJ). YE + Ag+, Ag+ + MJ, YE + MJ, and YE + Ag+ + MJ were the combinations we tested. The effect of elicitors on the SmCPS expression level was detected by quantitative real-time PCR (qRT-PCR), and the tanshinone accumulation responses to elicitation were analysed by Ultra Performance Liquid Chromatography (UPLC) metabolite profiling. With these combined elicitors, the expression of SmCPS was significantly enhanced by elicitation, especially at 24 h and 36 h. Of the four tanshinones detected, the contents of cryptotanshinone and dihydrotanshinone I were enhanced by treatment with YE + Ag+, Ag+ + MJ, and YE + Ag+ + MJ. Our results indicate that appropriate combined elicitors can enhance tanshinone production in hairy root cultures.

  15. Simulating the Various Subsystems of a Coal Mine

    Directory of Open Access Journals (Sweden)

    V. Okolnishnikov

    2016-06-01

    Full Text Available A set of simulation models of various subsystems of a coal mine was developed with the help of a new visual interactive simulation system for technological processes. This paper contains a brief description of this simulation system and its possibilities. The main possibilities provided by the simulation system are the quick construction of models from library elements, 3D representation, and the communication of models with actual control systems. Simulation models were developed for various subsystems of a coal mine: underground conveyor network subsystems, pumping subsystems, and coal face subsystems. These models were developed to be used as a quality and reliability assurance tool for new process control systems in coal mining.

  16. Assessing Quality across Health Care Subsystems in Mexico

    Science.gov (United States)

    Puig, Andrea; Pagán, José A.; Wong, Rebeca

    2012-01-01

    Recent healthcare reform efforts in Mexico have focused on the need to improve the efficiency and equity of a fragmented healthcare system. In light of these reform initiatives, there is a need to assess whether healthcare subsystems are effective at providing high-quality healthcare to all Mexicans. Nationally representative household survey data from the 2006 Encuesta Nacional de Salud y Nutrición (National Health and Nutrition Survey) were used to assess perceived healthcare quality across different subsystems. Using a sample of 7234 survey respondents, we found evidence of substantial heterogeneity in healthcare quality assessments across healthcare subsystems favoring private providers over social security institutions. These differences across subsystems remained even after adjusting for socioeconomic, demographic, and health factors. Our analysis suggests that improvements in efficiency and equity can be achieved by assessing the factors that contribute to heterogeneity in quality across subsystems. PMID:19305224

  17. Presence in the IP Multimedia Subsystem

    Directory of Open Access Journals (Sweden)

    Ling Lin

    2007-01-01

    Full Text Available With an ever-increasing penetration of Internet Protocol (IP) technologies, the wireless industry is evolving the mobile core network towards an all-IP network. The IP Multimedia Subsystem (IMS) is a standardised Next Generation Network (NGN) architectural framework defined by the 3rd Generation Partnership Project (3GPP) to bridge the gap between circuit-switched and packet-switched networks and consolidate both sides into one single all-IP network for all services. In this paper, we provide an insight into the limitations of the presence service, one of the fundamental building blocks of the IMS. Our prototype-based study is unique of its kind and helps identify the factors which limit the scalability of the current version of the presence service (3GPP TS 23.141 version 7.2.0 Release 7 [1]), which will in turn dramatically limit the performance of advanced IMS services. We argue that the client-server paradigm behind the current IMS architecture does not suit the requirements of the IMS system, which defies the very purpose of its introduction. We finally elaborate on possible avenues for addressing this problem.

  18. ASAP: an environment for automated preprocessing of sequencing data

    Directory of Open Access Journals (Sweden)

    Torstenson Eric S

    2013-01-01

    Full Text Available Abstract Background Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing these data can significantly delay downstream analysis and increase the possibility of human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process in its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement, without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  19. Breast image pre-processing for mammographic tissue segmentation.

    Science.gov (United States)

    He, Wenda; Hogg, Peter; Juette, Arne; Denton, Erika R E; Zwiggelaar, Reyer

    2015-12-01

    During mammographic image acquisition, a compression paddle is used to even the breast thickness in order to obtain optimal image quality. Clinical observation has indicated that some mammograms may exhibit abrupt intensity change and low visibility of tissue structures in the breast peripheral areas. Such appearance discrepancies can affect image interpretation and may not be desirable for computer aided mammography, leading to incorrect diagnosis and/or detection which can have a negative impact on sensitivity and specificity of screening mammography. This paper describes a novel mammographic image pre-processing method to improve image quality for analysis. An image selection process is incorporated to better target problematic images. The processed images show improved mammographic appearances not only in the breast periphery but also across the mammograms. Mammographic segmentation and risk/density classification were performed to facilitate a quantitative and qualitative evaluation. When using the processed images, the results indicated more anatomically correct segmentation in tissue specific areas, and subsequently better classification accuracies were achieved. Visual assessments were conducted in a clinical environment to determine the quality of the processed images and the resultant segmentation. The developed method has shown promising results. It is expected to be useful in early breast cancer detection, risk-stratified screening, and aiding radiologists in the process of decision making prior to surgery and/or treatment.

  20. Adaptive preprocessing algorithms of corneal topography in polar coordinate system

    Institute of Scientific and Technical Information of China (English)

    郭雁文

    2014-01-01

    New adaptive preprocessing algorithms based on the polar coordinate system were put forward to obtain high-precision corneal topography calculation results. Adaptive algorithms for locating the concentric-circle center were created to accurately capture the circle center of the original Placido-based image, expand the image into a matrix centered around the circle center, and convert the matrix into the polar coordinate system with the circle center as the pole. Adaptive image smoothing followed, and the characteristics of useful circles were extracted via horizontal edge detection, since useful circles present as approximately horizontal lines while noise signals present as vertical lines or lines at other angles. Effective combinations of different morphological operators were designed to remedy data loss caused by noise disturbances and to obtain a complete circle-edge detection image satisfying the requirements of precise calculation of the follow-up parameters. The experimental data show that the algorithms meet the requirements of practical detection, with the characteristics of less data loss, higher data accuracy and easier availability.
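
    The expand-around-the-center step can be illustrated with a small numpy sketch that resamples a synthetic ring image into a (radius, angle) matrix, where each concentric circle becomes an approximately horizontal line. The ring image, the sampling resolution, and the nearest-neighbour interpolation are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Synthetic Placido-style image: concentric bright rings around a known center.
h = w = 101
cy = cx = 50
yy, xx = np.mgrid[0:h, 0:w]
r = np.hypot(yy - cy, xx - cx)
image = (np.cos(r * np.pi / 5) > 0.8).astype(float)  # bright ring every ~10 px

# Expand into polar coordinates with the ring center as the pole:
# rows = radius, columns = angle (nearest-neighbour sampling for brevity).
n_r, n_theta = 45, 180
radii = np.arange(n_r)
thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
ys = np.clip(np.round(cy + radii[:, None] * np.sin(thetas)).astype(int), 0, h - 1)
xs = np.clip(np.round(cx + radii[:, None] * np.cos(thetas)).astype(int), 0, w - 1)
polar = image[ys, xs]

# In polar form each ring becomes an approximately horizontal line, so its
# radius can be read off row-wise (a stand-in for horizontal edge detection).
ring_rows = np.where(polar.mean(axis=1) > 0.9)[0]
print("ring radii (px):", ring_rows)
```

    Noise, by contrast, would show up as short vertical or oblique streaks in the polar matrix, which is what makes the horizontal edge detection step in the paper effective.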

  1. Multimodal image fusion with SIMS: Preprocessing with image registration.

    Science.gov (United States)

    Tarolli, Jay Gage; Bloom, Anna; Winograd, Nicholas

    2016-06-14

    In order to utilize complementary imaging techniques to supply higher resolution data for fusion with secondary ion mass spectrometry (SIMS) chemical images, there are a number of aspects that, if not given proper consideration, could produce results which are easy to misinterpret. One of the most critical aspects is that the two input images must be of the same exact analysis area. With the desire to explore new higher resolution data sources that exist outside of the mass spectrometer, this requirement becomes even more important. To ensure that two input images are of the same region, an implementation of the Insight Segmentation and Registration Toolkit (ITK) was developed to act as a preprocessing step before performing image fusion. This implementation of ITK allows several degrees of movement between two input images to be accounted for, including translation, rotation, and scale transforms. First, the implementation was confirmed to accurately register two multimodal images by supplying a known transform. Once validated, two model systems, a copper mesh grid and a group of RAW 264.7 cells, were used to demonstrate the use of the ITK implementation to register a SIMS image with a microscopy image for the purpose of performing image fusion.

  2. Software for Preprocessing Data from Rocket-Engine Tests

    Science.gov (United States)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
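
    The conversion step EUGEN performs, applying per-channel calibration coefficients to raw voltage samples, reduces to something like the following sketch. The channel names and polynomial coefficients are invented for illustration; a real calibration file would supply them.

```python
import numpy as np

# Hypothetical per-channel calibration: engineering value = c0 + c1*V + c2*V^2.
# Coefficient sets like these would normally be read from a calibration file.
calibration = {
    "chamber_pressure": (14.7, 250.0, 0.0),   # psia from volts (made up)
    "fuel_temperature": (-40.0, 25.0, 0.5),   # deg F from volts (made up)
}

def to_engineering_units(channel, volts):
    """Apply the channel's polynomial calibration to raw voltage samples."""
    c0, c1, c2 = calibration[channel]
    v = np.asarray(volts, dtype=float)
    return c0 + c1 * v + c2 * v ** 2

raw = [0.0, 1.0, 2.5]
print(to_engineering_units("chamber_pressure", raw))
```

    A tool like QUICKLOOK would then apply the same conversion to only the selected channels before plotting, rather than waiting for the full post-test pass over every channel.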

  3. Nonlinear preprocessing method for detecting peaks from gas chromatograms

    Directory of Open Access Journals (Sweden)

    Min Hyeyoung

    2009-11-01

    Full Text Available Abstract Background The problem of locating valid peaks in data corrupted by noise arises frequently while analyzing experimental data. In various biological and chemical data analysis tasks, peak detection thus constitutes a critical preprocessing step that greatly affects downstream analysis and the eventual quality of experiments. Many existing techniques require the users to adjust parameters by trial and error, which is error-prone, time-consuming and often leads to incorrect analysis results. Worse, conventional approaches tend to report an excessive number of false alarms by finding fictitious peaks generated by mere noise. Results We have designed a novel peak detection method that can significantly reduce parameter sensitivity while providing excellent peak detection performance and negligible false alarm rates on gas chromatographic data. The key feature of our new algorithm is the successive use of peak enhancement algorithms that are deliberately designed for a gradual improvement of peak detection quality. We tested our approach with real gas chromatograms as well as intentionally contaminated spectra that contain Gaussian or speckle-type noise. Conclusion Our results demonstrate that the proposed method can achieve near-perfect peak detection performance while maintaining very small false alarm probabilities in the case of gas chromatograms. Given that biological signals appear in the form of peaks in various experimental data, and that the proposed method can easily be extended to such data, our approach will be a useful and robust tool that can help researchers highlight valid signals in their noisy measurements.
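
    The successive-enhancement idea can be sketched with numpy alone: repeated smoothing passes suppress noise before a simple local-maximum test, which makes the detection threshold far less sensitive. The synthetic chromatogram, window width and threshold below are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic chromatogram: three Gaussian peaks on a noisy baseline.
t = np.linspace(0, 10, 1000)
signal = sum(h * np.exp(-((t - c) ** 2) / (2 * 0.05 ** 2))
             for h, c in [(1.0, 2.0), (0.6, 5.0), (0.8, 8.0)])
noisy = signal + rng.normal(0, 0.05, t.size)

def smooth(x, width):
    # One moving-average enhancement pass; repeating it suppresses noise.
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def detect_peaks(x, threshold):
    # Indices i where x[i] exceeds both neighbours and the threshold.
    mid = x[1:-1]
    return np.where((mid > x[:-2]) & (mid > x[2:]) & (mid > threshold))[0] + 1

enhanced = smooth(smooth(noisy, 9), 9)   # two successive enhancement passes
peaks = detect_peaks(enhanced, threshold=0.3)
print("detected peak positions:", np.round(t[peaks], 1))
```

    Without the enhancement passes, the same local-maximum test fires on nearly every noise wiggle above the threshold; with them, only the three genuine peaks survive.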

  4. Visualisation and pre-processing of peptide microarray data.

    Science.gov (United States)

    Reilly, Marie; Valentini, Davide

    2009-01-01

    The data files produced by digitising peptide microarray images contain detailed information on the location, feature, response parameters and quality of each spot on each array. In this chapter, we will describe how such peptide microarray data can be read into the R statistical package and pre-processed in preparation for subsequent comparative or predictive analysis. We illustrate how the information in the data can be visualised using images and graphical displays that highlight the main features, enabling the quality of the data to be assessed and invalid data points to be identified and excluded. The log-ratio of the foreground to background signal is used as a response index. Negative control responses serve as a reference against which "detectable" responses can be defined, and slides incubated with only buffer and secondary antibody help identify false-positive responses from peptides. For peptides that have a detectable response on at least one subarray, and no false-positive response, we use linear mixed models to remove artefacts due to the arrays and their architecture. The resulting normalized responses provide the input data for further analysis.
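
    The response index and detection rule described above can be sketched numerically: compute the log-ratio of foreground to background and call a response detectable when it exceeds a cutoff derived from the negative-control distribution. All intensity values below are invented for illustration, and the mean + 3*SD cutoff is one common convention rather than the chapter's exact rule.

```python
import numpy as np

# Hypothetical spot intensities: foreground and background per spot.
foreground = np.array([500.0, 1200.0, 80.0, 3000.0])
background = np.array([100.0, 110.0, 90.0, 95.0])

# Response index: log-ratio of foreground to background signal.
response = np.log2(foreground / background)

# Negative-control responses define the detection reference: call a response
# "detectable" if it exceeds mean + 3*SD of the controls (assumed convention).
neg_controls = np.array([-0.1, 0.2, 0.05, -0.15, 0.1])
cutoff = neg_controls.mean() + 3 * neg_controls.std()
detectable = response > cutoff
print(np.round(response, 2), detectable)
```

    In the chapter's workflow, spots flagged as false positives on the buffer-plus-secondary-antibody slides would additionally be excluded before any peptide is declared responsive.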

  5. Data Preprocessing in Data Mining (数据挖掘中的数据预处理)

    Institute of Scientific and Technical Information of China (English)

    刘明吉; 王秀峰; 黄亚楼

    2000-01-01

    Data Mining (DM) is a hot new research topic in the database area. Because real-world data are not ideal, it is necessary to do some data preprocessing to meet the requirements of DM algorithms. In this paper, we discuss the procedure of data preprocessing and present the work of data preprocessing in detail. We also discuss the methods and technologies used in data preprocessing.
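
    A minimal example of the kind of preprocessing discussed, cleaning missing values and transforming attributes to a common scale, can be given in a few lines. Mean imputation and min-max scaling are assumed here as the chosen methods; the paper surveys these among other techniques.

```python
import numpy as np

# Toy raw table with missing values (NaN) and mixed attribute scales.
raw = np.array([
    [25.0, 50_000.0],
    [np.nan, 64_000.0],
    [40.0, np.nan],
    [33.0, 58_000.0],
])

# Step 1: cleaning -- impute missing entries with the column mean.
col_mean = np.nanmean(raw, axis=0)
cleaned = np.where(np.isnan(raw), col_mean, raw)

# Step 2: transformation -- min-max scale each column to [0, 1] so that
# large-scale attributes do not dominate distance-based DM algorithms.
lo, hi = cleaned.min(axis=0), cleaned.max(axis=0)
scaled = (cleaned - lo) / (hi - lo)
print(np.round(scaled, 2))
```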

  6. MSG Power Subsystem Flight Return Experience

    Science.gov (United States)

    Giacometti, G.; Canard, JP.; Perron, O.

    2011-10-01

    The Meteosat programme has been running for more than twenty years under ESA leadership. Meteosat Second Generation (MSG) is a series of 4 geostationary satellites developed and procured by the European Space Agency (ESA) on behalf of the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT). Eumetsat is still operating two of the first-generation satellite models, MOP3 and MTP1, which are pointed towards the Indian Ocean. The European meteorological service is now enhanced by two spacecraft of the Second Generation (MSG-1 and MSG-2), launched by Ariane 5 in August 2002 and December 2005 respectively. Thales Alenia Space, Prime Contractor of the program, has developed the MSG spacecraft based on a spin-axis stabilized technology. The Electrical Power Subsystem was subcontracted to Astrium GmbH. The Solar Array is composed of 8 body-mounted panels based on a Carbon Fibre Reinforced Panel substrate. The solar network utilizes 7854 Silicon High Eta cells delivering a beginning-of-life power of 740 W. The 28 volt mainbus is regulated using a series shunt regulating concept (S3R type). Two identical SAFT batteries, built from NiCd cells and offering a 29 Ah nameplate capacity, are connected to the mainbus through battery discharge and charge regulators. Both the Solar Array and the batteries have been designed to provide power and energy for a nominal 7-year lifetime. This equipment is continuously monitored and is still operating in excellent condition after more than eight and five years in orbit. This paper will present the major electrical design aspects of the power chain and will describe the performance of the main parameters, which are analysed during the in-orbit operations. Battery ageing is detailed using processed reconditioning telemetry, while solar array performance over lifetime is assessed using dedicated solar array telemetry.

  7. Automated Pre-processing for NMR Assignments with Reduced Tedium

    Energy Technology Data Exchange (ETDEWEB)

    2004-05-11

    An important rate-limiting step in the resonance assignment process is accurate identification of resonance peaks in NMR spectra. NMR spectra are noisy. Hence, automatic peak-picking programs must navigate between the Scylla of reliable but incomplete picking and the Charybdis of noisy but complete picking. Each of these extremes complicates the assignment process: incomplete peak-picking results in the loss of essential connectivities, while noisy picking conceals the true connectivities under a combinatorial explosion of false positives. Intermediate processing can simplify the assignment process by preferentially removing false peaks from noisy peak lists. This is accomplished by requiring consensus between multiple NMR experiments, exploiting a priori information about NMR spectra, and drawing on empirical statistical distributions of chemical shifts extracted from the BioMagResBank. Experienced NMR practitioners currently apply many of these techniques "by hand", which is tedious, and may appear arbitrary to the novice. To increase efficiency, we have created a systematic and automated approach to this process, known as APART. Automated pre-processing has three main advantages: reduced tedium, standardization, and pedagogy. In the hands of experienced spectroscopists, the main advantage is reduced tedium (a rapid increase in the ratio of true peaks to false peaks with minimal effort). When a project is passed from hand to hand, the main advantage is standardization. APART automatically documents the peak filtering process by archiving its original recommendations, the accompanying justifications, and whether a user accepted or overrode a given filtering recommendation. In the hands of a novice, this tool can reduce the stumbling block of learning to differentiate between real peaks and noise, by providing real-time examples of how such decisions are made.
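
    The consensus requirement described above can be sketched as a simple filter: a candidate peak survives only if a matching peak (within a chemical-shift tolerance) appears in enough of the experiments' peak lists. This is an illustrative sketch, not the APART implementation; the peak values, tolerance, and vote threshold are invented for the example.

    ```python
    # Hypothetical consensus filter over 1D chemical-shift peak lists (ppm).
    # A peak from the first list is kept only if at least `min_votes` lists
    # (including the first) contain a peak within `tol` ppm of it.
    def consensus_filter(peak_lists, tol=0.05, min_votes=2):
        kept = []
        for peak in peak_lists[0]:
            votes = sum(
                any(abs(peak - other) <= tol for other in lst)
                for lst in peak_lists
            )
            if votes >= min_votes:
                kept.append(peak)
        return kept

    noisy = [8.21, 8.90, 3.33, 7.45]   # candidate peaks from a noisy pick
    exp2 = [8.23, 7.44, 1.10]          # second experiment
    exp3 = [8.20, 7.46, 5.55]          # third experiment
    print(consensus_filter([noisy, exp2, exp3]))   # only corroborated peaks
    ```

    Here 8.90 and 3.33 are rejected because no other experiment corroborates them, which is the "preferential removal of false peaks" the abstract describes.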

  8. Spatial-spectral preprocessing for endmember extraction on GPUs

    Science.gov (United States)

    Jimenez, Luis I.; Plaza, Javier; Plaza, Antonio; Li, Jun

    2016-10-01

    Spectral unmixing is focused on the identification of spectrally pure signatures, called endmembers, and their corresponding abundances in each pixel of a hyperspectral image. While mainly focused on the spectral information contained in hyperspectral images, endmember extraction techniques have recently incorporated spatial information to achieve more accurate results. Several algorithms have been developed for automatic or semi-automatic identification of endmembers using spatial and spectral information, including spatial-spectral endmember extraction (SSEE), in which, within a preprocessing step, both sources of information are extracted from the hyperspectral image and used equally for this purpose. Previous works have implemented the SSEE technique in four main steps: 1) calculation of local eigenvectors in each sub-region into which the original hyperspectral image is divided; 2) computation of the maximum and minimum projections of all eigenvectors over the entire hyperspectral image in order to obtain a set of candidate pixels; 3) expansion and averaging of the signatures of the candidate set; 4) ranking based on the spectral angle distance (SAD). The result of this method is a list of candidate signatures from which the endmembers can be extracted using various spectral-based techniques, such as orthogonal subspace projection (OSP), vertex component analysis (VCA) or N-FINDR. Considering the large volume of data and the complexity of the calculations, there is a need for efficient implementations. Latest-generation hardware accelerators such as commodity graphics processing units (GPUs) offer a good chance of improving computational performance in this context. In this paper, we develop two different implementations of the SSEE algorithm using GPUs. Both are based on the eigenvector computation within each sub-region of the first step, one using singular value decomposition (SVD) and another using principal component analysis (PCA). Based ...
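
    The first two SSEE steps listed above (local eigenvectors per sub-region, then global max/min projections to collect candidate pixels) can be sketched with NumPy on the CPU. This is an illustrative sketch under assumed shapes and parameters (8x8x20 random cube, 4x4 sub-regions, top-3 eigenvectors), not the paper's GPU implementation.

    ```python
    # Sketch of SSEE steps 1-2: per-sub-region SVD eigenvectors, then
    # projection of ALL pixels onto each eigenvector; pixels attaining the
    # extreme projections become endmember candidates.
    import numpy as np

    rng = np.random.default_rng(0)
    H, W, B = 8, 8, 20                   # height, width, spectral bands
    img = rng.random((H, W, B))
    pixels = img.reshape(-1, B)          # (H*W, B) spectra

    candidates = set()
    for r0 in range(0, H, 4):            # step 1: 4x4 spatial sub-regions
        for c0 in range(0, W, 4):
            block = img[r0:r0 + 4, c0:c0 + 4].reshape(-1, B)
            block = block - block.mean(axis=0)
            # right singular vectors = local spectral eigenvectors
            _, _, vt = np.linalg.svd(block, full_matrices=False)
            for v in vt[:3]:             # top-3 eigenvectors per region
                proj = pixels @ v        # step 2: project the whole image
                candidates.add(int(proj.argmax()))
                candidates.add(int(proj.argmin()))

    print(len(candidates), "candidate pixels out of", H * W)
    ```

    The PCA variant differs only in how the local eigenvectors are obtained (eigendecomposition of the sub-region covariance instead of SVD of the centered data), which is why the paper can compare the two on the same pipeline.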

  9. The LEIR LLRF DSP-Carrier Board : Performance, CPS Renovation Plan and Recommendations

    CERN Document Server

    Angoletta, M E

    2007-01-01

    The LEIR LLRF project started in late 2003 and included designing, manufacturing and commissioning a novel, all-digital beam control system. The project's first aim was to provide the LEIR machine with a beam control system satisfying the many performance requirements; this was achieved in 2006 with the successful commissioning of the LEIR LLRF system. In addition, the project was to act as a pilot to export the same technology to the other machines of the PS Complex (CPS), such as the PS, PSB and AD. New machines currently being proposed (e.g. ELENA) will also rely on it. The evaluation of the LEIR experience and the recommendations on how to best pursue this migration strategy are therefore integral parts of the LEIR LLRF project. A fundamental building block of the LEIR LLRF system is the DSP-carrier board where all beam control loops are implemented. This note examines the main features of the DSP-carrier board release 1.0 used in LEIR and evaluates their impact on the LEIR LLRF implementation and operational performance. ...

  10. Development of a preprototype TIMES wastewater recovery subsystem

    Science.gov (United States)

    Roebelen, G. J., Jr.; Dehner, G. F.

    1982-01-01

    A three-man preprototype wastewater recovery subsystem, using a hollow fiber membrane evaporator with a thermoelectric heat pump to provide efficient potable water recovery from wastewater on extended-duration space flights, was designed, fabricated, and tested at one gravity. Low power, compactness and gravity-insensitive operation are featured in this vacuum distillation subsystem. The tubular hollow fiber elements provide positive liquid/gas phase control with no moving parts and provide structural integrity, improving on previous flat-sheet membrane designs. A thermoelectric heat pump provides latent energy recovery. Application and integration of these key elements solved problems inherent in all previous reclamation subsystem designs.

  11. Space Shuttle Orbiter leading edge structural subsystem thermal performance

    Science.gov (United States)

    Curry, D. M.; Cunningham, J. A.; Frahm, J. R.

    1982-01-01

    An extensive qualification test program and the STS-1 flight of the Space Shuttle Orbiter have provided the data necessary to verify the performance of the Orbiter thermal protection system. The reinforced carbon-carbon (RCC) leading edge structural subsystem is used on areas of the Orbiter where temperatures exceed 2300 F. The subsystem consists of the RCC nose cap and wing leading edge panels, metallic attachments, internal insulation, and interface tiles. Thermal response data from the qualification tests and the STS-1 flight, postflight inspection, and analytical predictions support the conclusion that the thermal performance of the subsystem verified the design.

  12. A Prototype of the Read-out Subsystem of the BESⅢ DAQ Based on PowerPC

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This article describes the prototype of the read-out subsystem of the BESⅢ data acquisition system. According to the BESⅢ design requirements, the event rate will be about 4000 Hz and the data rate up to 50 Mbytes/sec after the Level 1 trigger. The read-out subsystem consists of several read-out crates and a read-out computer, whose functions are to initialize the hardware, collect the event data from the front-end electronics after the Level 1 trigger, and transfer data fragments to the online computer through two levels of computer pre-processing and high-speed network transmission. In this model, the crate-level read-out implementation is based on the commercial single-board computer MVME5100 running the VxWorks operating system. The article outlines the structure of the crate-level testing platform of hardware and software. It puts emphasis on the framework of the read-out test model, the data processing flow, and the test method at crate level. In particular, it enumerates the key technologies in the design process and analyses the test results. In addition, results summarizing the performance of the single-board computer in terms of data transfer are presented.

  13. A Prototype of the Read-out Subsystem of the BESIII DAQ Based on PowerPC

    Science.gov (United States)

    Tao, Ning; Chu, Yuanping; Jin, Ge; Zhao, Jingwei

    2005-10-01

    This article describes the prototype of the read-out subsystem of the BESIII data acquisition system. According to the BESIII design requirements, the event rate will be about 4000 Hz and the data rate up to 50 Mbytes/sec after the Level 1 trigger. The read-out subsystem consists of several read-out crates and a read-out computer, whose functions are to initialize the hardware, collect the event data from the front-end electronics after the Level 1 trigger, and transfer data fragments to the online computer through two levels of computer pre-processing and high-speed network transmission. In this model, the crate-level read-out implementation is based on the commercial single-board computer MVME5100 running the VxWorks operating system. The article outlines the structure of the crate-level testing platform of hardware and software. It puts emphasis on the framework of the read-out test model, the data processing flow, and the test method at crate level. In particular, it enumerates the key technologies in the design process and analyses the test results. In addition, results summarizing the performance of the single-board computer in terms of data transfer are presented.
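
    The two rate figures quoted above fix the average event size the read-out crates must sustain; a quick back-of-the-envelope check (assuming decimal megabytes, an assumption since the abstract does not say):

    ```python
    # 4000 Hz event rate feeding a 50 Mbytes/sec post-Level-1 stream implies
    # an average event size of 50e6 / 4000 bytes.
    event_rate_hz = 4000
    data_rate_bytes_per_s = 50e6      # 50 Mbytes/sec, decimal MB assumed
    avg_event_bytes = data_rate_bytes_per_s / event_rate_hz
    print(avg_event_bytes)            # 12500.0 bytes, i.e. ~12.5 kB/event
    ```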

  14. The Construction of the CPS Concept -- Based on PISA2015 CPS Studies

    Institute of Scientific and Technical Information of China (English)

    柏毅; 林娉婷

    2016-01-01

    CPS is a new academic concept in the field of science education at home and abroad, which integrates traditional problem-solving competency with collaboration skills and computer technology. Based on the PISA 2015 Draft Collaborative Problem Solving Framework (OECD, 2013) and a review of recent theoretical studies, this essay systematically evaluates the PISA 2015 CPS concept. In order to illustrate the comprehensive concept of CPS, the essay studies the following issues: the importance of collaborative problem solving; defining the domain; collaborative problem solving processes and the factors affecting CPS; and assessing collaborative problem solving competency.

  15. Research on an Improved CPS-SPWM Realization Method Based on TMS320F28335

    Institute of Scientific and Technical Information of China (English)

    赵玺; 王笑非

    2011-01-01

    This paper presents an improved implementation of carrier phase-shifted SPWM (CPS-SPWM) for cascaded H-bridge converters in which the number of carriers is reduced to half that of the conventional approach, lowering system cost and improving reliability. The modulation method is realized on a TMS320F28335 for five-level and nine-level converters. The experimental results show that CPS-SPWM is easily implemented on the TMS320F28335, the converter works well with good output waveform quality, and the control system design is simplified, so the method has good application prospects.
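
    The underlying CPS-SPWM idea can be sketched numerically: each of the N H-bridge cells compares the same sine reference against a triangular carrier shifted by 1/N of the carrier period, and the cell outputs sum into a multilevel waveform. The sketch below is illustrative (two cells, bipolar switching, invented frequencies), not the paper's F28335 implementation or its carrier-reduction trick.

    ```python
    # Illustrative CPS-SPWM for a 2-cell cascaded H-bridge: phase-shifted
    # triangular carriers vs. a common sine reference; each cell outputs +1
    # or -1 and the sum is the multilevel output.
    import math

    def triangle(phase):                 # unit triangle wave in [-1, 1]
        p = phase % 1.0
        return 4 * p - 1 if p < 0.5 else 3 - 4 * p

    def cps_spwm_level(t, n_cells=2, f_ref=50.0, f_carrier=1000.0, m=0.8):
        ref = m * math.sin(2 * math.pi * f_ref * t)
        level = 0
        for k in range(n_cells):
            carrier = triangle(f_carrier * t + k / n_cells)  # shift by 1/N
            level += 1 if ref >= carrier else -1             # H-bridge cell
        return level                                         # multilevel sum

    samples = [cps_spwm_level(i / 20000.0) for i in range(400)]
    print(sorted(set(samples)))          # distinct output levels
    ```

    With two carriers shifted by half a period the summed output takes the levels -2, 0, +2; adding cells (and using unipolar switching per cell) yields the five- and nine-level waveforms the paper tests.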

  16. Triple3 Redundant Spacecraft Subsystems (T3RSS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — T3RSS is the system engineer's tool that allows a systematic approach to ensuring that even if one or more failures occur in a single component or subsystem, then...

  17. Review of Decoherence Free Subspaces, Noiseless Subsystems, and Dynamical Decoupling

    CERN Document Server

    Lidar, Daniel A

    2012-01-01

    Quantum information requires protection from the adverse effects of decoherence and noise. This review provides an introduction to the theory of decoherence-free subspaces, noiseless subsystems, and dynamical decoupling. It addresses quantum information preservation as well as protected computation.

  18. Double Shell Tank (DST) Monitor and Control Subsystem Specification

    Energy Technology Data Exchange (ETDEWEB)

    BAFUS, R.R.

    2000-11-03

    This specification revises the performance requirements and provides references to the requisite codes and standards to be applied during design of the Double-Shell Tank (DST) Monitor and Control Subsystem that supports the first phase of Waste Feed Delivery.

  19. Development of Pattern Recognition Options for Combining Safeguards Subsystems

    Energy Technology Data Exchange (ETDEWEB)

    Burr, Thomas L. [Los Alamos National Laboratory; Hamada, Michael S. [Los Alamos National Laboratory

    2012-08-24

    This talk reviews project progress in combining process monitoring data and nuclear material accounting data to improve the overall nuclear safeguards system. The focus is on two subsystems: (1) nuclear materials accounting (NMA); and (2) process monitoring (PM).

  20. Double Shell Tank (DST) Process Waste Sampling Subsystem Specification

    Energy Technology Data Exchange (ETDEWEB)

    RASMUSSEN, J.H.

    2000-05-03

    This specification establishes the performance requirements and provides references to the requisite codes and standards to be applied to the Double-Shell Tank (DST) Process Waste Sampling Subsystem which supports the first phase of Waste Feed Delivery.

  1. Statistical Design Model (SDM) of satellite thermal control subsystem

    Science.gov (United States)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Satellite thermal control is the subsystem whose main task is keeping the satellite components within their survival and operational temperature ranges. The capability of thermal control plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, due to the lack of information provided by companies and designers, this fundamental subsystem still lacks a specific design process. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is investigated. Next, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified with a case study: the comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results proved the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware
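
    The SDM workflow (database of spacecraft parameters, then a fitted mathematical model) can be illustrated with a least-squares fit. The data points below are invented for illustration; a real SDM would draw on a curated spacecraft database, as the abstract describes.

    ```python
    # Toy SDM: regress thermal-subsystem mass against total satellite mass
    # over a small (hypothetical) database, then use the fitted model to
    # estimate the thermal mass budget for a new design.
    import numpy as np

    sat_mass = np.array([150, 300, 500, 900, 1500])   # kg, hypothetical
    thermal_mass = np.array([6, 11, 19, 35, 57])      # kg, hypothetical

    slope, intercept = np.polyfit(sat_mass, thermal_mass, 1)
    estimate = slope * 700 + intercept                # prediction for a 700 kg sat
    print(round(slope, 4), round(intercept, 2), round(estimate, 1))
    ```

    The abstract's mission and lifetime inputs would enter as additional regressors; the point of the sketch is only the database-to-model step.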

  2. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Directory of Open Access Journals (Sweden)

    Nathan W Churchill

    Full Text Available BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.

  3. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Science.gov (United States)

    Churchill, Nathan W; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.
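
    The pipeline-optimization idea can be sketched schematically: enumerate candidate preprocessing pipelines, score each on data-driven metrics, and keep the best. The scoring function below is a made-up stand-in for the paper's prediction and reproducibility metrics, and the pipeline options are illustrative.

    ```python
    # Schematic pipeline search: each candidate is a tuple of preprocessing
    # choices; score() stands in for (task prediction + spatial
    # reproducibility) computed from resampled data.
    import itertools

    motion_correction = [False, True]
    smoothing_fwhm = [0, 4, 8]           # mm, illustrative options
    detrend_order = [0, 1, 2]

    def score(pipeline):
        mc, fwhm, order = pipeline
        pred = 0.6 + 0.1 * mc + 0.02 * order - 0.005 * fwhm   # stand-in metric
        repro = 0.5 + 0.03 * fwhm - 0.01 * order              # stand-in metric
        return pred + repro               # combine the two objectives

    pipelines = list(itertools.product(motion_correction, smoothing_fwhm,
                                       detrend_order))
    best = max(pipelines, key=score)
    print(best)
    ```

    The paper's point is precisely that this maximizer varies across subjects and tasks, which is why a fixed pipeline leaves performance on the table.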

  4. The Effects of Pre-processing Strategies for Pediatric Cochlear Implant Recipients

    Science.gov (United States)

    Rakszawski, Bernadette; Wright, Rose; Cadieux, Jamie H.; Davidson, Lisa S.; Brenner, Christine

    2016-01-01

    Background Cochlear implants (CIs) have been shown to improve children’s speech recognition over traditional amplification when severe to profound sensorineural hearing loss is present. Despite improvements, understanding speech at low-level intensities or in the presence of background noise remains difficult. In an effort to improve speech understanding in challenging environments, Cochlear Ltd. offers pre-processing strategies that apply various algorithms prior to mapping the signal to the internal array. Two of these strategies include Autosensitivity Control™ (ASC) and Adaptive Dynamic Range Optimization (ADRO®). Based on previous research, the manufacturer’s default pre-processing strategy for pediatrics’ everyday programs combines ASC+ADRO®. Purpose The purpose of this study is to compare pediatric speech perception performance across various pre-processing strategies while applying a specific programming protocol utilizing increased threshold (T) levels to ensure access to very low-level sounds. Research Design This was a prospective, cross-sectional, observational study. Participants completed speech perception tasks in four pre-processing conditions: no pre-processing, ADRO®, ASC, ASC+ADRO®. Study Sample Eleven pediatric Cochlear Ltd. cochlear implant users were recruited: six bilateral, one unilateral, and four bimodal. Intervention Four programs, with the participants’ everyday map, were loaded into the processor with different pre-processing strategies applied in each of the four positions: no pre-processing, ADRO®, ASC, and ASC+ADRO®. Data Collection and Analysis Participants repeated CNC words presented at 50 and 70 dB SPL in quiet and HINT sentences presented adaptively with competing R-Space noise at 60 and 70 dB SPL. Each measure was completed as participants listened with each of the four pre-processing strategies listed above. Test order and condition were randomized. A repeated-measures analysis of variance (ANOVA) was used to

  5. Real-Time Signal Processing Data Acquisition Subsystem

    Science.gov (United States)

    Sarafinas, George A.; Stein, Alan J.; Bisson, Kenneth J.

    A digital signal processing sub-system has been developed for a coherent carbon dioxide laser radar system at Lincoln Laboratory's Firepond Research Facility. This high-resolution radar is capable of operating with a variety of waveforms; hence, the signal processing requirements of the sub-system vary from one application to the next and demand a high degree of flexibility. The primary function of the Data Acquisition sub-system is to provide range-Doppler images in real time. Based on this objective, the sub-system must be able to route large amounts of digitized data at high rates between specialized processors performing the functions of data acquisition, digital signal processing, archiving, and image processing. A distributed processing design approach was used, and the hardware design was configured using off-the-shelf commercially available products. The sub-system uses a high-speed 24 MB/sec central bus and an associated processor acting as the hub of the system. Attached to the bus are a large RAM memory buffer and individual processors which interface to specialized peripherals, performing the tasks of digitizing, vector processing, imaging, and archiving. The software for the complete Data Acquisition and Signal Processing sub-system was developed on a Digital Equipment MicroVAX II™ computer. Software developed for the completed system is coded mostly in a high-level language to promote flexibility and modularity and to reduce development time; some microcode had to be used where speed is essential. All software design, development, and testing was done under VMS™.

  6. Biochemical Characterization of CPS-1, a Subclass B3 Metallo-β-Lactamase from a Chryseobacterium piscium Soil Isolate

    DEFF Research Database (Denmark)

    Gudeta, Dereje Dadi; Pollini, Simona; Docquier, Jean-Denis;

    2016-01-01

    CPS-1 is a subclass B3 metallo-β-lactamase from a Chryseobacterium piscium isolate from soil, showing 68 % amino acid identity to the GOB-1 enzyme. CPS-1 was overproduced in Escherichia coli Rosetta (DE3), purified by chromatography and biochemically characterized. The enzyme exhibits a broad-spectrum substrate profile including penicillins, cephalosporins and carbapenems, which overall resembles those of L1, GOB-1 and the acquired subclass B3 enzymes AIM-1 and SMB-1.

  7. Gamma Ray Array Detector Trigger Sub-System

    CERN Document Server

    Zhong-Wei, Du; Yi, Qian; KongJie

    2012-01-01

    The Gamma Ray Array Detector (GRAD) is one of the External Target Facility (ETF) subsystems at the Heavy Ion Research Facility in Lanzhou. The trigger subsystem of the GRAD has been developed based on Field Programmable Gate Arrays (FPGAs) and a PXI interface. The GRAD trigger subsystem makes prompt L1 trigger decisions to select valid events. These decisions are made by processing the hit signals from the 1024 CsI scintillators of the GRAD. According to the physical requirements, the GRAD trigger subsystem generates 12-bit trigger signals that are passed to the ETF global trigger system. In addition, the GRAD trigger subsystem generates trigger data that are packed and transmitted to the host computer via the PXI bus for off-line analysis. The trigger processing is implemented in the front-end electronics and one FPGA of the trigger module. The logic for PXI transmission and reconfiguration is implemented in the other FPGA of the trigger module. The reliable and efficient performance in the Gamma-ray experiments demonstrates th...

  8. Safe Operation of HIFI Local Oscillator Subsystem on Herschel Mission

    Science.gov (United States)

    Michalska, Malgorzata; Juchnikowski, Grzegorz; Klein, Thomas; Leinz, Christian; Nowosielski, Witold; Orleanski, Piotr; Ward, John

    The HIFI Local Oscillator Subsystem is part of the Heterodyne Instrument for Far Infrared (HIFI), dedicated to astronomical observations and mounted on the ESA satellite HERSCHEL. The Subsystem provides the local oscillator signal (480-1910 GHz) to each of the fourteen HIFI input mixers. As part of the LO, the Local Oscillator Control Unit (LCU) provides the main interface between the Local Oscillator Subsystem and the HIFI/Herschel power and telemetry buses. The unit supplies power to the Local Oscillator, decodes the HIFI macro-commands, programs and monitors the parameters of the Ka-band synthesizer and THz multiplier chains, and controls the operation of the whole Local Oscillator Subsystem. The unique microwave components used in the HF multipliers are extremely sensitive to proper biasing (polarity, voltage, current, presence of HF power). The ESA strategy for this mission requires fully safe operation of the instrument. This requirement is covered by a complex protection system implemented inside the LCU. In this paper, we present a general overview of the protection system for the microwave components. The different levels of protection (hardware realization and software procedures) are described, as well as various reliability aspects. The functionality of the LO subsystem controlled by the LCU was tested in 2007. Now the flight model of the HIFI instrument is integrated with the satellite and will be launched on the Herschel mission in July 2008.

  9. Double Shell Tank (DST) Monitor and Control Subsystem Specification

    Energy Technology Data Exchange (ETDEWEB)

    BAFUS, R.R.

    2000-04-27

    This specification establishes the interface and performance requirements and provides references to the requisite codes and standards to be applied during design of the Double-Shell Tank (DST) Monitor and Control Subsystem that supports the first phase of Waste Feed Delivery. The DST Monitor and Control Subsystem consists of the new and existing equipment that will be used to provide tank farm operators with integrated local monitoring and control of the DST systems to support Waste Feed Delivery (WFD). New equipment will provide automatic control and safety interlocks where required and will give operators visibility into the status of DST subsystem operations (e.g., DST mixer pump operation and DST waste transfers) and the ability to manually control specified DST functions as necessary. This specification is intended to be the basis for new projects/installations (W-521, etc.). This specification is not intended to retroactively affect previously established project design criteria without specific direction by the program.

  10. The Main Subsystems Involved in Defining the Quality Management System in a Hospital

    Directory of Open Access Journals (Sweden)

    Dobrea Valentina Alina

    2010-06-01

    Full Text Available The hospital is the most important organization in the health field, so hospitals have to improve quality in all the activities they deploy. A very suitable way to show a hospital's concern for the quality of health services is a quality management system certificate according to ISO 9001:2000. To understand the architecture of the hospital quality management system, it is necessary to decompose this system into subsystems and analyze each separately: the managerial subsystem, the human subsystem, the social subsystem, the technical subsystem, and the informative subsystem. The relationship between those subsystems leads to the continuous improvement of quality in health services.

  11. Examination of Speed Contribution of Parallelization for Several Fingerprint Pre-Processing Algorithms

    Directory of Open Access Journals (Sweden)

    GORGUNOGLU, S.

    2014-05-01

    Full Text Available In minutiae-based fingerprint systems, fingerprints need to be pre-processed. The pre-processing is carried out to enhance the quality of the fingerprint and to obtain more accurate minutiae points. Reducing the pre-processing time is important for identification and verification in real-time systems, and especially for databases holding large amounts of fingerprint information. Parallel processing and parallel CPU computing can be considered as the distribution of processes over a multi-core processor, done by using parallel programming techniques. Reducing the execution time is the main objective of parallel processing. In this study, the pre-processing stage of a minutiae-based fingerprint system is implemented by parallel processing on multi-core computers using OpenMP, and on a graphics processor using CUDA, to improve execution time. The execution times and speedup ratios are compared with those of a single-core processor. The results show that by using parallel processing, execution time is substantially improved. The improvement ratios obtained for different pre-processing algorithms allowed us to make suggestions on the more suitable approaches for parallelization.
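
    The speedup ratios discussed above are bounded by Amdahl's law: if a fraction p of the pre-processing is parallelizable over n cores, the ideal speedup is 1 / ((1 - p) + p / n). The p = 0.9 value below is illustrative, not taken from the paper's measurements.

    ```python
    # Ideal speedup bound for a workload with parallelizable fraction p
    # spread over n cores (Amdahl's law).
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    for cores in (2, 4, 8):
        print(cores, "cores:", round(amdahl_speedup(0.9, cores), 2))
    ```

    Even with 90% of the work parallelized, 8 cores give well under a 5x speedup, which is why measured improvement ratios vary strongly with how much of each pre-processing algorithm can actually be parallelized.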

  12. Controlling chaos in a satellite power supply subsystem

    Science.gov (United States)

    Macau, E. E. N.; Ramos Turci, L. F.; Yoneyama, T.

    2008-12-01

    In this work, we show that chaos control techniques can be used to increase the region that can be efficiently used to supply the power requests of an artificial satellite. The core of a satellite power subsystem relies on its DC/DC converter. This is a very nonlinear system that exhibits a multitude of phenomena, including bifurcations, quasi-periodicity, chaos, and coexistence of attractors. Traditional power subsystem design techniques try to avoid these nonlinear phenomena so that it is possible to use linear system theory in small regions about the equilibrium points. Here, we show that chaos control can be used to efficiently extend the applicability region of the satellite power subsystem when it operates in regions of high nonlinearity.
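
    The flavor of such chaos control can be sketched on the logistic map, a standard stand-in for chaotic converter dynamics (the actual DC/DC converter model is far richer): small perturbations of a parameter, applied only near the unstable fixed point, stabilize the orbit on it.

    ```python
    # OGY-style control of the logistic map x' = r x (1 - x): the gain is
    # chosen so the parameter perturbation dr cancels the linear growth of
    # deviations from the unstable fixed point x* = 1 - 1/r.
    r = 3.9
    xstar = 1 - 1 / r                    # unstable fixed point
    gain = -r * (1 - 2 * xstar) / (xstar * (1 - xstar))

    x = xstar + 0.01                     # start inside the control window
    for _ in range(10):
        dr = gain * (x - xstar) if abs(x - xstar) < 0.02 else 0.0
        dr = max(-0.2, min(0.2, dr))     # keep perturbations small
        x = (r + dr) * x * (1 - x)

    print(abs(x - xstar) < 1e-6)         # deviation is quenched
    ```

    Left uncontrolled, the same map at r = 3.9 wanders chaotically; the point of the technique, as in the satellite power subsystem, is that tiny admissible perturbations suffice to keep operation in an otherwise unusable region.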

  13. On the description of subsystems in relativistic hypersurface Bohmian mechanics.

    Science.gov (United States)

    Dürr, Detlef; Lienert, Matthias

    2014-09-08

    A candidate for a realistic relativistic quantum theory is the hypersurface Bohm-Dirac model. Its formulation uses a foliation of space-time into space-like hypersurfaces. In order to apply the theory and to make contact with the usual quantum formalism, one needs a framework for the description of subsystems. The presence of spin together with the foliation renders the subsystem description more complicated than in the non-relativistic case with spin. In this paper, we provide such a framework in terms of an appropriate conditional density matrix and an effective wave function as well as clarify their relation, thereby generalizing previous subsystem descriptions in the non-relativistic case.

  14. Double Shell Tank (DST) Transfer Piping Subsystem Specification

    Energy Technology Data Exchange (ETDEWEB)

    GRAVES, C.E.

    2000-03-22

    This specification establishes the performance requirements and provides references to the requisite codes and standards to be applied during design of the Double-Shell Tank (DST) Transfer Piping Subsystem that supports the first phase of Waste Feed Delivery. This subsystem transfers waste between transfer-associated structures (pits) and to the River Protection Project (RPP) Privatization Contractor Facility, where it will be processed into an immobilized waste form. This specification is intended to be the basis for new projects/installations (W-521, etc.). This specification is not intended to retroactively affect previously established project design criteria without specific direction by the program.

  15. The complete Heyting algebra of subsystems and contextuality

    Energy Technology Data Exchange (ETDEWEB)

    Vourdas, A. [Department of Computing, University of Bradford, Bradford BD7 1DP (United Kingdom)

    2013-08-15

    The finite set of subsystems of a finite quantum system with variables in Z(n) is studied as a Heyting algebra. The physical meaning of the logical connectives is discussed. It is shown that disjunction of subsystems is a more general concept than superposition. Consequently, the quantum probabilities related to commuting projectors in the subsystems are incompatible with associativity of the join in the Heyting algebra, unless the variables belong to the same chain. This leads to contextuality, which in the present formalism has as contexts the chains in the Heyting algebra. Logical Bell inequalities, which contain “Heyting factors,” are discussed. The formalism is also applied to the infinite set of all finite quantum systems, which is appropriately enlarged in order to become a complete Heyting algebra.

  16. Embedded Thermal Control for Subsystems for Next Generation Spacecraft Applications

    Science.gov (United States)

    Didion, Jeffrey R.

    2015-01-01

    Thermal Fluids and Analysis Workshop, Silver Spring MD NCTS 21070-15. NASA, the Defense Department and commercial interests are actively engaged in developing miniaturized spacecraft systems and scientific instruments to leverage smaller, cheaper spacecraft form factors such as CubeSats. This paper outlines research and development efforts among Goddard Space Flight Center personnel and its several partners to develop innovative embedded thermal control subsystems. The embedded thermal control subsystem is a cross-cutting enabling technology that integrates advanced manufacturing techniques to develop multifunctional intelligent structures that reduce the Size, Weight and Power (SWaP) consumption of both the thermal control subsystem and the overall spacecraft. Embedded thermal control subsystems permit heat acquisition and rejection at higher temperatures than state-of-the-art systems by employing both advanced heat transfer equipment (integrated heat exchangers) and high-heat-transfer phenomena. The Goddard Space Flight Center Thermal Engineering Branch has active investigations seeking to characterize advanced thermal control systems for near-term spacecraft missions. The embedded thermal control subsystem development effort consists of fundamental research as well as development of breadboard and prototype hardware and spaceflight validation efforts. This paper will outline relevant fundamental investigations of micro-scale heat transfer and electrically driven liquid film boiling. The hardware development efforts focus upon silicon-based high-heat-flux applications (electronic chips, power electronics, etc.) and multifunctional structures. Flight validation efforts include variable gravity campaigns and a proposed CubeSat-based flight demonstration of a breadboard embedded thermal control system. The CubeSat investigation is a technology demonstration that will characterize, in long-term low Earth orbit, a breadboard embedded thermal subsystem and its individual components to develop

  17. Interface Supports Lightweight Subsystem Routing for Flight Applications

    Science.gov (United States)

    Lux, James P.; Block, Gary L.; Ahmad, Mohammad; Whitaker, William D.; Dillon, James W.

    2010-01-01

    A wireless avionics interface exploits the constrained nature of data networks in flight systems to use a lightweight routing method. This simplified routing means that a processor is not required, and the logic can be implemented as an intellectual property (IP) core in a field-programmable gate array (FPGA). The FPGA can be shared with the flight subsystem application. In addition, the router is aware of redundant subsystems, and can be configured to provide hot-standby support as part of the interface. This simplifies implementation of flight applications requiring hot-standby support. When a valid inbound packet is received from the network, the destination node address is inspected to determine whether the packet is to be processed by this node. Each node has routing tables for the next neighbor node to guide the packet to the destination node. If it is to be processed, the final packet destination is inspected to determine whether the packet is to be forwarded to another node or routed locally. If the packet is local, it is sent to an Applications Data Interface (ADI), which is attached to a local flight application. Under this scheme, an interface can support many applications in a subsystem, allowing a high level of subsystem integration. If the packet is to be forwarded to another node, it is sent to the outbound packet router. The outbound packet router receives packets from an ADI or packets to be forwarded. It then uses a lookup table to determine the next destination for the packet. Upon detecting a remote subsystem failure, the routing table can be updated to autonomously bypass the failed subsystem.
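The routing decision described above can be sketched in a few lines; Python is used for illustration only (the actual logic is an FPGA IP core), and the node addresses, ADI ids and table layout here are hypothetical:

```python
def route(packet, node_addr, next_hop, local_adis):
    """Decide what to do with a valid inbound packet.

    packet: dict with 'dest' (final destination node) and 'adi'
    (application id used when the packet is delivered locally)."""
    if packet["dest"] == node_addr:
        # Local delivery: hand the packet to the attached flight application.
        return ("local", local_adis[packet["adi"]])
    # Otherwise forward toward the destination via the lookup table.
    return ("forward", next_hop[packet["dest"]])

# From node "A": packets for "B" or "C" both leave via neighbor "B".
next_hop = {"B": "B", "C": "B"}
local_adis = {0: "telemetry-app", 1: "command-app"}

print(route({"dest": "A", "adi": 1}, "A", next_hop, local_adis))  # ('local', 'command-app')
print(route({"dest": "C", "adi": 0}, "A", next_hop, local_adis))  # ('forward', 'B')
```

Bypassing a failed subsystem, as in the last sentence of the abstract, amounts to rewriting entries of the `next_hop` table.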

  18. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault-tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug-and-play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  19. Subsystem Strategy Setting for Iran Local Market: Benchmarking Approach

    Directory of Open Access Journals (Sweden)

    Seyed Mohammad FATEMINEJAD

    2012-12-01

    Full Text Available In this study, we conducted comparative research on how selected countries regulate their local markets, in order to identify solutions for a Market Regulation Subsystem. We initially identified the system of market regulation and its strategies. Based on strategic business success measures in regulating the local market, we selected six countries: the United States, Turkey, Malaysia, South Korea, China and India, and identified their missions and strategic planning for regulating their local markets. We presented the status of the selected countries, compared them to Iran, and proposed solutions to develop the local market regulation subsystem of Iran.

  20. Study on preprocessing of surface defect images of cold steel strip

    Directory of Open Access Journals (Sweden)

    Xiaoye GE

    2016-06-01

    Full Text Available Image preprocessing is an important part of the field of digital image processing, and it is also a prerequisite for image-based detection of cold steel strip surface defects. Factors including the complicated on-site environment and distortion in the optical system cause image degradation, which directly affects the feature extraction and classification of the images. Aiming at these problems, a method combining an adaptive median filter and a homomorphic filter is proposed to preprocess the images. The adaptive median filter is effective for image denoising, and the Gaussian homomorphic filter can steadily remove the nonuniform illumination of images. Finally, the original and preprocessed images and their features are analyzed and compared. The results show that this method can improve the image quality effectively.
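A sketch of such a two-stage pipeline follows; a plain 3×3 median filter stands in for the paper's adaptive median filter, and the Gaussian high-emphasis transfer function and its parameters are illustrative choices, not the paper's:

```python
import numpy as np
from scipy.signal import medfilt2d

def homomorphic(img, sigma=10.0, gamma_l=0.5, gamma_h=1.5):
    """Gaussian homomorphic filter: operate on log-intensities so that
    slowly varying illumination (low frequencies) is attenuated while
    reflectance detail (high frequencies) is emphasized."""
    log_img = np.log1p(img.astype(float))
    F = np.fft.fftshift(np.fft.fft2(log_img))
    rows, cols = img.shape
    y, x = np.ogrid[:rows, :cols]
    d2 = (y - rows / 2.0) ** 2 + (x - cols / 2.0) ** 2
    # High-emphasis transfer function: gamma_l at DC, gamma_h far from it.
    H = gamma_l + (gamma_h - gamma_l) * (1.0 - np.exp(-d2 / (2.0 * sigma ** 2)))
    out = np.fft.ifft2(np.fft.ifftshift(H * F)).real
    return np.expm1(out)

def preprocess(img):
    denoised = medfilt2d(img.astype(float), kernel_size=3)  # impulse-noise removal
    return homomorphic(denoised)                            # illumination correction
```

The ordering matters: median filtering first keeps impulse noise from being spread across the spectrum by the FFT-based illumination correction.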

  1. Modeling and simulation of a 100 kWe HT-PEMFC subsystem integrated with an absorption chiller subsystem

    DEFF Research Database (Denmark)

    Arsalis, Alexandros

    2012-01-01

    A 100 kWe liquid-cooled HT-PEMFC subsystem is integrated with an absorption chiller subsystem to provide electricity and cooling. The system is designed, modeled and simulated to investigate the potential of this technology for future novel energy system applications. Liquid-cooling can provide...... better temperature control and is preferable for middle-scale transport applications, such as commercial vessels, because stack cooling can be achieved within smaller volumes. A commercial ship requiring cooling and electricity is taken as the case study for the application of the proposed system. All...... electrical power output of 100 kWe. The heat exhausted to the absorption chiller subsystem is 107 kW and can satisfy a cooling duty of up to 128 or 64.5 kW for a LiBr-water double-effect system or a water-NH3 single-effect system, respectively. Finally, the projected total cost is comparable to conventional...

  2. Optimization of Preprocessing and Densification of Sorghum Stover at Full-scale Operation

    Energy Technology Data Exchange (ETDEWEB)

    Neal A. Yancey; Jaya Shankar Tumuluru; Craig C. Conner; Christopher T. Wright

    2011-08-01

    Transportation costs can be a prohibitive step in bringing biomass to a preprocessing location or biofuel refinery. One alternative to transporting biomass in baled or loose format to a preprocessing location is to utilize a mobile preprocessing system that can be relocated to the various locations where biomass is stored, preprocess and densify the biomass, then ship it to the refinery as needed. The Idaho National Laboratory (INL) has a full-scale Process Demonstration Unit (PDU), which includes a stage-1 grinder, hammer mill, drier, pellet mill, and cooler with the associated conveyance system components. Testing at bench and pilot scale has been conducted to determine the effects of moisture on preprocessing, and of crop varieties on preprocessing efficiency and product quality. The INL's PDU provides an opportunity to test the conclusions made at bench and pilot scale on full industrial-scale systems. Each component of the PDU is operated from a central operating station where data are collected to determine power consumption rates for each step in the process. The power for each electrical motor in the system is monitored from the control station to watch for problems and determine optimal conditions for system performance. The data can then be reviewed to observe how changes in biomass input parameters (moisture and crop type, for example), mechanical changes (screen size, biomass drying, pellet size, grinding speed, etc.), or other variations affect the power consumption of the system. Sorghum in four-foot round bales was tested in the system using a series of six different screen sizes: 3/16 in., 1 in., 2 in., 3 in., 4 in., and 6 in. The effects on power consumption, product quality, and production rate were measured to determine optimal conditions.

  3. Boosting model performance and interpretation by entangling preprocessing selection and variable selection.

    Science.gov (United States)

    Gerretzen, Jan; Szymańska, Ewa; Bart, Jacob; Davies, Antony N; van Manen, Henk-Jan; van den Heuvel, Edwin R; Jansen, Jeroen J; Buydens, Lutgarde M C

    2016-09-28

    The aim of data preprocessing is to remove data artifacts (such as a baseline, scatter effects or noise) and to enhance the contextually relevant information. Many preprocessing methods exist to deliver one or more of these benefits, but which method or combination of methods should be used for the specific data being analyzed is difficult to select. Recently, we have shown that a preprocessing selection approach based on Design of Experiments (DoE) enables correct selection of highly appropriate preprocessing strategies within reasonable time frames. In that approach, the focus was solely on improving the predictive performance of the chemometric model. This is, however, only one of the two relevant criteria in modeling: interpretation of the model results can be just as important. Variable selection is often used to achieve such interpretation. Data artifacts, however, may hamper proper variable selection by masking the truly relevant variables. The choice of preprocessing therefore has a huge impact on the outcome of variable selection methods and may thus hamper an objective interpretation of the final model. To enhance such objective interpretation, we here integrate variable selection into the DoE-based preprocessing selection approach. We show that the entanglement of preprocessing selection and variable selection improves not only the interpretation but also the predictive performance of the model. This is achieved by analyzing several experimental data sets for which the true relevant variables are available as prior knowledge. We show that a selection of variables is provided that complies more with the true informative variables compared to individual optimization of both model aspects. Importantly, the approach presented in this work is generic. Different types of models (e.g. PCR, PLS, …) can be incorporated into it, as well as different variable selection methods and different preprocessing methods, according to the taste and experience of

  4. Genetic Algorithm for Optimization: Preprocessing with n Dimensional Bisection and Error Estimation

    Science.gov (United States)

    Sen, S. K.; Shaykhian, Gholam Ali

    2006-01-01

    A knowledge of the appropriate values of the parameters of a genetic algorithm (GA) such as the population size, the shrunk search space containing the solution, crossover and mutation probabilities is not available a priori for a general optimization problem. Recommended here is a polynomial-time preprocessing scheme that includes an n-dimensional bisection and that determines the foregoing parameters before deciding upon an appropriate GA for all problems of similar nature and type. Such a preprocessing is not only fast but also enables us to get the global optimal solution and its reasonably narrow error bounds with a high degree of confidence.
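The abstract does not spell out the bisection scheme itself; as one hedged illustration of the general idea, a coordinate-wise bisection can shrink an axis-aligned box around the optimum before a GA is launched inside it (the objective function and box below are toy examples, not the paper's method):

```python
def shrink_box(f, lo, hi, rounds=15):
    """Halve the box along each dimension per round, keeping the half
    whose probe point scores better (minimization). For well-behaved
    objectives (e.g. separable convex) the minimizer stays inside."""
    lo, hi = list(lo), list(hi)
    for _ in range(rounds):
        for d in range(len(lo)):
            center = [(a + b) / 2.0 for a, b in zip(lo, hi)]
            mid = (lo[d] + hi[d]) / 2.0
            left, right = center[:], center[:]
            left[d] = (lo[d] + mid) / 2.0    # probe in the lower half
            right[d] = (mid + hi[d]) / 2.0   # probe in the upper half
            if f(left) <= f(right):
                hi[d] = mid                  # keep lower half along d
            else:
                lo[d] = mid                  # keep upper half along d
    return lo, hi

# Toy objective with known minimum at (0.3, 0.7); the GA would then be
# run only inside the shrunk box [lo, hi].
f = lambda p: (p[0] - 0.3) ** 2 + (p[1] - 0.7) ** 2
lo, hi = shrink_box(f, [0.0, 0.0], [1.0, 1.0])
```

Each round halves the box width per dimension, so the residual box width after `rounds` iterations gives exactly the kind of narrow error bound on the solution that the abstract mentions.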

  5. Performance of Pre-processing Schemes with Imperfect Channel State Information

    DEFF Research Database (Denmark)

    Christensen, Søren Skovgaard; Kyritsi, Persa; De Carvalho, Elisabeth

    2006-01-01

    Pre-processing techniques have several benefits when the CSI is perfect. In this work we investigate three linear pre-processing filters, assuming imperfect CSI caused by noise degradation and channel temporal variation. Results indicate, that the LMMSE filter achieves the lowest BER and the high...... and the highest SINR when the CSI is perfect, whereas the simple matched filter may be a good choice when the CSI is imperfect. Additionally the results give insight into the inherent trade-off between robustness against CSI imperfections and spatial focusing ability....
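For intuition on the matched-filter versus LMMSE trade-off, here is a small numpy sketch comparing receive filters in a toy one-interferer model; the channel model, dimensions and noise level are illustrative, not the paper's setup:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4                                               # antennas
h1 = rng.normal(size=N) + 1j * rng.normal(size=N)   # desired user's channel
h2 = rng.normal(size=N) + 1j * rng.normal(size=N)   # interferer's channel
sigma2 = 0.1                                        # noise power

def sinr(w, h1, h2, sigma2):
    """Post-filter SINR for y = h1*s1 + h2*s2 + n, estimating s1 as w^H y."""
    signal = abs(np.vdot(w, h1)) ** 2
    interf = abs(np.vdot(w, h2)) ** 2
    noise = sigma2 * np.vdot(w, w).real
    return signal / (interf + noise)

w_mf = h1                                           # matched filter: maximizes signal energy
R = np.outer(h1, h1.conj()) + np.outer(h2, h2.conj()) + sigma2 * np.eye(N)
w_lmmse = np.linalg.solve(R, h1)                    # LMMSE filter: balances interference and noise
```

With perfect CSI the LMMSE filter attains at least the matched filter's SINR; replacing h1 by a noisy estimate when building the filters narrows that gap, which is the robustness trade-off the abstract refers to.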

  6. ACTS (Advanced Communications Technology Satellite) Propagation Experiment: Preprocessing Software User's Manual

    Science.gov (United States)

    Crane, Robert K.; Wang, Xuhe; Westenhaver, David

    1996-01-01

    The preprocessing software manual describes the Actspp program, originally developed to observe and diagnose Advanced Communications Technology Satellite (ACTS) propagation terminal/receiver problems. However, it has been quite useful for automating the preprocessing functions needed to convert the terminal output to useful attenuation estimates. Prior to having data acceptable for archival functions, the individual receiver system must be calibrated and the power level shifts caused by ranging tone modulation must be removed. Actspp provides three output files: the daylog, the diurnal coefficient file, and the file that contains calibration information.

  7. Data acquisition, preprocessing and analysis for the Virginia Tech OLYMPUS experiment

    Science.gov (United States)

    Remaklus, P. Will

    1991-01-01

    Virginia Tech is conducting a slant path propagation experiment using the 12, 20, and 30 GHz OLYMPUS beacons. Beacon signal measurements are made using separate terminals for each frequency. In addition, short baseline diversity measurements are collected through a mobile 20 GHz terminal. Data collection is performed with a custom data acquisition and control system. Raw data are preprocessed to remove equipment biases and discontinuities prior to analysis. Preprocessed data are then statistically analyzed to investigate parameters such as frequency scaling, fade slope and duration, and scintillation intensity.

  8. Preprocessing of Tandem Mass Spectrometric Data Based on Decision Tree Classification

    Institute of Scientific and Technical Information of China (English)

    Jing-Fen Zhang; Si-Min He; Jin-Jin Cai; Xing-Jun Cao; Rui-Xiang Sun; Yan Fu; Rong Zeng; Wen Gao

    2005-01-01

    In this study, we present a preprocessing method for quadrupole time-of-flight (Q-TOF) tandem mass spectra to increase the accuracy of database searching for peptide (protein) identification. Based on the natural isotopic information inherent in tandem mass spectra, we construct a decision tree after feature selection to classify the noise and ion peaks in tandem spectra. Furthermore, we recognize overlapping peaks to find the monoisotopic masses of ions for the following identification process. The experimental results show that this preprocessing method increases the search speed and the reliability of peptide identification.
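The abstract's classifier can be mimicked at toy scale with a decision tree over per-peak features; the two features below (relative intensity and an isotope-pattern score) are hypothetical stand-ins for the paper's feature set, and the data are synthetic:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 200
# Synthetic peaks: ion peaks tend to have a strong isotope-pattern score
# (neighboring peaks near the expected ~1.003 Da spacing); noise peaks do not.
ion = np.column_stack([rng.uniform(0.3, 1.0, n),    # relative intensity
                       rng.uniform(0.6, 1.0, n)])   # isotope-pattern score
noise = np.column_stack([rng.uniform(0.0, 0.6, n),
                         rng.uniform(0.0, 0.4, n)])
X = np.vstack([ion, noise])
y = np.array([1] * n + [0] * n)                     # 1 = ion peak, 0 = noise peak

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
```

Discarding peaks the tree labels as noise before database search is what shrinks the spectra and speeds up the identification step.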

  9. Influence of Hemp Fibers Pre-processing on Low Density Polyethylene Matrix Composites Properties

    Science.gov (United States)

    Kukle, S.; Vidzickis, R.; Zelca, Z.; Belakova, D.; Kajaks, J.

    2016-04-01

    In the present research, LLDPE matrix composites reinforced with short hemp fibres, with fibre contents in the range from 30 to 50 wt% and subjected to four different pre-processing technologies, were produced, and properties such as tensile strength and elongation at break, tensile modulus, melt flow index, micro-hardness and water absorption dynamics were investigated. Capillary viscosimetry was used for fluidity evaluation, and the melt flow index (MFI) was evaluated for all variants. The MFI of fibres from two of the pre-processing variants was high enough to allow increasing the hemp fibre content from 30 to 50 wt% with only a moderate increase in water sorption capability.

  10. Treating the sibling subsystem: an adjunct of divorce therapy.

    Science.gov (United States)

    Schibuk, M

    1989-04-01

    Sibling therapy, frequently overlooked as a method of treatment, is particularly appropriate in situations that require a deliberate focus on the "unit of continuity," or the subsystem that remains intact during a process of family reorganization. For this and other reasons it can be an effective tool in treating children of divorce. A case illustrating this use of sibling therapy is presented.

  11. Cascade Distillation Subsystem Development: Progress Toward a Distillation Comparison Test

    Science.gov (United States)

    Callahan, M. R.; Lubman, A.; Pickering, Karen D.

    2009-01-01

    Recovery of potable water from wastewater is essential for the success of long-duration manned missions to the Moon and Mars. Honeywell International and a team from NASA Johnson Space Center (JSC) are developing a wastewater processing subsystem that is based on centrifugal vacuum distillation. The wastewater processor, referred to as the Cascade Distillation Subsystem (CDS), utilizes an innovative and efficient multistage thermodynamic process to produce purified water. The rotary centrifugal design of the system also provides gas/liquid phase separation and liquid transport under microgravity conditions. A five-stage subsystem unit has been designed, built, delivered and integrated into the NASA JSC Advanced Water Recovery Systems Development Facility for performance testing. A major test objective of the project is to demonstrate the advancement of the CDS technology from the breadboard level to a subsystem-level unit. An initial round of CDS performance testing was completed in fiscal year (FY) 2008. Based on FY08 testing, the system is now in development to support an Exploration Life Support (ELS) Project distillation comparison test expected to begin in early 2009. As part of the project objectives planned for FY09, the system will be reconfigured to support the ELS comparison test. The CDS will then be challenged with a series of human-generated waste streams representative of those anticipated for a lunar outpost. This paper provides a description of the CDS technology, a status of the current project activities, and data on the system's performance to date.

  12. Mark 4A DSN receiver-exciter and transmitter subsystems

    Science.gov (United States)

    Wick, M. R.

    1986-01-01

    The present configuration of the Mark 4A DSN Receiver-Exciter and Transmitter Subsystems is described. Functional requirements and key characteristics are given to show the differences in the capabilities required by the Networks Consolidation task for combined High Earth Orbiter and Deep Space Network tracking support.

  13. The OCLC Serials Sub-System: A First Evaluation.

    Science.gov (United States)

    Edgar, Neal L.; And Others

    This examination of the OCLC serials control sub-system points to positive and negative aspects of the OCLC system as they relate to serials, and evaluates the system's serials cataloging capabilities. While this report assumes a knowledge of the basic operations of OCLC, it describes the system in general, its function in cataloging, and its…

  14. Image Processing In Laser-Beam-Steering Subsystem

    Science.gov (United States)

    Lesh, James R.; Ansari, Homayoon; Chen, Chien-Chung; Russell, Donald W.

    1996-01-01

    Conceptual design of image-processing circuitry developed for proposed tracking apparatus described in "Beam-Steering Subsystem For Laser Communication" (NPO-19069). In proposed system, desired frame rate achieved by "windowed" readout scheme in which only pixels containing and surrounding two spots read out and others skipped without being read. Image data processed rapidly and efficiently to achieve high frequency response.

  15. Interconnection of subsystems in closed-loop systems

    DEFF Research Database (Denmark)

    Niemann, Hans Henrik; Poulsen, Niels Kjølstad

    2009-01-01

    The focus in this paper is analysis of stability and controller design for interconnected systems. This includes both the case with known and unknown interconnected sub-system. The key element in both the stability analysis and controller design is the application of the Youla-Jabr-Bongiorno-Kuce...

  16. The Design of a High-Integrity Disk Management Subsystem

    NARCIS (Netherlands)

    Oey, M.A.

    2005-01-01

    This dissertation describes and experimentally evaluates the design of the Logical Disk, a disk management subsystem that guarantees the integrity of data stored on disk even after system failures, while still providing performance competitive to other storage systems. Current storage systems that

  17. High-energy coordination polymers (CPs) exhibiting good catalytic effect on the thermal decomposition of ammonium dinitramide

    Science.gov (United States)

    Li, Xin; Han, Jing; Zhang, Sheng; Zhai, Lianjie; Wang, Bozhou; Yang, Qi; Wei, Qing; Xie, Gang; Chen, Sanping; Gao, Shengli

    2017-09-01

    High-energy coordination polymers (CPs) not only exhibit good energetic performance but also have a good catalytic effect on the thermal decomposition of energetic materials. In this contribution, two high-energy CPs, Cu2(DNBT)2(CH3OH)(H2O)3·3H2O (1) and [Cu3(DDT)2(H2O)2]n (2) (H2DNBT = 3,3‧-dinitro-5,5‧-bis(1H-1,2,4-triazole) and H3DDT = 4,5-bis(1H-tetrazol-5-yl)-2H-1,2,3-triazole), were synthesized and structurally characterized. Furthermore, 1 was thermally dehydrated to produce Cu2(DNBT)2(CH3OH)(H2O)3 (1a). The thermal decomposition kinetics of 1, 1a and 2 were studied by Kissinger's method and Ozawa's method. Thermal analyses and sensitivity tests show that all compounds exhibit high thermal stability and low sensitivity to external stimuli. Meanwhile, all compounds have large positive enthalpies of formation, calculated as (1067.67 ± 2.62) kJ mol-1 (1), (1464.12 ± 3.12) kJ mol-1 (1a) and (3877.82 ± 2.75) kJ mol-1 (2), respectively. The catalytic effects of 1a and 2 on the thermal decomposition of ammonium dinitramide (ADN) were also investigated.
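Kissinger's method, named in the abstract, extracts an activation energy from the shift of the decomposition peak temperature Tp with heating rate β, via the slope of ln(β/Tp²) against 1/Tp. A sketch with synthetic data (the temperatures and Ea below are illustrative, not the paper's values):

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def kissinger_ea(betas, peak_temps):
    """Kissinger plot: ln(beta/Tp^2) = const - Ea/(R*Tp), so the slope
    of ln(beta/Tp^2) versus 1/Tp equals -Ea/R."""
    x = 1.0 / np.asarray(peak_temps, dtype=float)
    y = np.log(np.asarray(betas, dtype=float)
               / np.asarray(peak_temps, dtype=float) ** 2)
    slope = np.polyfit(x, y, 1)[0]
    return -slope * R

# Synthetic peak temperatures generated from an assumed Ea = 150 kJ/mol;
# the heating rates are in arbitrary but mutually consistent units.
Ea = 150e3
Tp = np.array([480.0, 490.0, 500.0, 510.0])     # K
betas = Tp ** 2 * np.exp(20.0 - Ea / (R * Tp))

print(kissinger_ea(betas, Tp) / 1e3)  # ≈ 150 kJ/mol, recovering the assumed Ea
```

Ozawa's method differs only in the ordinate (ln β instead of ln(β/Tp²)) and a correction factor on the slope.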

  18. The Classroom Performance System (CPS): Effects on student participation, attendance, and achievement in multicultural anatomy and physiology classes at South Texas College

    Science.gov (United States)

    Termos, Mohamad Hani

    2011-12-01

    The Classroom Performance System (CPS) is an instructional technology tool that increases student performance and addresses different learning styles. Instructional technologies are used to promote active learning; however, student embarrassment issue in a multicultural setting is not addressed. This study assessed the effect of the CPS on student participation, attendance, and achievement in multicultural college-level anatomy and physiology classes at South Texas College, where the first spoken language is not English. Quantitative method and quasi-experimental design were employed and comparative statistic methods and pre-post tests were used to collect the data. Participants were college students and sections of study were selected by convenient sampling. Participation was 100% during most of the lectures held and participation rate did not strike above 68% in control group. Attendance was significantly higher in CPS sections than the control group as shown by t-tests. Experimental sections had a higher increase in the pre-post test scores and student averages on lecture exams increased at a higher rate as compared to the control group. Therefore, the CPS increased student participation, attendance, and achievement in multicultural anatomy and physiology classes. The CPS can be studied in other settings where the first spoken language is English or in other programs, such as special education programs. Additionally, other variables can be studied and other methodologies can be employed.

  19. Dispersion of Short- and Medium-Chain Chlorinated Paraffins (CPs) from a CP Production Plant to the Surrounding Surface Soils and Coniferous Leaves.

    Science.gov (United States)

    Xu, Jiazhi; Gao, Yuan; Zhang, Haijun; Zhan, Faqiang; Chen, Jiping

    2016-12-06

    Chlorinated paraffin (CP) production is an important emission source of short- and medium-chain CPs (SCCPs and MCCPs) in the environment. In this study, 48 CP congener groups were measured in surface soils and coniferous leaves collected from the inner and surrounding environment of a CP production plant that has been in operation for more than 30 years, in order to investigate the dispersion and deposition behavior of SCCPs and MCCPs. The average concentrations of the sum of SCCPs and MCCPs in the in-plant coniferous leaves and surface soils were 4548.7 ng g(-1) dry weight (dw) and 3481.8 ng g(-1) dw, which were 2-fold and 10-fold higher than those in the surrounding environment, respectively. A Gaussian air pollution model explained the spatial distribution of CPs in the coniferous leaves, whereas the dispersion of CPs to the surrounding surface soils fit the Boltzmann equation well. A significant fractionation effect was observed for the atmospheric dispersion of CPs from the production plant. CP congener groups with higher octanol-air partitioning coefficients (KOA) were more predominant in the in-plant environment, whereas those with lower KOA values had elevated proportions in the surrounding environment. A radius of approximately 4 km around the CP production plant was influenced by the atmospheric dispersion and deposition of CPs.
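The soil-dispersion fit mentioned above can be sketched by fitting a Boltzmann sigmoid to concentration versus distance; the functional form is the standard Boltzmann equation, while the transect data below are synthetic, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def boltzmann(d, a1, a2, d0, dx):
    """Boltzmann sigmoid: concentration decays from a1 (near the plant)
    to the background level a2, with the transition centered at d0."""
    return a2 + (a1 - a2) / (1.0 + np.exp((d - d0) / dx))

distance = np.linspace(0.1, 8.0, 30)        # km from the plant (synthetic transect)
true_params = (3500.0, 300.0, 2.0, 0.8)     # ng/g dw levels and km scales (illustrative)
conc = boltzmann(distance, *true_params)

fit, _ = curve_fit(boltzmann, distance, conc, p0=(3000.0, 200.0, 3.0, 1.0))
```

The fitted d0 and dx parameters are what give a characteristic influence distance of the kind the abstract reports (approximately 4 km in the study).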

  20. A Real-Time Embedded System for Stereo Vision Preprocessing Using an FPGA

    DEFF Research Database (Denmark)

    Kjær-Nielsen, Anders; Jensen, Lars Baunegaard With; Sørensen, Anders Stengaard

    2008-01-01

    In this paper a low level vision processing node for use in existing IEEE 1394 camera setups is presented. The processing node is a small embedded system, that utilizes an FPGA to perform stereo vision preprocessing at rates limited by the bandwidth of IEEE 1394a (400Mbit). The system is used...

  1. Evaluation of Microarray Preprocessing Algorithms Based on Concordance with RT-PCR in Clinical Samples

    DEFF Research Database (Denmark)

    Hansen, Kasper Lage; Szallasi, Zoltan Imre; Eklund, Aron Charles

    2009-01-01

    evaluated consistency using the Pearson correlation between measurements obtained on the two platforms. Also, we introduce the log-ratio discrepancy as a more relevant measure of discordance between gene expression platforms. Of nine preprocessing algorithms tested, PLIER+16 produced expression values...

  2. Scene matching based on non-linear pre-processing on reference image and sensed image

    Institute of Scientific and Technical Information of China (English)

    Zhong Sheng; Zhang Tianxu; Sang Nong

    2005-01-01

    To solve the heterogeneous image scene matching problem, a non-linear pre-processing method applied to the original images before intensity-based correlation is proposed. The results show that the probability of a correct match is raised greatly. Especially for low S/N image pairs, the effect is more remarkable.
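    The abstract does not specify the non-linear pre-processing, so the sketch below stands in a generic one (gradient magnitude, which suppresses monotonic intensity differences between heterogeneous sensors) ahead of plain normalized cross-correlation. The `ncc_match` helper, the images, and the sensor-distortion model are all hypothetical.

```python
import numpy as np

def ncc_match(img, tmpl):
    """Exhaustive normalized cross-correlation; returns (row, col) of best match."""
    H, W = img.shape
    h, w = tmpl.shape
    t = (tmpl - tmpl.mean()) / tmpl.std()
    best, pos = -np.inf, (0, 0)
    for r in range(H - h + 1):
        for c in range(W - w + 1):
            win = img[r:r + h, c:c + w]
            s = win.std()
            if s == 0:
                continue
            score = np.mean((win - win.mean()) / s * t)
            if score > best:
                best, pos = score, (r, c)
    return pos

def nonlinear_pre(img):
    """Toy nonlinear pre-processing: gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

rng = np.random.default_rng(1)
ref = rng.random((40, 40))
sensed = ref ** 2.5 + rng.normal(0, 0.02, ref.shape)   # nonlinear sensor response
tmpl = nonlinear_pre(sensed)[10:18, 12:20]             # template cut at (10, 12)
print(ncc_match(nonlinear_pre(ref), tmpl))
```

    Without the pre-processing step, direct intensity correlation between `ref` and `sensed` is degraded by the nonlinear sensor response; after it, the match lands at the true offset.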

  3. A New Endmember Preprocessing Method for the Hyperspectral Unmixing of Imagery Containing Marine Oil Spills

    Directory of Open Access Journals (Sweden)

    Can Cui

    2017-09-01

    Methods that use hyperspectral remote sensing imagery to extract and monitor marine oil spills are currently quite popular. However, the automatic extraction of endmembers from hyperspectral imagery remains a challenge. This paper proposes a data field-spectral preprocessing (DSPP) algorithm for endmember extraction. The method first derives a set of extreme points from the data field of an image. At the same time, it identifies a set of spectrally pure points in the spectral space. Finally, the preprocessing algorithm fuses the data field with the spectral calculation to generate a new subset of endmember candidates for the subsequent endmember extraction. The preprocessing greatly shortens the processing time of the endmember extraction algorithms that follow and provides accurate endmember detection, including the detection of anomalous endmembers. It therefore offers greater accuracy and stronger noise resistance and is less time-consuming. Using both synthetic and real airborne hyperspectral images, we combined the proposed preprocessing algorithm with several endmember extraction algorithms and compared it with the existing endmember extraction preprocessing algorithms. The experimental results show that the proposed method can effectively extract marine oil spill data.

  4. affyPara-a Bioconductor Package for Parallelized Preprocessing Algorithms of Affymetrix Microarray Data.

    Science.gov (United States)

    Schmidberger, Markus; Vicedo, Esmeralda; Mansmann, Ulrich

    2009-07-22

    Microarray data repositories, as well as large clinical applications of gene expression, allow several hundred microarrays to be analysed at one time. The preprocessing of large numbers of microarrays is still a challenge: the algorithms are limited by the available computer hardware. For example, building classification or prognostic rules from large microarray sets is very time consuming. Here, preprocessing has to be part of the cross-validation and resampling strategy that is necessary to estimate the rule's prediction quality honestly. This paper proposes the new Bioconductor package affyPara for parallelized preprocessing of Affymetrix microarray data. The data can be partitioned across arrays, and parallelization of the algorithms is a straightforward consequence. The partition of data and its distribution to several nodes solves the main memory problems and accelerates preprocessing by up to a factor of 20 for 200 or more arrays. affyPara is a free and open-source package, under the GPL license, available from the Bioconductor project at www.bioconductor.org. A user guide and examples are provided with the package.
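    affyPara itself is an R package; as a language-neutral sketch of the partition-and-distribute idea, the following Python splits a batch of arrays into chunks and preprocesses the chunks concurrently. The log2-transform-and-median-center step is illustrative only, not affyPara's algorithm.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def preprocess(chunk):
    """Illustrative per-array preprocessing: log2 transform + median centering."""
    return [np.log2(a) - np.median(np.log2(a)) for a in chunk]

rng = np.random.default_rng(2)
arrays = [rng.uniform(1.0, 1000.0, 5000) for _ in range(40)]  # 40 fake "arrays"
chunks = [arrays[i::4] for i in range(4)]     # partition the arrays across 4 workers
with ThreadPoolExecutor(max_workers=4) as ex:
    results = [a for part in ex.map(preprocess, chunks) for a in part]
print(len(results))
```

    In affyPara the chunks go to separate cluster nodes rather than threads, which is what relieves the main-memory bottleneck the abstract describes.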

  5. Pre-processing filter design at transmitters for IBI mitigation in an OFDM system

    Institute of Scientific and Technical Information of China (English)

    Xia Wang; Lei Wang

    2013-01-01

    In order to meet the demands for high transmission rates and high service quality in broadband wireless communication systems, orthogonal frequency division multiplexing (OFDM) has been adopted in some standards. However, inter-block interference (IBI) and inter-carrier interference (ICI) in an OFDM system degrade its performance. To mitigate IBI and ICI, some pre-processing approaches based on full channel state information (CSI) have been proposed, which improve system performance. Here, a pre-processing filter based on partial CSI at the transmitter is designed and investigated. The filter coefficients are obtained by an optimization process, the symbol error rate (SER) is tested, and the computational complexity of the proposed scheme is analyzed. Computer simulation results show that the proposed pre-processing filter can effectively mitigate IBI and ICI and improve performance. Compared with pre-processing approaches at the transmitter based on full CSI, the proposed scheme has high spectral efficiency, limited CSI feedback, and low computational complexity.
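    The paper's partial-CSI filter design is not given in the abstract, so the sketch below instead shows the textbook mechanism that the IBI problem refers to: a cyclic prefix longer than the channel memory turns linear convolution into circular convolution, after which a one-tap equaliser per subcarrier recovers the symbols exactly. All parameters here are invented.

```python
import numpy as np

rng = np.random.default_rng(5)
N, L = 64, 8                                   # subcarriers, cyclic-prefix length
h = np.array([1.0, 0.45, 0.2])                 # multipath channel (memory <= L)

X = (rng.choice([-1, 1], N) + 1j * rng.choice([-1, 1], N)) / np.sqrt(2)  # QPSK
x = np.fft.ifft(X)                             # OFDM modulation
tx = np.concatenate([x[-L:], x])               # prepend cyclic prefix

y = np.convolve(tx, h)                         # channel: source of IBI between blocks
rx = y[L:L + N]                                # drop the prefix: IBI is absorbed by it
X_hat = np.fft.fft(rx) / np.fft.fft(h, N)      # one-tap equaliser per subcarrier

print(np.max(np.abs(X_hat - X)))
```

    The transmitter pre-filtering studied in the paper aims at the case where the prefix alone is insufficient or CSI is only partially known; this block only establishes the baseline it improves on.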

  6. Inter-Rater Reliability of Preprocessing EEG Data: Impact of Subjective Artifact Removal on Associative Memory Task ERP Results

    Directory of Open Access Journals (Sweden)

    Steven D. Shirk

    2017-06-01

    The processing of EEG data routinely involves subjective removal of artifacts during a preprocessing stage. Preprocessing inter-rater reliability (IRR), and how differences in preprocessing may affect the outcomes of primary event-related potential (ERP) analyses, have not been previously assessed. Three raters independently preprocessed EEG data from 16 cognitively healthy adult participants (ages 18–39 years) who performed a memory task. Using intraclass correlations (ICCs), IRR was assessed for Early-frontal, Late-frontal, and Parietal Old/new memory effect contrasts across eight regions of interest (ROIs). IRR was good to excellent for all ROIs; 22 of 26 ICCs were above 0.80. Raters were highly consistent in preprocessing across ROIs, although the frontal pole ROI (ICC range 0.60–0.90) showed less consistency. Old/new parietal effects had the highest ICCs with the lowest variability. Rater preprocessing differences did not alter the primary ERP results. IRR for EEG preprocessing was good to excellent, and subjective rater removal of EEG artifacts did not alter primary memory-task ERP results. The findings provide preliminary support for the robustness of cognitive/memory task-related ERP results against significant inter-rater preprocessing variability and suggest the reliability of EEG for assessing cognitive-neurophysiological processes when multiple preprocessors are involved.
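    Inter-rater reliability via intraclass correlation is computed from a subjects-by-raters matrix. The study does not state which ICC form it used, so this is a minimal sketch of one common variant, ICC(3,1) (two-way mixed, consistency, single rater), on invented ratings:

```python
import numpy as np

def icc_consistency(ratings):
    """ICC(3,1): two-way mixed, consistency, single rater.
    ratings: (n_subjects, k_raters) array."""
    n, k = ratings.shape
    m = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)
    ss_total = ((ratings - m) ** 2).sum()
    ss_rows = k * ((row_means - m) ** 2).sum()      # between-subject variation
    ss_cols = n * ((col_means - m) ** 2).sum()      # systematic rater offsets
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Three hypothetical raters scoring 16 participants: rater 2 adds a constant
# offset, rater 3 adds small noise -- consistency remains high.
rng = np.random.default_rng(3)
truth = rng.normal(0, 2, 16)
ratings = np.column_stack([truth, truth + 0.5, truth + rng.normal(0, 0.1, 16)])
print(round(float(icc_consistency(ratings)), 3))
```

    Note that a constant offset between raters does not lower the consistency ICC; an absolute-agreement variant such as ICC(2,1) would penalise it.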

  7. Predictive modeling of colorectal cancer using a dedicated pre-processing pipeline on routine electronic medical records

    NARCIS (Netherlands)

    Kop, Reinier; Hoogendoorn, Mark; Teije, Annette Ten; Büchner, Frederike L; Slottje, Pauline; Moons, Leon M G; Numans, Mattijs E

    2016-01-01

    Over the past years, research utilizing routine care data extracted from Electronic Medical Records (EMRs) has increased tremendously. Yet there are no straightforward, standardized strategies for pre-processing these data. We propose a dedicated medical pre-processing pipeline aimed at taking on

  8. Reproducible cancer biomarker discovery in SELDI-TOF MS using different pre-processing algorithms.

    Directory of Open Access Journals (Sweden)

    Jinfeng Zou

    BACKGROUND: There has been much interest in differentiating diseased and normal samples using biomarkers derived from mass spectrometry (MS) studies. However, biomarker identification for specific diseases has been hindered by irreproducibility. Specifically, the peak profile extracted from a dataset for biomarker identification depends on the data pre-processing algorithm, and until now no widely accepted agreement has been reached. RESULTS: In this paper, we investigated the consistency of biomarker identification using differentially expressed (DE) peaks from peak profiles produced by three widely used average-spectrum-dependent pre-processing algorithms, based on SELDI-TOF MS data for prostate and breast cancers. Our results revealed two important factors that affect the consistency of DE peak identification using different algorithms. One factor is that some DE peaks selected from one peak profile were not detected as peaks in the other profiles; the second is that the statistical power of identifying DE peaks in large peak profiles with many peaks may be low due to the large scale of the tests and the small number of samples. Furthermore, we demonstrated that the DE peak detection power in large profiles could be improved by the stratified false discovery rate (FDR) control approach, and that the reproducibility of DE peak detection could thereby be increased. CONCLUSIONS: Comparing and evaluating pre-processing algorithms in terms of reproducibility can elucidate the relationship among different algorithms and also help in selecting a pre-processing algorithm. The DE peaks selected from small peak profiles with few peaks for a dataset tend to be reproducibly detected in large peak profiles, which suggests that a suitable pre-processing algorithm should be able to produce enough peaks for identifying useful and reproducible biomarkers.
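    The stratified FDR idea can be sketched as Benjamini-Hochberg control applied separately within strata of peaks. The implementation, strata, and p-values below are illustrative, not the paper's.

```python
import numpy as np

def bh_reject(pvals, q=0.05):
    """Benjamini-Hochberg step-up: boolean mask of rejections at FDR level q."""
    p = np.asarray(pvals, dtype=float)
    order = np.argsort(p)
    thresh = q * np.arange(1, p.size + 1) / p.size
    passed = p[order] <= thresh
    k = int(np.max(np.nonzero(passed)[0])) + 1 if passed.any() else 0
    reject = np.zeros(p.size, dtype=bool)
    reject[order[:k]] = True
    return reject

def stratified_bh(pvals, strata, q=0.05):
    """Stratified FDR control: run BH separately within each stratum
    (e.g., high- vs low-intensity peaks), preserving power in small strata."""
    pvals = np.asarray(pvals, dtype=float)
    reject = np.zeros(pvals.size, dtype=bool)
    for s in np.unique(strata):
        idx = np.where(strata == s)[0]
        reject[idx] = bh_reject(pvals[idx], q)
    return reject

pvals = np.array([0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.2, 0.5, 0.9, 0.004])
strata = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])      # hypothetical two strata
print(bh_reject(pvals).sum(), stratified_bh(pvals, strata).sum())
```

    On this toy input the pooled BH procedure rejects 3 hypotheses while the stratified version rejects 6, mirroring the power gain the paper reports for large peak profiles.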

  9. Data preprocessing methods of FT-NIR spectral data for the classification cooking oil

    Science.gov (United States)

    Ruah, Mas Ezatul Nadia Mohd; Rasaruddin, Nor Fazila; Fong, Sim Siong; Jaafar, Mohd Zuli

    2014-12-01

    This recent work describes data pre-processing methods for FT-NIR spectroscopy datasets of cooking oil and its quality parameters using chemometric methods. Pre-processing of near-infrared (NIR) spectral data has become an integral part of chemometric modelling. Hence, this work investigates the utility and effectiveness of pre-processing algorithms, namely row scaling, column scaling, and single scaling with the Standard Normal Variate (SNV). The combinations of these scaling methods have an impact on exploratory analysis and on classification via Principal Component Analysis (PCA) plots. The samples were divided into palm oil and non-palm cooking oil. The classification model was built using FT-NIR cooking oil spectra in absorbance mode over the range 4000 cm-1 to 14000 cm-1. A Savitzky-Golay derivative was applied before developing the classification model. The data were then separated into a training set and a test set using the Duplex method, with the number of samples in each class kept equal to 2/3 of the class with the minimum number of samples. The t-statistic was employed as a variable selection method to determine which variables are significant for the classification models. The data pre-processing was evaluated using the modified silhouette width (mSW), PCA, and the percentage correctly classified (%CC). The results show that different pre-processing strategies lead to substantial differences in model performance; the effects of row scaling, column standardisation, and single scaling with SNV are indicated by mSW and %CC. With a two-PC model, all five classifiers gave a high %CC except Quadratic Distance Analysis.
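    Of the scalings named, SNV has a standard closed form: each spectrum is centered and scaled by its own mean and standard deviation, which removes additive offsets and multiplicative gain. A minimal sketch on synthetic spectra (the data are invented, not the paper's FT-NIR measurements):

```python
import numpy as np

def snv(spectra):
    """Standard Normal Variate: center and scale each spectrum (row) by its
    own mean and standard deviation, removing offset and scatter effects."""
    x = np.asarray(spectra, dtype=float)
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

# Two hypothetical NIR spectra of the same oil: one with an offset and gain
base = np.sin(np.linspace(0, 3, 100)) + 2.0
distorted = 1.7 * base + 0.4                  # gain + baseline shift
out = snv(np.vstack([base, distorted]))
print(bool(np.allclose(out[0], out[1])))
```

    After SNV the two spectra coincide, which is exactly why such row scaling changes the clusters seen in the PCA plots.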

  10. Value of Distributed Preprocessing of Biomass Feedstocks to a Bioenergy Industry

    Energy Technology Data Exchange (ETDEWEB)

    Christopher T Wright

    2006-07-01

    Biomass preprocessing is one of the primary operations in the feedstock assembly system and the front-end of a biorefinery. Its purpose is to chop, grind, or otherwise format the biomass into a suitable feedstock for conversion to ethanol and other bioproducts. Many variables, such as equipment cost and efficiency and feedstock moisture content, particle size, bulk density, compressibility, and flowability, affect the location and implementation of this unit operation. Previous conceptual designs place this operation at the front-end of the biorefinery. However, data are presented showing that distributed preprocessing at the field-side or in a fixed preprocessing facility can provide significant cost benefits by producing a higher-value feedstock with improved handling, transporting, and merchandising potential. In addition, data supporting the preferential deconstruction of feedstock materials due to their bio-composite structure identify the potential for significant improvements in equipment efficiency and compositional quality upgrades. These data were collected from full-scale low- and high-capacity hammermill grinders with various screen sizes. Multiple feedstock varieties with a range of moisture values were used in the preprocessing tests. The comparative values of the different grinding configurations, feedstock varieties, and moisture levels were assessed through post-grinding analysis of the different particle fractions, separated with a medium-scale forage particle separator and a Rototap separator. The results show that distributed preprocessing produces a material with bulk flowable properties and fractionation benefits that can ease transporting, handling, and conveying the material to the biorefinery and improve the biochemical and thermochemical conversion processes.

  11. A comprehensive analysis about the influence of low-level preprocessing techniques on mass spectrometry data for sample classification.

    Science.gov (United States)

    López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Fernández-Riverola, Florentino

    2014-01-01

    Matrix-Assisted Laser Desorption Ionisation Time-of-Flight (MALDI-TOF) is a high-throughput mass spectrometry technology that produces data requiring extensive preprocessing before subsequent analyses. In this context, several low-level preprocessing techniques have been successfully developed for different tasks, including baseline correction, smoothing, normalisation, peak detection, and peak alignment. In this work, we present a systematic comparison of different software packages supporting the compulsory preprocessing of MALDI-TOF data. In order to guarantee the validity of our study, we test multiple configurations of each preprocessing technique, which are subsequently used to train a set of classifiers whose performance (kappa and accuracy) provides accurate information for the final comparison. Results from the experiments show the real impact of preprocessing techniques on classification, evidencing that MassSpecWavelet provides the best performance and that Support Vector Machines (SVM) are among the most accurate classifiers.
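    The low-level steps listed (smoothing, baseline correction, normalisation, peak detection) can be chained on a synthetic spectrum. The specific filters and thresholds below are illustrative choices, not those of MassSpecWavelet or any package compared in the paper.

```python
import numpy as np
from scipy.ndimage import minimum_filter1d
from scipy.signal import savgol_filter, find_peaks

# Synthetic MALDI-like spectrum: three Gaussian peaks + decaying baseline + noise
rng = np.random.default_rng(4)
mz = np.linspace(1000, 1500, 2000)
true_peaks = [1100.0, 1250.0, 1400.0]
spectrum = sum(800 * np.exp(-0.5 * ((mz - p) / 1.5) ** 2) for p in true_peaks)
spectrum += 200 * np.exp(-(mz - 1000) / 500) + rng.normal(0, 5, mz.size)

smoothed = savgol_filter(spectrum, window_length=11, polyorder=3)  # smoothing
baseline = minimum_filter1d(smoothed, size=201)                    # crude baseline
corrected = smoothed - baseline                                    # baseline correction
normalized = corrected / corrected.sum()                           # TIC normalisation
peaks, _ = find_peaks(corrected, prominence=100)                   # peak detection
print(np.round(mz[peaks], 1))
```

    The comparison in the paper amounts to swapping each of these four stages for the implementations in different packages and measuring the downstream classifier performance.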

  12. CPS/SEG BEIJING 2004 INTERNATIONAL GEOPHYSICAL CONFERENCE AND EXPOSITION HELD IN BEIJING

    Institute of Scientific and Technical Information of China (English)

    张平平; 常旭

    2004-01-01

    From March 31 to April 3, 2004, the CPS/SEG Beijing 2004 International Geophysical Conference and Exposition was held at the Beijing International Convention Center. It was the largest international conference held in China by the exploration geophysics community since 1978. The conference was jointly organized by the Chinese Petroleum Society (CPS) and the Society of Exploration Geophysicists (SEG), with the theme of how geophysics can meet the increasing difficulty of oil and gas exploration and the growing global demand for energy, addressing the enormous challenges …

  13. Does adopting a prenatal substance use protocol reduce racial disparities in CPS reporting related to maternal drug use? A California case study.

    Science.gov (United States)

    Roberts, S C M; Zahnd, E; Sufrin, C; Armstrong, M A

    2015-02-01

    This study examined whether adopting a standardized prenatal substance use protocol (protocol) in a hospital labor and delivery unit reduced racial disparities in reporting to child protective services (CPS) related to maternal drug use during pregnancy. This study used an interrupted time series design with a non-equivalent control. One hospital adopted a protocol and another hospital group serving a similar geographic population did not change protocols. Data on CPS reporting disparities from these hospitals over 3.5 years were analyzed using segmented regression. In the hospital that adopted the protocol, almost five times more black than white newborns were reported during the study period. Adopting the protocol was not associated with reduced disparities. Adopting a protocol cannot be assumed to reduce CPS reporting disparities. Efforts to encourage hospitals to adopt protocols as a strategy to reduce disparities may be misguided. Other strategies to reduce disparities are needed.
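    The segmented-regression analysis of an interrupted time series can be sketched as ordinary least squares on level-change and slope-change terms around the adoption point. The series below is simulated with invented coefficients, not the study's CPS reporting data.

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(42)                      # e.g., a monthly disparity measure, 3.5 years
t0 = 21                                # hypothetical protocol-adoption month
post = (t >= t0).astype(float)

# Simulate: baseline level 5, slope 0.02, level drop 1.2 and slope change -0.05 at t0
y = 5 + 0.02 * t - 1.2 * post - 0.05 * post * (t - t0) + rng.normal(0, 0.1, t.size)

# Segmented-regression design: intercept, time, level change, post-adoption slope
Xd = np.column_stack([np.ones_like(t, dtype=float), t, post, post * (t - t0)])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print(np.round(beta, 2))
```

    A "no association" finding like the study's corresponds to the level-change and slope-change coefficients (beta[2], beta[3]) being statistically indistinguishable from zero.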

  14. Japan Meteorological Agency/Meteorological Research Institute-Coupled Prediction System version 1 (JMA/MRI-CPS1) for operational seasonal forecasting

    Science.gov (United States)

    Takaya, Yuhei; Yasuda, Tamaki; Fujii, Yosuke; Matsumoto, Satoshi; Soga, Taizo; Mori, Hirotoshi; Hirai, Masayuki; Ishikawa, Ichiro; Sato, Hitoshi; Shimpo, Akihiko; Kamachi, Masafumi; Ose, Tomoaki

    2017-01-01

    This paper describes the operational seasonal prediction system of the Japan Meteorological Agency (JMA), the Japan Meteorological Agency/Meteorological Research Institute-Coupled Prediction System version 1 (JMA/MRI-CPS1), which was in operation at JMA during the period between February 2010 and May 2015. The predictive skill of the system was assessed with a set of retrospective seasonal predictions (reforecasts) covering 30 years (1981-2010). JMA/MRI-CPS1 showed reasonable predictive skill for the El Niño-Southern Oscillation, comparable to the skills of other state-of-the-art systems. The one-tiered approach adopted in JMA/MRI-CPS1 improved its overall predictive skills for atmospheric predictions over those of the two-tiered approach of the previous uncoupled system. For 3-month predictions with a 1-month lead, JMA/MRI-CPS1 showed statistically significant skills in predicting 500-hPa geopotential height and 2-m temperature in East Asia in most seasons; thus, it is capable of providing skillful seasonal predictions for that region. Furthermore, JMA/MRI-CPS1 was superior overall to the previous system for atmospheric predictions with longer (4-month) lead times. In particular, JMA/MRI-CPS1 was much better able to predict the Asian Summer Monsoon than the previous two-tiered system. This enhanced performance was attributed to the system's ability to represent atmosphere-ocean coupled variability over the Indian Ocean and the western North Pacific from boreal winter to summer following winter El Niño events, which in turn influences the East Asian summer climate through the Pacific-Japan teleconnection pattern. These substantial improvements obtained by using an atmosphere-ocean coupled general circulation model underpin its success in providing more skillful seasonal forecasts on an operational basis.

  15. [The innovative dynamic of the mechanics, electronics and materials subsystem].

    Science.gov (United States)

    Maldonado, José; Gadelha, Carlos Augusto Grabois; Costa, Laís Silveira; Vargas, Marco

    2012-12-01

    The mechanics, electronics and materials subsystem, one of the subsystems of the health care productive complex, encompasses different activities, usually clustered in what is called the medical, hospital and dental equipment and materials industry. This is a strategic area for health care, since it represents a continuous source of changes in care practices, and influences the provision of health care services. It has, moreover, potential for promoting the progress of Brazil's system of innovation and for increasing the competitiveness of the industry as a whole, given that it articulates future technologies. Despite the significant growth of this industry in Brazil in recent years, such equipment and materials have been presenting a growing deficit in the balance of trade. This incompatibility between national health care needs and the productive and innovative basis of the industry points to structural fragilities in the system. Using the framework of political economy, the article aims to discuss the development of this industry in Brazil and its challenges.

  16. Categorial Subsystem Independence as Morphism Co-possibility

    Science.gov (United States)

    Gyenis, Zalán; Rédei, Miklós

    2017-08-01

    This paper formulates a notion of independence of subobjects of an object in a general (i.e., not necessarily concrete) category. Subobject independence is the categorial generalization of what is known as subsystem independence in the context of algebraic relativistic quantum field theory. The content of subobject independence formulated in this paper is morphism co-possibility: two subobjects of an object will be defined to be independent if any two morphisms on the two subobjects of an object are jointly implementable by a single morphism on the larger object. The paper investigates features of subobject independence in general, and subobject independence in the category of C*-algebras with respect to operations (completely positive unit preserving linear maps on C*-algebras) as morphisms is suggested as a natural subsystem independence axiom to express relativistic locality of the covariant functor in the categorial approach to quantum field theory.

  17. CONTROLLABILITY OF DELAY DEGENERATE CONTROL SYSTEMS WITH INDEPENDENT SUBSYSTEMS

    Institute of Scientific and Technical Information of China (English)

    蒋威

    2003-01-01

    The controllability of delay degenerate differential control systems is discussed. Firstly, the delay degenerate differential control system was transformed into canonical form and the connecting terms were removed, yielding delay degenerate differential control systems with independent subsystems. For general delay degenerate differential control systems, it was shown that a necessary and sufficient condition for controllability is that the reachable set equals the whole space. For delay degenerate differential control systems with independent subsystems, the necessary and sufficient conditions for controllability are that the reachable sets equal the corresponding subspaces. Some algebraic criteria were then obtained. Finally, an example is given to illustrate the main results.

  18. H∞ Optimal Model Reduction for Singular Fast Subsystems

    Institute of Scientific and Technical Information of China (English)

    WANG Jing; ZHANG Qing-Ling; LIU Wan-Quan; ZHOU Yue

    2005-01-01

    In this paper, H∞ optimal model reduction for singular fast subsystems is investigated. First, an error system is established to measure the error magnitude between the original and reduced systems, and it is demonstrated that the new feature of model reduction for singular systems is to make the H∞ norm of the error system finite and minimal. The necessary and sufficient condition for the existence of a solution to the H∞ suboptimal model reduction problem is derived. Next, we give an exact and practicable algorithm to obtain the parameters of the reduced subsystems by applying matrix theory; the reduced system may also be impulsive. The advantages of the proposed algorithm are that it is flexible and straightforward without much extra computation, and that the order of the reduced system is as small as possible. Finally, an illustrative example is given to show the effectiveness of the proposed model reduction approach.

  19. Measurement system as a subsystem of the quality management system

    Directory of Open Access Journals (Sweden)

    Ľubica Floreková

    2006-12-01

    Each measurement system and control principle must be based on certain facts about the system's behaviour (what), operation (how), and structure (why). Each system is divided into subsystems that provide the input for the next subsystem. For each system the starting point is important, that is, the system characteristics, the collection of data, its hierarchy, and the distribution of processes. A measurement system (based on chapter 8 of the standard ISO 9001:2000, Quality management systems - Requirements) defines the measurement, analysis, and improvement needed in each organization in order to demonstrate the conformity of products, to guarantee the conformity of the quality management system, and to continually improve the effectiveness, efficiency, and economy of the quality management system.

  20. The skeletal subsystem as an integrative physiology paradigm.

    Science.gov (United States)

    Weiss, Aaron J; Iqbal, Jameel; Zaidi, Neeha; Mechanick, Jeffrey I

    2010-12-01

    Homeostatic bone remodeling depends on precise regulation of osteoblast-osteoclast coupling through intricate endocrine, immune, neuronal, and mechanical factors. The osteoblast-osteoclast model of bone physiology with layers of regulatory complexity can be investigated as a component of a local skeletal subsystem or as a part of a complete whole-body system. In this review, we flip the traditional investigative paradigm of scientific experimentation ("bottom-top research") to a "top-bottom" approach using systems biology. We first establish the intricacies of the two-cell model at the molecular signaling level. We then provide, on a systems level, an integrative physiologic approach involving many recognized organ-level subsystems having direct and/or indirect effects on bone remodeling. Lastly, a hypothetical model of bone remodeling based on frequency and amplitude regulatory mechanisms is presented. It is hoped that by providing a thorough model of skeletal homeostasis, future progress can be made in researching and treating skeletal morbidities.

  1. Pyrotechnic Actuator for Retracting Tubes Between MSL Subsystems

    Science.gov (United States)

    Gallon, John C.; Webster, Richard G.; Patterson, Keith D.; Orzewalla, Matthew A.; Roberts, Eric T.; Tuszynski, Andrew J.

    2010-01-01

    An apparatus, denoted the "retractuator" (a contraction of "retracting actuator"), was designed to help ensure clean separation between the cruise stage and the entry-vehicle subsystem of the Mars Science Laboratory (MSL) mission. The retractuator or an equivalent mechanism is needed because of tubes that (1) transport a heat-transfer fluid between the stages during flight and (2) are cut immediately prior to separation of the stages. The role of the retractuator is to retract the tubes, after they are cut and before separation of the subsystems, so that the cut ends of the tubes do not damage thermal-protection coatings on the entry vehicle and do not contribute to uncertainty in drag and consequent uncertainty in separation velocity.

  2. System integration of marketable subsystems. [for residential solar heating and cooling

    Science.gov (United States)

    1979-01-01

    Progress is reported in the following areas: systems integration of marketable subsystems; development, design, and building of site data acquisition subsystems; development and operation of the central data processing system; operation of the MSFC Solar Test Facility; and systems analysis.

  3. Photovoltaic subsystem optimization and design tradeoff study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Stolte, W.J.

    1982-03-01

    Tradeoffs and subsystem choices are examined in photovoltaic array subfield design, power-conditioning sizing and selection, roof- and ground-mounted structure installation, energy loss, operating voltage, power conditioning cost, and subfield size. Line- and self-commutated power conditioning options are analyzed to determine the most cost-effective technology in the megawatt power range. Methods for reducing field installation of flat panels and roof mounting of intermediate load centers are discussed, including the cost of retrofit installations.

  4. Nonadiabatic Geometric Phase in Composite Systems and Its Subsystem

    Institute of Scientific and Technical Information of China (English)

    LI Xin

    2008-01-01

    We point out that the time-dependent gauge transformation technique may be effective in investigating the nonadiabatic geometric phase of a subsystem in a composite system. As an example, we consider two uniaxially coupled spin-1/2 particles with one of the particles driven by a rotating magnetic field. The influences of the coupling and of the precession frequency of the magnetic field on the geometric phase are also discussed in detail.

  5. Photonics in switching: enabling technologies and subsystem design

    DEFF Research Database (Denmark)

    Vlachos, K.; Raffaelli, C.; Aleksic, S.

    2009-01-01

    This paper describes recent research activities and results in the area of photonic switching carried out within the framework of the EU-funded e-Photon/ONe+ network of excellence, Virtual Department on Optical Switching. Technology aspects of photonics in switching and, in particular, recent advances in wavelength conversion, ring resonators, and packet switching and processing subsystems are presented as the building blocks for the implementation of a high-performance router for the next-generation Internet.

  6. Functional Analysis for Double Shell Tank (DST) Subsystems

    Energy Technology Data Exchange (ETDEWEB)

    SMITH, D.F.

    2000-08-22

    This functional analysis identifies the hierarchy and describes the subsystem functions that support the Double-Shell Tank (DST) System described in HNF-SD-WM-TRD-007, System Specification for the Double-Shell Tank System. Because of the uncertainty associated with the need for upgrades of the existing catch tanks supporting the Waste Feed Delivery (WFD) mission, catch tank functions are not addressed in this document. The functions identified herein are applicable to the Phase 1 WFD mission only.

  7. Attitude Control Subsystem for the Advanced Communications Technology Satellite

    Science.gov (United States)

    Hewston, Alan W.; Mitchell, Kent A.; Sawicki, Jerzy T.

    1996-01-01

    This paper provides an overview of the on-orbit operation of the Attitude Control Subsystem (ACS) for the Advanced Communications Technology Satellite (ACTS). The three ACTS control axes are defined, including the means for sensing attitude and determining the pointing errors. The desired pointing requirements for various modes of control as well as the disturbance torques that oppose the control are identified. Finally, the hardware actuators and control loops utilized to reduce the attitude error are described.

  8. Subsystem for control of isotope production with linear electron accelerator

    CERN Document Server

    Karasyov, S P; Uvarov, V L

    2001-01-01

    In this report, the high-current LINAC subsystem for diagnostics and monitoring of the basic technological parameters of isotope production (energy flux of bremsstrahlung photons and absorbed dose in the target, target activity, and temperature and consumption of the water cooling the converter and target) is described. The parallel printer port (LPT) of a personal computer is proposed for use as the interface with the measurement channels.

  9. Laser and Optical Subsystem for NASA's Cold Atom Laboratory

    Science.gov (United States)

    Kohel, James; Kellogg, James; Elliott, Ethan; Krutzik, Markus; Aveline, David; Thompson, Robert

    2016-05-01

    We describe the design and validation of the laser and optics subsystem for NASA's Cold Atom Laboratory (CAL), a multi-user facility being developed at NASA's Jet Propulsion Laboratory for studies of ultra-cold quantum gases in the microgravity environment of the International Space Station. Ultra-cold atoms will be generated in CAL by employing a combination of laser cooling techniques and evaporative cooling in a microchip-based magnetic trap. Laser cooling and absorption-imaging detection of bosonic mixtures of 87Rb and 39K or 41K will be accomplished using a high-power (up to 500 mW ex-fiber), frequency-agile, dual-wavelength (767 nm and 780 nm) laser and optical subsystem. The CAL laser and optical subsystem also includes the capability to generate high-power multi-frequency optical pulses at 784.87 nm to realize a dual-species Bragg atom interferometer. Currently at Humboldt-Universität zu Berlin.

  10. Free-running InGaAs single photon detector with 1 cps dark count rate at 10% efficiency

    CERN Document Server

    Korzh, Boris; Lunghi, Tommaso; Gisin, Nicolas; Zbinden, Hugo

    2013-01-01

    We present a free-running single photon detector for telecom wavelengths based on a negative feedback avalanche photodiode (NFAD). A dark count rate as low as 1 cps was obtained at a detection efficiency of 10%, with an afterpulse probability of 2.2% for 20 µs of deadtime. This was achieved by using an active hold-off circuit and cooling the NFAD with a free-piston Stirling cooler down to temperatures of -110 °C. We integrated two detectors into a practical, 625 MHz clocked quantum key distribution system. Stable, real-time key distribution in the presence of 30 dB channel loss was possible, yielding a secret key rate of 350 bps.

  11. Medical Health Monitoring System of the CPS

    Institute of Scientific and Technical Information of China (English)

    王平

    2012-01-01

    A Cyber-Physical System (CPS) is an intelligent cyber-physical information system built on technologies such as the Internet and intelligent embedded systems. This paper analyzes the concept and characteristics of CPS and constructs a CPS architecture framework with four layers: a physical node layer, a network communication layer, a resource service layer, and a user application layer. The layering of the architecture is discussed, and a layered CPS medical and health system is designed, providing a reference for further research.

  12. CPS: Network System Framework and Key Technologies

    Institute of Scientific and Technical Information of China (English)

    胡雅菲; 李方敏; 刘新华

    2010-01-01

    Cyber-physical systems (CPS) are an important direction for the development of future communication networks. A CPS is a comprehensive system integrating physics, biology and engineering, characterized by local operation and global control. This emerging class of networked systems has attracted great interest from the research community. Building on a brief introduction to CPS networks and their application domains, this paper analyzes the hot research problems in CPS network architecture. Finally, it concludes and looks ahead to future research directions and priorities for CPS networks.

  13. Predicting Speech Intelligibility with a Multiple Speech Subsystems Approach in Children with Cerebral Palsy

    Science.gov (United States)

    Lee, Jimin; Hustad, Katherine C.; Weismer, Gary

    2014-01-01

    Purpose: Speech acoustic characteristics of children with cerebral palsy (CP) were examined with a multiple speech subsystems approach; speech intelligibility was evaluated using a prediction model in which acoustic measures were selected to represent three speech subsystems. Method: Nine acoustic variables reflecting different subsystems, and…

  15. Influence of data preprocessing on the quantitative determination of nutrient content in poultry manure by near infrared spectroscopy.

    Science.gov (United States)

    Chen, L J; Xing, L; Han, L J

    2010-01-01

    With increasing concern over potential pollution from farm wastes, there is a need for rapid and robust methods that can analyze livestock manure nutrient content. The near infrared spectroscopy (NIRS) method was used to determine nutrient content in diverse poultry manure samples (n=91). Various standard preprocessing methods (derivatives, multiplicative scatter correction, Savitzky-Golay smoothing, and standard normal variate) were applied to reduce systematic noise in the data. In addition, a new preprocessing method known as direct orthogonal signal correction (DOSC) was tested. Calibration models for ammonium nitrogen, total potassium, total nitrogen, and total phosphorus were developed with the partial least squares (PLS) method. The results showed that all preprocessed data improved prediction results compared with no preprocessing. Compared with the other preprocessing methods, the DOSC method gave the best results, achieving moderately successful prediction for ammonium nitrogen, total nitrogen, and total phosphorus. However, none of the preprocessing methods provided reliable prediction for total potassium. This indicates that the DOSC method, especially combined with other preprocessing methods, needs further study to allow a more complete predictive analysis of manure nutrient content.
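
    Two of the standard corrections named in this record, Savitzky-Golay smoothing and standard normal variate (SNV), can be sketched as follows. The synthetic spectra and the filter parameters are illustrative assumptions, not the paper's data:

```python
import numpy as np
from scipy.signal import savgol_filter

def snv(spectra):
    """Standard normal variate: center and scale each spectrum (row) to
    zero mean and unit standard deviation."""
    mu = spectra.mean(axis=1, keepdims=True)
    sd = spectra.std(axis=1, keepdims=True)
    return (spectra - mu) / sd

# Synthetic NIR-like spectra: a smooth absorption band plus noise and a
# sample-dependent additive scatter offset
rng = np.random.default_rng(0)
wavelengths = np.linspace(1000, 2500, 200)
base = np.exp(-((wavelengths - 1700) / 150) ** 2)
spectra = base + rng.normal(0, 0.02, (5, 200)) + rng.uniform(0, 0.5, (5, 1))

# Savitzky-Golay smoothing (window 11, polynomial order 2), then SNV
smoothed = savgol_filter(spectra, window_length=11, polyorder=2, axis=1)
corrected = snv(smoothed)
```

    The paper's best-performing method, DOSC, additionally uses the response values to remove variation in the spectra orthogonal to them; the sketch above covers only the simpler corrections it is compared against.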

  16. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines.

    Science.gov (United States)

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J; Raboso, Mariano

    2015-06-17

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering; segmentation, based on a Gaussian Mixture Model (GMM), to separate the person from the background; masking, to reduce the dimensions of the images; and binarization, to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.
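
    The GMM segmentation step described above can be sketched with a tiny two-component EM fit on pixel intensities; the resulting labels give the person/background mask that is then binarized. The synthetic "acoustic image" and all parameter choices are assumptions for illustration, not the authors' data or implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic "acoustic image": a brighter person region on a darker background
image = rng.normal(0.2, 0.05, (32, 32))
image[10:22, 12:20] += 0.6

# Fit a two-component 1-D Gaussian mixture to the pixel intensities with EM
x = image.ravel()
mu = np.array([x.min(), x.max()])   # crude but effective initialization
var = np.array([x.var(), x.var()])
pi = np.array([0.5, 0.5])
for _ in range(50):
    # E-step: responsibility of each component for each pixel
    p = pi / np.sqrt(2 * np.pi * var) * np.exp(-(x[:, None] - mu) ** 2 / (2 * var))
    r = p / p.sum(axis=1, keepdims=True)
    # M-step: update weights, means and variances
    n = r.sum(axis=0)
    pi = n / len(x)
    mu = (r * x[:, None]).sum(axis=0) / n
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n

# Binary mask = pixels assigned to the brighter component
bright = np.argmax(mu)
mask = (r.argmax(axis=1) == bright).reshape(image.shape)
```

    In the full system, features extracted from the masked region would then be fed to the linear SVM classifier.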

  17. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Lara del Val

    2015-06-01

    Full Text Available Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation—based on a Gaussian Mixture Model (GMM) to separate the person from the background, masking—to reduce the dimensions of images—and binarization—to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements.

  18. Pre-Processing for Video Coding with Rate-Distortion Optimization Decision

    Institute of Scientific and Technical Information of China (English)

    QI Yi; HUANG Yong-gui; QI Hong-gang

    2006-01-01

    This paper proposes an adaptive video pre-processing algorithm for video coding. The algorithm works on the original image before intra- or inter-prediction. It adopts a Gaussian filter to remove noise and insignificant features in the video frames. Detection and restoration of edges follow, to restore edges that were excessively filtered out. Rate-Distortion Optimization (RDO) is employed to decide adaptively whether a processed or an unprocessed block is coded into the bit-stream, for more efficient coding. Our experimental results show that the algorithm achieves good coding performance in both subjective and objective terms. In addition, the proposed pre-processing algorithm is transparent to the decoder, and thus is compliant with any video coding standard without modifying the decoder.
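
    The filter-then-restore-edges idea can be sketched as below (the RDO mode decision, which requires an actual encoder, is omitted). The synthetic frame, the filter sigma and the gradient threshold are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

rng = np.random.default_rng(2)
# Synthetic frame: a bright square (strong edges) plus sensor noise
frame = np.zeros((64, 64))
frame[16:48, 16:48] = 1.0
noisy = frame + rng.normal(0, 0.05, frame.shape)

# Step 1: Gaussian filtering removes noise but also blurs edges
blurred = gaussian_filter(noisy, sigma=1.5)

# Step 2: detect strong edges in the original frame and restore them,
# keeping the denoised pixels everywhere else
grad = np.hypot(sobel(noisy, axis=0), sobel(noisy, axis=1))
edge_mask = grad > 0.5 * grad.max()
restored = np.where(edge_mask, noisy, blurred)
```

    In the paper's scheme, each block of `restored` would then compete against the unprocessed block in an RDO cost comparison before being written to the bit-stream.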

  19. PREPROCESSING FOR LUNG AND HEART IMAGE SEGMENTATION USING AN ANISOTROPIC DIFFUSION FILTER

    Directory of Open Access Journals (Sweden)

    A. T. A Prawira Kusuma

    2015-12-01

    Full Text Available This paper proposes a preprocessing technique for a lung segmentation scheme using an Anisotropic Diffusion filter. The aim is to improve the accuracy, sensitivity and specificity of the segmentation results. This method was chosen for its edge-preserving ability: while smoothing, it suppresses noise yet maintains the edges of objects in the image. Such a characteristic is needed when filtering medical images, where the boundary between an organ and the background is not always clear. The segmentation itself is done with K-means Clustering and Active Contours to segment the lungs. The segmentation results were validated using the Receiver Operating Characteristic (ROC) and showed increased accuracy, sensitivity and specificity compared with the results of segmentation in a previous paper, in which the preprocessing method used was a Gaussian lowpass filter.
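
    An anisotropic diffusion filter of the kind used above can be sketched with the classic Perona-Malik scheme: the conduction coefficient shrinks across strong gradients, so flat regions are smoothed while organ boundaries are preserved. The parameter values and synthetic image are illustrative assumptions:

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=0.1, gamma=0.2):
    """Perona-Malik diffusion: smooths flat regions, preserves edges."""
    img = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite differences to the four neighbors
        dn = np.roll(img, -1, axis=0) - img
        ds = np.roll(img, 1, axis=0) - img
        de = np.roll(img, -1, axis=1) - img
        dw = np.roll(img, 1, axis=1) - img
        # Conduction coefficients: near zero across strong edges
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        img += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return img

rng = np.random.default_rng(3)
# Synthetic slice: a bright "organ" on a dark background, plus noise
img = np.zeros((64, 64))
img[20:44, 20:44] = 1.0
noisy = img + rng.normal(0, 0.05, img.shape)
smoothed = anisotropic_diffusion(noisy)
```

    The smoothed image would then be handed to the K-means/active-contour segmentation stage.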

  20. A Study on Pre-processing Algorithms for Metal Parts Inspection

    Directory of Open Access Journals (Sweden)

    Haider Sh. Hashim

    2011-06-01

    Full Text Available Pre-processing is very useful in a variety of situations, since it helps to suppress information that is not related to the image processing or analysis task at hand. Mathematical morphology is used for image analysis, understanding and processing; it is an influential method in geometric morphological analysis and image understanding, and has become an established theory in the digital image processing domain. Edge detection and noise reduction are crucial pre-processing steps. Classical edge detection and filtering methods are less accurate in detecting complex edges and filtering various types of noise. This paper proposes some useful mathematical morphology techniques to detect edges and to filter noise in images of metal parts. The experimental results showed that the proposed algorithm helps to increase the accuracy of a metal parts inspection system.
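
    A small sketch of morphological edge detection on a synthetic metal-part image: a median filter suppresses salt noise, and the morphological gradient (dilation minus erosion) marks the part's edges. The structuring-element size, threshold and test image are assumptions, not the paper's algorithm:

```python
import numpy as np
from scipy.ndimage import grey_dilation, grey_erosion, median_filter

rng = np.random.default_rng(4)
# Synthetic metal-part image: a bright rectangular part with salt noise
part = np.zeros((48, 48))
part[12:36, 8:40] = 1.0
noisy = part.copy()
salt = rng.random(part.shape) < 0.02
noisy[salt] = 1.0

# Median filter removes isolated salt pixels, then the morphological
# gradient (dilation - erosion over a 3x3 structuring element) highlights
# intensity transitions, i.e. the part's edges
size = (3, 3)
denoised = median_filter(noisy, size=size)
gradient = grey_dilation(denoised, size=size) - grey_erosion(denoised, size=size)
edges = gradient > 0.5
```

    The binary edge map would then feed the inspection system's measurement or defect-detection stage.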

  1. Analog preprocessing in an SNS 2 µm low-noise CMOS folding ADC

    Science.gov (United States)

    Carr, Richard D.

    1994-12-01

    Significant research in high performance analog-to-digital converters (ADC's) has been directed at retaining part of the high-speed flash ADC architecture, while reducing the total number of comparators in the circuit. The symmetrical number system (SNS) can be used to preprocess the analog input signal, reducing the number of comparators and thus reducing the chip area and power consumption of the ADC. This thesis examines a Very Large Scale Integrated (VLSI) design for a folding circuit for an SNS analog preprocessing architecture in a 9-bit folding ADC with a total of 23 comparators. The analog folding circuit layout uses the Orbit 2 µm CMOS N-well double-metal, double-poly low-noise analog process. The effects of SPICE level-2 parameter tolerances during fabrication on the operation of the folding circuit are investigated numerically. The frequency response of the circuit is also quantified. An Application Specific Integrated Circuit (ASIC) is designed.

  2. Radar signal pre-processing to suppress surface bounce and multipath

    Science.gov (United States)

    Paglieroni, David W; Mast, Jeffrey E; Beer, N. Reginald

    2013-12-31

    A method and system for detecting the presence of subsurface objects within a medium is provided. In some embodiments, the imaging and detection system operates in a multistatic mode to collect radar return signals generated by an array of transceiver antenna pairs that is positioned across the surface and that travels down the surface. The imaging and detection system pre-processes that return signal to suppress certain undesirable effects. The imaging and detection system then generates synthetic aperture radar images from real aperture radar images generated from the pre-processed return signal. The imaging and detection system then post-processes the synthetic aperture radar images to improve detection of subsurface objects. The imaging and detection system identifies peaks in the energy levels of the post-processed image frame, which indicates the presence of a subsurface object.
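
    One common pre-processing step for suppressing the surface bounce in data of this kind is mean-trace (background) subtraction: reflections that appear identically in every trace across the array are removed, while spatially localized subsurface returns survive. This is a generic sketch on synthetic data, not the patented method:

```python
import numpy as np

rng = np.random.default_rng(10)
n_traces, n_samples = 50, 256

# Synthetic B-scan: each column is one radar return. The surface bounce
# appears at the same early time in every trace; a buried object adds a
# localized reflection in a few traces only.
t = np.arange(n_samples)
surface = np.exp(-((t - 30) / 4.0) ** 2)
bscan = np.tile(surface[:, None], (1, n_traces))
bscan[:, 20:25] += 0.3 * np.exp(-((t[:, None] - 120) / 6.0) ** 2)
bscan += rng.normal(0, 0.01, bscan.shape)

# Mean-trace subtraction: remove features common to all traces
# (the surface bounce), keeping spatially localized reflections
background = bscan.mean(axis=1, keepdims=True)
suppressed = bscan - background
```

    The suppressed return would then be the input to the image-formation and peak-detection stages described in the record.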

  3. Preprocessing, classification modeling and feature selection using flow injection electrospray mass spectrometry metabolite fingerprint data.

    Science.gov (United States)

    Enot, David P; Lin, Wanchang; Beckmann, Manfred; Parker, David; Overy, David P; Draper, John

    2008-01-01

    Metabolome analysis by flow injection electrospray mass spectrometry (FIE-MS) fingerprinting generates measurements relating to large numbers of m/z signals. Such data sets often exhibit high variance with a paucity of replicates, thus providing a challenge for data mining. We describe data preprocessing and modeling methods that have proved reliable in projects involving samples from a range of organisms. The protocols interact with software resources specifically for metabolomics provided in a Web-accessible data analysis package FIEmspro (http://users.aber.ac.uk/jhd) written in the R environment and requiring a moderate knowledge of R command-line usage. Specific emphasis is placed on describing the outcome of modeling experiments using FIE-MS data that require further preprocessing to improve quality. The salient features of both poor and robust (i.e., highly generalizable) multivariate models are outlined together with advice on validating classifiers and avoiding false discovery when seeking explanatory variables.

  4. Acoustic Biometric System Based on Preprocessing Techniques and Linear Support Vector Machines

    Science.gov (United States)

    del Val, Lara; Izquierdo-Fuente, Alberto; Villacorta, Juan J.; Raboso, Mariano

    2015-01-01

    Drawing on the results of an acoustic biometric system based on a MSE classifier, a new biometric system has been implemented. This new system preprocesses acoustic images, extracts several parameters and finally classifies them, based on Support Vector Machine (SVM). The preprocessing techniques used are spatial filtering, segmentation—based on a Gaussian Mixture Model (GMM) to separate the person from the background, masking—to reduce the dimensions of images—and binarization—to reduce the size of each image. An analysis of classification error and a study of the sensitivity of the error versus the computational burden of each implemented algorithm are presented. This allows the selection of the most relevant algorithms, according to the benefits required by the system. A significant improvement of the biometric system has been achieved by reducing the classification error, the computational burden and the storage requirements. PMID:26091392

  5. A Hybrid System based on Multi-Agent System in the Data Preprocessing Stage

    CERN Document Server

    Kularbphettong, Kobkul; Meesad, Phayung

    2010-01-01

    We describe the use of a multi-agent system (MAS) in the data preprocessing stage of an on-going project called e-Wedding. The aim of this project is to utilize MAS and various approaches, such as Web services, ontologies, and data mining techniques, in e-Business, in order to improve the responsiveness and efficiency of systems and to extract customer behavior models for wedding businesses. In this paper, however, we propose and implement a multi-agent system, based on JADE, that handles only the data preprocessing stage, specifically missing-value handling techniques. JADE is quite easy to learn and use; moreover, it supports many agent concepts such as agent communication, protocols, behaviors and ontologies. The framework has been tested and evaluated in a simple but realistic setting. The results, though still preliminary, are quite promising.

  6. Input data preprocessing method for exchange rate forecasting via neural network

    Directory of Open Access Journals (Sweden)

    Antić Dragan S.

    2014-01-01

    Full Text Available The aim of this paper is to present a method for the selection and preprocessing of neural network input parameters. The purpose of the network is to forecast foreign exchange rates using artificial intelligence. Two data sets are formed for two different economic systems. Each system is represented by six categories with 70 economic parameters, which are used in the analysis. These parameters were reduced within each category using the principal component analysis method. Component interdependencies are established and relations between them are formed; the newly formed relations were used to create the input vectors of the neural network. A multilayer feed-forward neural network is formed and trained using batch training. Finally, simulation results are presented, and it is concluded that the input data preparation method is an effective way of preprocessing neural network data. [Projects of the Ministry of Science of the Republic of Serbia, nos. TR 35005, III 43007 and III 44006]
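
    The principal-component reduction described above can be sketched with an SVD-based PCA. The synthetic "economic indicators" (latent factors plus noise) and the 95% explained-variance cutoff are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)
# 70 hypothetical economic indicators over 120 observations, driven by
# 3 latent factors plus measurement noise
factors = rng.normal(size=(120, 3))
loadings = rng.normal(size=(3, 70))
X = factors @ loadings + rng.normal(0, 0.1, (120, 70))

# PCA via SVD of the centered data matrix
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

# Keep enough components for 95% of the variance; the projections form
# the (much smaller) neural network input vectors
k = int(np.searchsorted(np.cumsum(explained), 0.95)) + 1
inputs = Xc @ Vt[:k].T
```

    With three dominant latent factors, the projection recovers a compact input representation without hand-picking among the 70 raw parameters.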

  7. The Role of GRAIL Orbit Determination in Preprocessing of Gravity Science Measurements

    Science.gov (United States)

    Kruizinga, Gerhard; Asmar, Sami; Fahnestock, Eugene; Harvey, Nate; Kahan, Daniel; Konopliv, Alex; Oudrhiri, Kamal; Paik, Meegyeong; Park, Ryan; Strekalov, Dmitry; Watkins, Michael; Yuan, Dah-Ning

    2013-01-01

    The Gravity Recovery And Interior Laboratory (GRAIL) mission has constructed a lunar gravity field with unprecedented uniform accuracy on the farside and nearside of the Moon. GRAIL lunar gravity field determination begins with preprocessing of the gravity science measurements by applying corrections for time tag error, general relativity, measurement noise and biases. Gravity field determination requires the generation of spacecraft ephemerides of an accuracy not attainable with the pre-GRAIL lunar gravity fields. Therefore, a bootstrapping strategy was developed, iterating between science data preprocessing and lunar gravity field estimation in order to construct sufficiently accurate orbit ephemerides. This paper describes the GRAIL measurements, their dependence on the spacecraft ephemerides, and the role of orbit determination in the bootstrapping strategy. Simulation results will be presented that validate the bootstrapping strategy, followed by bootstrapping results for flight data, which have led to the latest GRAIL lunar gravity fields.

  8. The impact of data preprocessing in traumatic brain injury detection using functional magnetic resonance imaging.

    Science.gov (United States)

    Vergara, Victor M; Damaraju, Eswar; Mayer, Andrew B; Miller, Robyn; Cetin, Mustafa S; Calhoun, Vince

    2015-01-01

    Traumatic brain injury (TBI) can adversely affect a person's thinking, memory, personality and behavior. For this reason, new and better biomarkers are being investigated. Resting state functional network connectivity (rsFNC) derived from functional magnetic resonance imaging (fMRI) is emerging as a possible biomarker. One of the main concerns with this technique is the appropriateness of the methods used to correct for subject movement. In this work we used 50 mild TBI patients and matched healthy controls to explore the outcomes obtained from different fMRI data preprocessing choices. Results suggest that correcting for motion variance before spatial smoothing is the best alternative. Following this preprocessing option, a significant group difference was found between the cerebellum and the supplementary motor area/paracentral lobule, with the mTBI group exhibiting an increase in rsFNC.

  9. KONFIG and REKONFIG: Two interactive preprocessors for the Navy/NASA Engine Program (NNEP)

    Science.gov (United States)

    Fishbach, L. H.

    1981-01-01

    The NNEP is a computer program that is currently being used to simulate the thermodynamic cycle performance of almost all types of turbine engines by many government, industry, and university personnel. The NNEP uses arrays of input data to set up the engine simulation and component matching method as well as to describe the characteristics of the components. A preprocessing program (KONFIG) is described, in which the user at a terminal on a time-shared computer can interactively prepare the arrays of data required. It is intended to make it easier for the occasional or new user to operate NNEP. Another preprocessing program (REKONFIG), in which the user can modify the component specifications of a previously configured NNEP dataset, is also described. It is intended to aid in preparing data for parametric studies and/or studies of similar engines such as mixed-flow turbofans, turboshafts, etc.

  10. Effective automated prediction of vertebral column pathologies based on logistic model tree with SMOTE preprocessing.

    Science.gov (United States)

    Karabulut, Esra Mahsereci; Ibrikci, Turgay

    2014-05-01

    This study develops a logistic-model-tree-based automation system for accurate recognition of types of vertebral column pathologies. Six biomechanical measures are used for this purpose: pelvic incidence, pelvic tilt, lumbar lordosis angle, sacral slope, pelvic radius and grade of spondylolisthesis. A two-phase classification model is employed, in which the first step is preprocessing the data with the Synthetic Minority Over-sampling Technique (SMOTE), and the second is feeding the preprocessed data to the Logistic Model Tree (LMT) classifier. We achieved an accuracy of 89.73% and an Area Under Curve (AUC) of 0.964 in computer-based automatic detection of the pathology. This was validated via a 10-fold cross-validation experiment conducted on clinical records of 310 patients. The study also presents a comparative analysis of the vertebral column data with the use of several machine learning algorithms.
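
    The SMOTE preprocessing step can be sketched as below (the LMT classifier itself is omitted): synthetic minority samples are generated by interpolating between a minority sample and one of its k nearest minority neighbors. The toy data merely stand in for the six biomechanical measures:

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Minimal SMOTE: synthesize points on segments between minority
    samples and their k nearest minority-class neighbors."""
    rng = rng or np.random.default_rng(0)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        d = np.linalg.norm(X_min - X_min[i], axis=1)
        nbrs = np.argsort(d)[1:k + 1]   # skip the point itself
        j = rng.choice(nbrs)
        lam = rng.random()
        out.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.array(out)

rng = np.random.default_rng(6)
# Imbalanced toy set: 100 "normal" vs 12 "pathology" samples, 6 features
X_majority = rng.normal(0, 1, (100, 6))
X_minority = rng.normal(3, 1, (12, 6))
X_synth = smote(X_minority, n_new=88, rng=rng)
balanced_minority = np.vstack([X_minority, X_synth])
```

    After oversampling, both classes contribute 100 samples to classifier training, which is the balancing effect SMOTE is used for in the study.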

  11. The Combined Effect of Filters in ECG Signals for Pre-Processing

    Directory of Open Access Journals (Sweden)

    Isha V. Upganlawar

    2014-05-01

    Full Text Available The ECG signal is rapidly changing and continuous in nature. Heart diseases such as paroxysmal disorders and arrhythmias, whose diagnosis is tied to intelligent health-care decisions, require the ECG signal to be accurately pre-processed before further actions such as feature extraction, wavelet decomposition, locating the QRS complexes in ECG recordings and related information such as heart rate and RR interval, and classification of the signal by various classifiers. Filters play a very important role in analyzing the low-frequency components of the ECG signal. Biomedical signals are of low frequency, so the removal of power-line interference and baseline wander is a very important step at the pre-processing stage of ECG. In this paper we deal with the study of median filtering and FIR (Finite Impulse Response) filtering of ECG signals under noisy conditions.
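
    The two filter families discussed above can be sketched on a synthetic ECG-like signal: a double median filter estimates and removes the baseline wander, and a low-pass FIR filter removes 50 Hz power-line interference. The sampling rate, window lengths and cutoff are illustrative assumptions:

```python
import numpy as np
from scipy.signal import firwin, filtfilt, medfilt

fs = 360  # Hz, a common ECG sampling rate
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(7)

# Synthetic ECG stand-in: narrow periodic "heartbeat" pulses, plus
# baseline wander (0.3 Hz) and 50 Hz power-line interference
ecg = np.sin(2 * np.pi * 1.2 * t) ** 20
baseline = 0.5 * np.sin(2 * np.pi * 0.3 * t)
powerline = 0.2 * np.sin(2 * np.pi * 50 * t)
signal = ecg + baseline + powerline

# Two-pass median filtering (~200 ms then ~600 ms windows) estimates the
# slowly varying baseline, which is then subtracted
base_est = medfilt(medfilt(signal, 2 * int(0.1 * fs) + 1), 2 * int(0.3 * fs) + 1)
detrended = signal - base_est

# Low-pass FIR filter (cutoff 40 Hz) removes the 50 Hz interference;
# filtfilt applies it forward and backward for zero phase distortion
taps = firwin(numtaps=101, cutoff=40, fs=fs)
clean = filtfilt(taps, [1.0], detrended)
```

    The median stage is nonlinear and leaves the narrow QRS-like pulses largely intact, which is why it is preferred over linear high-pass filtering for baseline removal.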

  12. The Combined Effect of Filters in ECG Signals for Pre-Processing

    OpenAIRE

    Isha V. Upganlawar; Harshal Chowhan

    2014-01-01

    The ECG signal is rapidly changing and continuous in nature. Heart diseases such as paroxysmal disorders and arrhythmias, whose diagnosis is tied to intelligent health-care decisions, require the ECG signal to be accurately pre-processed before further actions such as feature extraction, wavelet decomposition, locating the QRS complexes in ECG recordings and related information such as heart rate and RR interval, and classification of the signal by various classifiers. Filters p...

  13. Data preprocessing for a vehicle-based localization system used in road traffic applications

    Science.gov (United States)

    Patelczyk, Timo; Löffler, Andreas; Biebl, Erwin

    2016-09-01

    This paper presents a fixed-point implementation of the preprocessing using a field programmable gate array (FPGA), which is required for a multipath joint angle and delay estimation (JADE) used in road traffic applications, and lays the foundation for many model-based parameter estimation methods. Here, a simulation of a vehicle-based localization system for protecting vulnerable road users, who were equipped with appropriate transponders, is considered. For such safety-critical applications, the robustness and real-time capability of the localization are particularly important. An additional motivation for a fixed-point implementation of the data preprocessing is the limited computing power of a vehicle's head unit. This study aims to process the raw data provided by the localization system used in this paper. The data preprocessing applied includes a wideband calibration of the physical localization system, separation of relevant information from the received sampled signal, and preparation of the incoming data via further processing. Further, a channel matrix estimation was implemented to complete the data preprocessing; the channel matrix contains information on channel parameters, e.g., the positions of the objects to be located. In the presented case of a vehicle-based localization system we assume an urban environment, in which multipath propagation occurs. Since most methods for localization are based on uncorrelated signals, this fact must be addressed: a decorrelation of the incoming data stream is required before further localization. This decorrelation was accomplished by considering several snapshots in different time slots. As a final aspect of the use of fixed-point arithmetic, quantization errors are considered. In addition, the resources and runtime of the presented implementation are discussed; these factors are strongly linked to a practical implementation.
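
    The quantization-error aspect of a fixed-point implementation can be illustrated with a simple round-to-nearest fixed-point model; the word length, scaling and synthetic snapshot are assumptions, not the authors' FPGA design:

```python
import numpy as np

def to_fixed_point(x, frac_bits=12, word_bits=16):
    """Quantize floats to signed fixed-point with the given number of
    fractional bits, then return the represented value as a float."""
    scale = 2 ** frac_bits
    lo, hi = -(2 ** (word_bits - 1)), 2 ** (word_bits - 1) - 1
    q = np.clip(np.round(x * scale), lo, hi).astype(np.int64)
    return q / scale

rng = np.random.default_rng(11)
# Complex baseband snapshot, scaled to stay inside the dynamic range
samples = rng.normal(0, 0.1, 1024) + 1j * rng.normal(0, 0.1, 1024)
quantized = to_fixed_point(samples.real) + 1j * to_fixed_point(samples.imag)

# With round-to-nearest and no clipping, the error per real component is
# bounded by half an LSB
err = np.abs(quantized - samples)
lsb = 2 ** -12
```

    In a real design, such an error bound is traded off against word length and FPGA resource usage, which is exactly the analysis the record describes.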

  14. A clinical evaluation of the RNCA study using Fourier filtering as a preprocessing method

    Energy Technology Data Exchange (ETDEWEB)

    Robeson, W.; Alcan, K.E.; Graham, M.C.; Palestro, C.; Oliver, F.H.; Benua, R.S.

    1984-06-01

    Forty-one patients (25 male, 16 female) were studied by Radionuclide Cardioangiography (RNCA) in our institution. There were 42 rest studies and 24 stress studies (66 studies total). Sixteen patients were normal, 15 had ASHD, seven had a cardiomyopathy, and three had left-sided valvular regurgitation. Each study was preprocessed using both the standard nine-point smoothing method and Fourier filtering. Amplitude and phase images were also generated. Both preprocessing methods were compared with respect to image quality, border definition, reliability and reproducibility of the LVEF, and cine wall motion interpretation. Image quality and border definition were judged superior by the consensus of two independent observers in 65 of 66 studies (98%) using Fourier filtered data. The LVEF differed between the two methods by greater than .05 in 17 of 66 studies (26%), including five studies in which the LVEF could not be determined using nine-point smoothed data. LV wall motion was normal by both techniques in all control patients by cine analysis. However, cine wall motion analysis using Fourier filtered data demonstrated additional abnormalities in 17 of 25 studies (68%) in the ASHD group, including three uninterpretable studies using nine-point smoothed data. In the cardiomyopathy/valvular heart disease group, ten of 18 studies (56%) had additional wall motion abnormalities using Fourier filtered data (including four uninterpretable studies using nine-point smoothed data). We conclude that Fourier filtering is superior to the nine-point smoothing preprocessing method now in general use in terms of image quality, border definition, generation of an LVEF, and cine wall motion analysis. The advent of the array processor makes routine preprocessing by Fourier filtering a feasible technologic advance in the development of the RNCA study.

  15. Pre-Processing and Re-Weighting Jet Images with Different Substructure Variables

    CERN Document Server

    Huynh, Lynn

    2016-01-01

    This work is an extension of Monte Carlo simulation based studies of tagging boosted, hadronically decaying W bosons at a center-of-mass energy of √s = 13 TeV. Two pre-processing techniques used with jet images, translation and rotation, are first examined. The generated jet images for W signal jets and QCD background jets are then rescaled and weighted with five different substructure variables for visual comparison.

  16. Preprocessing techniques to reduce atmospheric and sensor variability in multispectral scanner data.

    Science.gov (United States)

    Crane, R. B.

    1971-01-01

    Multispectral scanner data are potentially useful in a variety of remote sensing applications. Large-area surveys of earth resources carried out by automated recognition processing of these data are particularly important. However, the practical realization of such surveys is limited by a variability in the scanner signals that results in improper recognition of the data. This paper discusses ways by which some of this variability can be removed from the data by preprocessing with resultant improvements in recognition results.

  17. Performance evaluation of preprocessing techniques utilizing expert information in multivariate calibration.

    Science.gov (United States)

    Sharma, Sandeep; Goodarzi, Mohammad; Ramon, Herman; Saeys, Wouter

    2014-04-01

    Partial Least Squares (PLS) regression is one of the most used methods for extracting chemical information from Near Infrared (NIR) spectroscopic measurements. The success of a PLS calibration relies largely on the representativeness of the calibration data set. This is not trivial, because not only the expected variation in the analyte of interest, but also the variation of other contributing factors (interferents) should be included in the calibration data. This also implies that changes in interferent concentrations not covered in the calibration step can deteriorate the prediction ability of the calibration model. Several researchers have suggested that PLS models can be robustified against changes in the interferent structure by incorporating expert knowledge in the preprocessing step with the aim to efficiently filter out the spectral influence of the spectral interferents. However, these methods have not yet been compared against each other. Therefore, in the present study, various preprocessing techniques exploiting expert knowledge were compared on two experimental data sets. In both data sets, the calibration and test set were designed to have a different interferent concentration range. The performance of these techniques was compared to that of preprocessing techniques which do not use any expert knowledge. Using expert knowledge was found to improve the prediction performance for both data sets. For data set-1, the prediction error improved nearly 32% when pure component spectra of the analyte and the interferents were used in the Extended Multiplicative Signal Correction framework. Similarly, for data set-2, nearly 63% improvement in the prediction error was observed when the interferent information was utilized in Spectral Interferent Subtraction preprocessing.
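
    The EMSC idea referred to above, using known pure-component spectra of the analyte and the interferents, can be sketched as a least-squares fit of baseline, interferent and reference contributions followed by correction. The spectra and coefficients here are synthetic and noise-free for clarity; they are assumptions, not the study's data:

```python
import numpy as np

wl = np.linspace(0, 1, 100)

# Hypothetical pure-component spectra: the analyte and one interferent
analyte = np.exp(-((wl - 0.3) / 0.05) ** 2)
interferent = np.exp(-((wl - 0.7) / 0.1) ** 2)

# Measured spectrum: multiplicative scatter, linear baseline, and a
# mixture of both components
measured = 1.3 * (0.8 * analyte + 0.5 * interferent) + 0.2 + 0.1 * wl

# EMSC model: measured ~ a + b*wl + c*interferent + m*reference.
# Fit by least squares, then correct:
# corrected = (measured - a - b*wl - c*interferent) / m
reference = analyte
M = np.column_stack([np.ones_like(wl), wl, interferent, reference])
coef, *_ = np.linalg.lstsq(M, measured, rcond=None)
a, b, c, m = coef
corrected = (measured - a - b * wl - c * interferent) / m
```

    After correction, the interferent's spectral contribution is removed before the PLS calibration sees the data, which is the robustification mechanism the study evaluates.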

  18. Pre-Processing Noise Cross-Correlations with Equalizing the Network Covariance Matrix Eigen-Spectrum

    Science.gov (United States)

    Seydoux, L.; de Rosny, J.; Shapiro, N.

    2016-12-01

    Theoretically, the extraction of Green functions from noise cross-correlation requires the ambient seismic wavefield to be generated by uncorrelated sources evenly distributed in the medium. Yet, this condition is often not verified. Strong events such as earthquakes often produce highly coherent transient signals. Also, the microseismic noise is generated at specific places on the Earth's surface, with source regions often very localized in space. Different localized and persistent seismic sources may contaminate the cross-correlations of continuous records, resulting in spurious arrivals or asymmetry and, finally, in biased travel-time measurements. Pre-processing techniques therefore must be applied to the seismic data in order to reduce the effect of noise anisotropy and the influence of strong localized events. Here we describe a pre-processing approach that uses the covariance matrix computed from signals recorded by a network of seismographs. We extend the widely used time and spectral equalization pre-processing to the equalization of the covariance matrix spectrum (i.e., its ordered eigenvalues); this approach can be considered as a spatial equalization. The method allows us to correct for wavefield anisotropy in two ways: (1) the influence of strong directive sources is substantially attenuated, and (2) the weakly excited modes are reinforced, partially recovering the conditions required for Green's function retrieval. We also present an eigenvector-based spatial filter used to distinguish between surface and body waves; this filter is used together with the equalization of the eigenvalue spectrum. We simulate a two-dimensional wavefield in a heterogeneous medium with a strongly dominating source and show that our method greatly improves the travel-time measurements obtained from the inter-station cross-correlation functions.
Also, we apply the developed method to the USArray data and pre-process the continuous records strongly influenced
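The spatial-equalization idea described above, flattening the ordered eigenvalues of the network covariance matrix, can be sketched in a few lines. This is an illustrative reconstruction with synthetic data, not the authors' code; the whitening weights and array shapes are assumptions.

```python
import numpy as np

def equalize_covariance_spectrum(records):
    """Spatially equalize a network of traces.

    records: (n_stations, n_samples) array of time series.
    Returns traces rebuilt after flattening the eigenvalue
    spectrum of the network covariance matrix.
    """
    # Network covariance matrix across stations
    cov = records @ records.T / records.shape[1]
    eigvals, eigvecs = np.linalg.eigh(cov)
    # Whitening weights: scale each eigen-component by 1/sqrt(eigenvalue)
    weights = 1.0 / np.sqrt(np.maximum(eigvals, 1e-12))
    # Project onto eigenvectors, rescale, project back
    return eigvecs @ np.diag(weights) @ eigvecs.T @ records

rng = np.random.default_rng(0)
# One strong directive source (rank-1) plus weak diffuse noise
strong = np.outer(rng.standard_normal(5), rng.standard_normal(1000))
noise = 0.1 * rng.standard_normal((5, 1000))
equalized = equalize_covariance_spectrum(strong + noise)

# After equalization the covariance eigenvalues are flat
cov_eq = equalized @ equalized.T / equalized.shape[1]
print(np.round(np.linalg.eigvalsh(cov_eq), 2))  # → [1. 1. 1. 1. 1.]
```

With the 1/√λ weights the rebuilt traces have an exactly flat covariance eigen-spectrum, so a single dominant directive source no longer overwhelms the weakly excited modes.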

  19. Hyperspectral imaging in medicine: image pre-processing problems and solutions in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2015-11-01

The paper presents problems and solutions related to hyperspectral image pre-processing. New methods of preliminary image analysis are proposed. The paper shows problems occurring in Matlab when trying to analyse this type of image. Moreover, new methods are discussed which provide the source code in Matlab that can be used in practice without any licensing restrictions. A sample application and example results of hyperspectral image analysis are also presented.

  20. Review of Data Preprocessing Methods for Sign Language Recognition Systems based on Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Zorins Aleksejs

    2016-12-01

The article presents an introductory analysis of a research topic relevant to the Latvian deaf community: the development of a Latvian Sign Language Recognition System. More specifically, the data preprocessing methods are discussed and several approaches are shown, with a focus on systems based on artificial neural networks, which are among the most successful solutions for the sign language recognition task.

  1. Design and Implementation of the Security Subsystem in Payroll Management System

    Institute of Scientific and Technical Information of China (English)

    Zeyad M.Ali Alfawaer; GuiWeihua; Chen Xiaofang

    2002-01-01

This paper designs a security subsystem based on password protection and a trigger mechanism in database management systems. The structure and function of the security subsystem are described using a payroll management system as an example of how the subsystem provides additional protection against possible intentional or accidental attacks on the entire database system. The security subsystem combines password protection, triggers and an improved physical storage mechanism to enhance security performance. Implementation of the security subsystem in a payroll database has proved its effectiveness.

  2. Impulsive Controllability/Observability for Interconnected Descriptor Systems with Two Subsystems

    Directory of Open Access Journals (Sweden)

    Qingling Zhang

    2015-01-01

The problem of decentralized impulse controllability/observability for large-scale interconnected descriptor systems with two subsystems under derivative feedback is studied. Necessary conditions are derived for the existence of a derivative feedback controller for the first subsystem that ensures the second subsystem is impulse controllable and impulse observable, respectively. Based on these results, such a derivative feedback controller for the first subsystem is easily constructed so that the second subsystem is impulse controllable or impulse observable. Finally, examples are given to illustrate the effectiveness of the results obtained in this paper.

3. National Ignition Facility, subsystem design requirements beam control & laser diagnostics SSDR 1.7

    Energy Technology Data Exchange (ETDEWEB)

    Bliss, E.

    1996-11-01

This Subsystem Design Requirement document is a development specification that establishes the performance, design, development, and test requirements for the Alignment subsystem (WBS 1.7.1), Beam Diagnostics (WBS 1.7.2), and the Wavefront Control subsystem (WBS 1.7.3) of the NIF Laser System (WBS 1.3). These three subsystems are collectively referred to as the Beam Control & Laser Diagnostics Subsystem. The NIF is a multi-pass, 192-beam, high-power, neodymium-glass laser that meets requirements set forth in the NIF SDR 002 (Laser System). 3 figs., 3 tabs.

  4. Systems and methods for an integrated electrical sub-system powered by wind energy

    Science.gov (United States)

    Liu, Yan; Garces, Luis Jose

    2008-06-24

    Various embodiments relate to systems and methods related to an integrated electrically-powered sub-system and wind power system including a wind power source, an electrically-powered sub-system coupled to and at least partially powered by the wind power source, the electrically-powered sub-system being coupled to the wind power source through power converters, and a supervisory controller coupled to the wind power source and the electrically-powered sub-system to monitor and manage the integrated electrically-powered sub-system and wind power system.

  5. Autonomous navigation - The ARMMS concept. [Autonomous Redundancy and Maintenance Management Subsystem

    Science.gov (United States)

    Wood, L. J.; Jones, J. B.; Mease, K. D.; Kwok, J. H.; Goltz, G. L.; Kechichian, J. A.

    1984-01-01

A conceptual design is outlined for the navigation subsystem of the Autonomous Redundancy and Maintenance Management Subsystem (ARMMS). The principal function of this navigation subsystem is to maintain the spacecraft over a specified equatorial longitude to within ±3 deg. In addition, the navigation subsystem must detect and correct internal faults. It comprises elements for a navigation executive and for orbit determination, trajectory, maneuver planning, and maneuver command. Each of these elements is described. The navigation subsystem is to be used in the DSCS III spacecraft.

  6. Data Cleaning In Data Warehouse: A Survey of Data Pre-processing Techniques and Tools

    Directory of Open Access Journals (Sweden)

    Anosh Fatima

    2017-03-01

A Data Warehouse is a computer system designed for storing and analyzing an organization's historical data from day-to-day operations in an Online Transaction Processing System (OLTP). Usually, an organization summarizes and copies information from its operational systems to the data warehouse on a regular schedule, and management performs complex queries and analysis on the information without slowing down the operational systems. Data need to be pre-processed to improve their quality before being stored in the data warehouse. This survey paper presents data cleaning problems and the approaches currently in use for preprocessing. The main goal of the paper is to determine which preprocessing technique is best suited to which scenario for improving data warehouse performance. Many techniques for data cleansing have been analyzed using certain evaluation attributes and tested on different kinds of data sets. Data quality tools such as YALE, ALTERYX, and WEKA have been used to obtain conclusive results, to ready the data and ensure that only cleaned data populates the warehouse, thus enhancing its usability. The results of the paper can be useful in many future activities such as cleansing, standardizing, correction, matching and transformation. This research can help in data auditing and pattern detection in the data.
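As a toy illustration of the cleansing steps such a survey covers (standardization, de-duplication, missing-value imputation), with invented field names standing in for a real schema:

```python
from statistics import mean

def clean_rows(rows):
    """Toy data-cleansing pass: trim strings, drop exact duplicates,
    and impute missing numeric 'amount' values with the column mean.
    Illustrative only; field names are assumptions."""
    # 1. Standardize: strip whitespace from string fields
    rows = [{k: v.strip() if isinstance(v, str) else v for k, v in r.items()}
            for r in rows]
    # 2. De-duplicate while preserving order
    seen, unique = set(), []
    for r in rows:
        key = tuple(sorted(r.items(), key=lambda kv: kv[0]))
        if key not in seen:
            seen.add(key)
            unique.append(r)
    # 3. Impute missing amounts with the mean of observed amounts
    observed = [r["amount"] for r in unique if r["amount"] is not None]
    fill = mean(observed) if observed else 0.0
    for r in unique:
        if r["amount"] is None:
            r["amount"] = fill
    return unique

raw = [
    {"name": " alice ", "amount": 10.0},
    {"name": "alice", "amount": 10.0},   # duplicate after trimming
    {"name": "bob", "amount": None},     # missing value
]
cleaned = clean_rows(raw)
print(cleaned)  # 2 rows; bob's amount imputed to 10.0
```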

  7. Supervised pre-processing approaches in multiple class variables classification for fish recruitment forecasting

    KAUST Repository

    Fernandes, José Antonio

    2013-02-01

A multi-species approach to fisheries management requires taking into account the interactions between species in order to improve recruitment forecasting of the fish species. Recent advances in Bayesian networks direct the learning of models with several interrelated variables to be forecasted simultaneously. These models are known as multi-dimensional Bayesian network classifiers (MDBNs). Pre-processing steps are critical for the posterior learning of the model in these kinds of domains. Therefore, in the present study, a set of 'state-of-the-art' uni-dimensional pre-processing methods, within the categories of missing data imputation, feature discretization and feature subset selection, are adapted to be used with MDBNs. A framework that includes the proposed multi-dimensional supervised pre-processing methods, coupled with a MDBN classifier, is tested with synthetic datasets and the real domain of fish recruitment forecasting. The rate of correctly forecasting three fish species (anchovy, sardine and hake) simultaneously is doubled (from 17.3% to 29.5%) using the multi-dimensional approach in comparison to mono-species models. The probability assessments also show high improvement, reducing the average error (estimated by means of the Brier score) from 0.35 to 0.27. Finally, these differences are superior to the forecasting of species by pairs. © 2012 Elsevier Ltd.
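The Brier score used to assess the probability forecasts is simply the mean squared difference between the predicted probabilities and the binary outcomes; a minimal version for a single binary event:

```python
def brier_score(probs, outcomes):
    """Mean squared difference between forecast probabilities and
    observed binary outcomes (0 is perfect; 1 is worst for a
    binary event)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

# Forecast P(recruitment above some reference) vs. what happened
# (the probabilities and outcomes below are made up for illustration)
score = brier_score([0.9, 0.2, 0.7, 0.4], [1, 0, 1, 1])
print(round(score, 3))  # → 0.125
```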

  8. Super-resolution algorithm based on sparse representation and wavelet preprocessing for remote sensing imagery

    Science.gov (United States)

    Ren, Ruizhi; Gu, Lingjia; Fu, Haoyang; Sun, Chenglin

    2017-04-01

    An effective super-resolution (SR) algorithm is proposed for actual spectral remote sensing images based on sparse representation and wavelet preprocessing. The proposed SR algorithm mainly consists of dictionary training and image reconstruction. Wavelet preprocessing is used to establish four subbands, i.e., low frequency, horizontal, vertical, and diagonal high frequency, for an input image. As compared to the traditional approaches involving the direct training of image patches, the proposed approach focuses on the training of features derived from these four subbands. The proposed algorithm is verified using different spectral remote sensing images, e.g., moderate-resolution imaging spectroradiometer (MODIS) images with different bands, and the latest Chinese Jilin-1 satellite images with high spatial resolution. According to the visual experimental results obtained from the MODIS remote sensing data, the SR images using the proposed SR algorithm are superior to those using a conventional bicubic interpolation algorithm or traditional SR algorithms without preprocessing. Fusion algorithms, e.g., standard intensity-hue-saturation, principal component analysis, wavelet transform, and the proposed SR algorithms are utilized to merge the multispectral and panchromatic images acquired by the Jilin-1 satellite. The effectiveness of the proposed SR algorithm is assessed by parameters such as peak signal-to-noise ratio, structural similarity index, correlation coefficient, root-mean-square error, relative dimensionless global error in synthesis, relative average spectral error, spectral angle mapper, and the quality index Q4, and its performance is better than that of the standard image fusion algorithms.
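The wavelet preprocessing step splits the input image into four subbands before dictionary training. A one-level 2-D Haar transform, shown here as a minimal stand-in for whatever wavelet basis the paper actually uses, illustrates the decomposition:

```python
import numpy as np

def haar_subbands(img):
    """One-level 2-D Haar transform: returns four half-resolution
    subbands (approximation and three detail bands). A minimal
    stand-in for a full wavelet library."""
    a = img[0::2, 0::2]
    b = img[0::2, 1::2]
    c = img[1::2, 0::2]
    d = img[1::2, 1::2]
    ll = (a + b + c + d) / 4.0   # low-frequency approximation
    lh = (a - b + c - d) / 4.0   # column-difference detail
    hl = (a + b - c - d) / 4.0   # row-difference detail
    hh = (a - b - c + d) / 4.0   # diagonal detail
    return ll, lh, hl, hh

img = np.arange(16, dtype=float).reshape(4, 4)
ll, lh, hl, hh = haar_subbands(img)
```

For this linear ramp the diagonal detail is identically zero, while the row and column details are constant, exactly the kind of structure the subband-wise dictionary training is meant to exploit.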

  9. Desktop Software for Patch-Clamp Raw Binary Data Conversion and Preprocessing

    Directory of Open Access Journals (Sweden)

    Ning Zhang

    2011-01-01

Since raw data recorded by patch-clamp systems are always stored in binary format, electrophysiologists may experience difficulties with patch-clamp data preprocessing, especially when they want to analyze the data with custom-designed algorithms. In this study, we present desktop software, called PCDReader, which could be an effective and convenient solution for patch-clamp data preprocessing in daily laboratory use. We designed a novel class module, called clsPulseData, to directly read the raw data along with the parameters recorded from HEKA instruments without any other program support. Through a graphical user interface, raw binary data files can be converted into several kinds of ASCII text files for further analysis, with several preprocessing options. The parameters can also be viewed, modified and exported into ASCII files through a user-friendly Explorer-style window. The real-time data loading technique and optimized memory management make PCDReader a fast and efficient tool. The compiled software, along with the source code of the clsPulseData class module, is freely available to academic and nonprofit users.
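The core binary-to-ASCII conversion that such a tool performs can be illustrated generically. The sample dtype and gain below are assumptions made for the sketch; real HEKA files carry these parameters in their headers, which is exactly what clsPulseData is described as parsing:

```python
import io
import numpy as np

def binary_to_ascii(raw_bytes, gain=1.0, dtype="<i2"):
    """Convert raw binary samples to a text column, one value per line.
    Format details (dtype, gain) are assumptions for illustration."""
    samples = np.frombuffer(raw_bytes, dtype=dtype).astype(float) * gain
    out = io.StringIO()
    for v in samples:
        out.write(f"{v:.6f}\n")
    return out.getvalue()

# Simulate a recording: three 16-bit little-endian samples
raw = np.array([100, -50, 2500], dtype="<i2").tobytes()
text = binary_to_ascii(raw, gain=0.001)  # gain: counts -> nA (assumed)
print(text)
```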

  10. Learning-based image preprocessing for robust computer-aided detection

    Science.gov (United States)

    Raghupathi, Laks; Devarakota, Pandu R.; Wolf, Matthias

    2013-03-01

Recent studies have shown that low-dose computed tomography (LDCT) can be an effective screening tool to reduce lung cancer mortality. Computer-aided detection (CAD) would be a beneficial second reader for radiologists in such cases. Studies demonstrate that while iterative reconstructions (IR) improve LDCT diagnostic quality, they significantly degrade CAD performance (increased false positives) when applied directly. For improving CAD performance, solutions such as retraining with newer data or applying a standard preprocessing technique may not suffice due to the high prevalence of CT scanners and non-uniform acquisition protocols. Here, we present a learning-based framework that can adaptively transform a wide variety of input data to boost the performance of an existing CAD system. This not only enhances its robustness but also its applicability in clinical workflows. Our solution consists of automatically applying a suitable pre-processing filter to the given image based on its characteristics. This requires preparing ground truth (GT) identifying, for each image, the filter choice that improves CAD performance. Accordingly, we propose an efficient consolidation process with a novel metric. Using key anatomical landmarks, we then derive consistent feature descriptors for a classification scheme that uses a priority mechanism to automatically choose an optimal preprocessing filter. We demonstrate CAD prototype performance improvement using hospital-scale datasets acquired from North America, Europe and Asia. Though we demonstrated our results for a lung nodule CAD, this scheme is straightforward to extend to other post-processing tools dedicated to other organs and modalities.
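A mechanism that picks a preprocessing filter from an image's own characteristics might look like the following toy version; the statistic, the threshold, and the filter menu are all invented for illustration and bear no relation to the paper's learned classifier:

```python
import numpy as np

def choose_filter(image, noise_threshold=20.0):
    """Toy filter selector: derive a crude noise estimate from
    horizontal pixel differences and map it to a filter name.
    Threshold and filter names are hypothetical."""
    noise = np.std(np.diff(image, axis=1))  # crude noise proxy
    return "smoothing" if noise > noise_threshold else "identity"

rng = np.random.default_rng(1)
smooth_img = np.full((8, 8), 100.0)
noisy_img = smooth_img + 50 * rng.standard_normal((8, 8))
print(choose_filter(smooth_img), choose_filter(noisy_img))  # → identity smoothing
```

The paper replaces the hand-set threshold with feature descriptors and a trained classifier, but the control flow (image statistics in, filter choice out) is the same.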

  11. Data pre-processing for web log mining: Case study of commercial bank website usage analysis

    Directory of Open Access Journals (Sweden)

    Jozef Kapusta

    2013-01-01

We use data cleaning, integration, reduction and data conversion methods at the pre-processing level of data analysis. Data processing techniques improve the overall quality of the patterns mined. The paper describes the use of standard pre-processing methods for preparing data of a commercial bank website, in the form of the log file obtained from the web server. Data cleaning, as the simplest step of data pre-processing, is non-trivial here because the analysed content is highly specific. We had to deal with the problem of frequent changes of the content and even frequent changes of the structure; regular changes in the structure make use of the sitemap impossible. We present approaches for dealing with this problem: we were able to create the sitemap dynamically, based solely on the content of the log file. In this case study, we also examined just one part of the website rather than performing the standard analysis of the entire website, as we did not have access to all log files for security reasons. As a result, the traditional practices had to be adapted for this special case. Analysing just a small fraction of the website resulted in short session times for regular visitors, so we were not able to use the recommended methods to determine the optimal value of the session timeout. Therefore, we propose new methods based on outlier identification for raising the accuracy of the session length.
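Session reconstruction from a web log typically splits one visitor's request stream wherever the gap between consecutive requests exceeds a timeout. The paper's contribution is choosing that timeout via outlier identification; a fixed value stands in for it in this sketch:

```python
def split_sessions(timestamps, timeout):
    """Group one visitor's sorted request timestamps (in seconds)
    into sessions: a gap larger than `timeout` starts a new session.
    The outlier-based timeout selection proposed in the paper is not
    reproduced here; `timeout` is supplied by the caller."""
    sessions, current = [], [timestamps[0]]
    for prev, t in zip(timestamps, timestamps[1:]):
        if t - prev > timeout:
            sessions.append(current)
            current = []
        current.append(t)
    sessions.append(current)
    return sessions

hits = [0, 20, 45, 2000, 2030, 9000]
sessions = split_sessions(hits, timeout=1800)
print(len(sessions))  # → 3
```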

  12. Flexibility and utility of pre-processing methods in converting STXM setups for ptychography - Final Paper

    Energy Technology Data Exchange (ETDEWEB)

    Fromm, Catherine [SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2015-08-20

Ptychography is an advanced diffraction-based imaging technique that can achieve resolution of 5 nm and below. It is done by scanning a sample through a beam of focused x-rays using discrete yet overlapping scan steps. Scattering data is collected on a CCD camera, and the phase of the scattered light is reconstructed with sophisticated iterative algorithms. Because the experimental setup is similar, ptychography setups can be created by retrofitting existing STXM beam lines with new hardware. A further challenge comes in the reconstruction of the collected scattering images. Scattering data must be adjusted and packaged with experimental parameters to calibrate the reconstruction software. The necessary pre-processing of data prior to reconstruction is unique to each beamline setup, and even to the optical alignments used on that particular day. Pre-processing software must be developed to be flexible and efficient in order to allow experimenters appropriate control and freedom in the analysis of their hard-won data. This paper describes the implementation of pre-processing software which successfully connects data collection steps to reconstruction steps, letting the user accomplish accurate and reliable ptychography.

  13. Evaluating the validity of spectral calibration models for quantitative analysis following signal preprocessing.

    Science.gov (United States)

    Chen, Da; Grant, Edward

    2012-11-01

When paired with high-powered chemometric analysis, spectrometric methods offer great promise for the high-throughput analysis of complex systems. Effective classification or quantification often relies on signal preprocessing to reduce spectral interference and optimize the apparent performance of a calibration model. However, less frequently addressed by systematic research is the effect of preprocessing on the statistical accuracy of a calibration result. The present work demonstrates the effectiveness of two criteria for validating the performance of signal preprocessing in multivariate models in the important dimensions of bias and precision. To assess the extent of bias, we explore the applicability of the elliptic joint confidence region (EJCR) test, and we devise a new means to evaluate precision by a bias-corrected root mean square error of prediction. We show how these criteria can effectively gauge the success of signal pretreatments in suppressing spectral interference while providing a straightforward means to determine the optimal level of model complexity. This methodology offers a graphical diagnostic by which to visualize the consequences of pretreatment on complex multivariate models, enabling optimization with greater confidence. To demonstrate the application of the EJCR criterion in this context, we evaluate the validity of representative calibration models using standard pretreatment strategies on three spectral data sets. The results indicate that the proposed methodology facilitates the reliable optimization of a well-validated calibration model, thus improving the capability of spectrophotometric analysis.
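A bias-corrected root mean square error of prediction, in one common construction (the paper's exact formulation may differ), separates the mean offset of the predictions from their residual scatter:

```python
import numpy as np

def bias_corrected_rmsep(y_true, y_pred):
    """Return (bias, corrected RMSEP): RMSEP with the mean bias
    removed, so the second value reflects precision (scatter)
    rather than accuracy (offset). A common construction, given
    here as an assumption about the paper's metric."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    residuals = y_pred - y_true
    bias = residuals.mean()
    return bias, np.sqrt(np.mean((residuals - bias) ** 2))

# A model with a pure constant offset: all error is bias, none is scatter
bias, precision = bias_corrected_rmsep([1, 2, 3, 4], [1.5, 2.5, 3.5, 4.5])
print(bias, precision)  # bias 0.5, corrected RMSEP 0.0
```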

  14. Characterizing the continuously acquired cardiovascular time series during hemodialysis, using median hybrid filter preprocessing noise reduction.

    Science.gov (United States)

    Wilson, Scott; Bowyer, Andrea; Harrap, Stephen B

    2015-01-01

    The clinical characterization of cardiovascular dynamics during hemodialysis (HD) has important pathophysiological implications in terms of diagnostic, cardiovascular risk assessment, and treatment efficacy perspectives. Currently the diagnosis of significant intradialytic systolic blood pressure (SBP) changes among HD patients is imprecise and opportunistic, reliant upon the presence of hypotensive symptoms in conjunction with coincident but isolated noninvasive brachial cuff blood pressure (NIBP) readings. Considering hemodynamic variables as a time series makes a continuous recording approach more desirable than intermittent measures; however, in the clinical environment, the data signal is susceptible to corruption due to both impulsive and Gaussian-type noise. Signal preprocessing is an attractive solution to this problem. Prospectively collected continuous noninvasive SBP data over the short-break intradialytic period in ten patients was preprocessed using a novel median hybrid filter (MHF) algorithm and compared with 50 time-coincident pairs of intradialytic NIBP measures from routine HD practice. The median hybrid preprocessing technique for continuously acquired cardiovascular data yielded a dynamic regression without significant noise and artifact, suitable for high-level profiling of time-dependent SBP behavior. Signal accuracy is highly comparable with standard NIBP measurement, with the added clinical benefit of dynamic real-time hemodynamic information.
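A median hybrid filter combines linear sub-filters through a median. One classic (FMH-style) form, given here as a generic sketch rather than the authors' exact MHF algorithm, takes the median of the backward average, the current sample, and the forward average, which rejects impulsive spikes while following trends:

```python
import numpy as np
from statistics import median

def median_hybrid_filter(x, k=3):
    """FMH-style median hybrid filter: for each sample take the median
    of (mean of k preceding samples, current sample, mean of k
    following samples). Edges are left unfiltered. Window size k is
    an assumption for illustration."""
    x = np.asarray(x, dtype=float)
    y = x.copy()
    for i in range(k, len(x) - k):
        left = x[i - k:i].mean()
        right = x[i + 1:i + k + 1].mean()
        y[i] = median([left, x[i], right])
    return y

# A ramp with one impulsive artifact at index 5
sig = np.array([0, 1, 2, 3, 4, 100, 6, 7, 8, 9], dtype=float)
filtered = median_hybrid_filter(sig, k=3)
print(filtered[5])  # → 7.0 (impulse suppressed)
```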

  15. Foveal processing difficulty does not affect parafoveal preprocessing in young readers

    Science.gov (United States)

    Marx, Christina; Hawelka, Stefan; Schuster, Sarah; Hutzler, Florian

    2017-01-01

    Recent evidence suggested that parafoveal preprocessing develops early during reading acquisition, that is, young readers profit from valid parafoveal information and exhibit a resultant preview benefit. For young readers, however, it is unknown whether the processing demands of the currently fixated word modulate the extent to which the upcoming word is parafoveally preprocessed – as it has been postulated (for adult readers) by the foveal load hypothesis. The present study used the novel incremental boundary technique to assess whether 4th and 6th Graders exhibit an effect of foveal load. Furthermore, we attempted to distinguish the foveal load effect from the spillover effect. These effects are hard to differentiate with respect to the expected pattern of results, but are conceptually different. The foveal load effect is supposed to reflect modulations of the extent of parafoveal preprocessing, whereas the spillover effect reflects the ongoing processing of the previous word whilst the reader’s fixation is already on the next word. The findings revealed that the young readers did not exhibit an effect of foveal load, but a substantial spillover effect. The implications for previous studies with adult readers and for models of eye movement control in reading are discussed. PMID:28139718

  16. Numerical Study of the Effect of Convergency Promoters (CPs) on Flow and Heat Transfer Characteristics with ℓ/D = 0.25 in Staggered Tube Banks

    Directory of Open Access Journals (Sweden)

    Chairunnisa Chairunnisa

    2013-09-01

A compact heat exchanger is a type of heat exchanger widely used in the gas, refrigeration and air-conditioning industries. Its performance depends on the fin surface pattern, namely wavy or straight fins. With straight fins, the flat surface means the flow needs relatively more time for heat transfer to occur than with corrugated wavy fins. Besides changing the surface pattern, heat transfer on straight fins can also be enhanced by adding Convergency Promoters (CPs) to the surface. This research was carried out by numerical simulation with the Fluent 6.3.26 software, using the k-ε RNG turbulence model and a second-order upwind scheme. The tube-diameter-based Reynolds number was varied at 3000, 4000 and 5000, with CP size ℓ/D = 0.25, on tube banks in a staggered arrangement. The working fluid was air, modeled as an ideal gas flowing through the gaps between tubes with an inlet temperature of 347.14 K and a constant tube (water) temperature of 310.5 K. The simulations yielded visualizations of the velocity contours, temperature contours and flow patterns, and confirmed the hypothesis that adding CPs increases heat transfer: the modified model raises the Nusselt number and convection coefficient by 47-63% compared with the baseline model (without CPs).
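The tube-diameter-based Reynolds numbers quoted above fix the inlet velocity once the fluid properties and tube diameter are chosen; a quick back-of-the-envelope check with rough, assumed air properties (the diameter and viscosity below are illustrative guesses, not values from the paper):

```python
def reynolds_number(rho, velocity, diameter, mu):
    """Tube-diameter-based Reynolds number Re = rho * v * D / mu."""
    return rho * velocity * diameter / mu

# Back out the air velocity giving Re = 3000 for an assumed 10 mm tube;
# density and dynamic viscosity are rough values for warm air.
rho, mu, D = 1.0, 2.08e-5, 0.01
v = 3000 * mu / (rho * D)
print(round(v, 2))  # → 6.24
assert abs(reynolds_number(rho, v, D, mu) - 3000) < 1e-6
```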

  17. Comparative characterization of the virulence gene clusters (lipooligosaccharide [LOS] and capsular polysaccharide [CPS]) for Campylobacter coli, Campylobacter jejuni subsp. jejuni and related Campylobacter species

    Science.gov (United States)

    Richards, Vincent P.; Lefébure, Tristan; Pavinski Bitar, Paulina D.; Stanhope, Michael J.

    2013-01-01

Campylobacter jejuni subsp. jejuni and Campylobacter coli are leading causes of gastroenteritis, with virulence linked to cell surface carbohydrate diversity. Although the associated gene clusters are well studied for C. jejuni subsp. jejuni, C. coli has been largely neglected. Here we provide comparative analysis of the lipooligosaccharide (LOS) and capsular polysaccharide (CPS) gene clusters, using genome and cluster sequence data for 36 C. coli strains, 67 C. jejuni subsp. jejuni strains and ten additional Campylobacter species. Similar to C. jejuni subsp. jejuni, C. coli showed high LOS/CPS gene diversity, with each cluster delineated into eight gene content classes. This diversity was predominantly due to extensive gene gain/loss, with the lateral transfer of genes likely occurring both within and between species and also between the LOS and CPS. Additional mechanisms responsible for LOS/CPS diversity included phase-variable homopolymeric repeats, gene duplication/inactivation, and possibly host environment selection pressure. Analyses also showed that (i) strains of C. coli and Campylobacter upsaliensis possessed genes homologous to the sialic acid genes implicated in the neurological disorder Guillain-Barré syndrome (GBS), and (ii) C. coli LOS classes were differentiated between bovine and poultry hosts, potentially aiding post-infection source tracking. PMID:23279811

  18. Comparative characterization of the virulence gene clusters (lipooligosaccharide [LOS] and capsular polysaccharide [CPS]) for Campylobacter coli, Campylobacter jejuni subsp. jejuni and related Campylobacter species.

    Science.gov (United States)

    Richards, Vincent P; Lefébure, Tristan; Pavinski Bitar, Paulina D; Stanhope, Michael J

    2013-03-01

    Campylobacter jejuni subsp. jejuni and Campylobacter coli are leading causes of gastroenteritis, with virulence linked to cell surface carbohydrate diversity. Although the associated gene clusters are well studied for C. jejuni subsp. jejuni, C. coli has been largely neglected. Here we provide comparative analysis of the lipooligosaccharide (LOS) and capsular polysaccharide (CPS) gene clusters, using genome and cluster sequence data for 36 C. coli strains, 67 C. jejuni subsp. jejuni strains and ten additional Campylobacter species. Similar to C. jejuni subsp. jejuni, C. coli showed high LOS/CPS gene diversity, with each cluster delineated into eight gene content classes. This diversity was predominantly due to extensive gene gain/loss, with the lateral transfer of genes likely occurring both within and between species and also between the LOS and CPS. Additional mechanisms responsible for LOS/CPS diversity included phase-variable homopolymeric repeats, gene duplication/inactivation, and possibly host environment selection pressure. Analyses also showed that (i) strains of C. coli and Campylobacter upsaliensis possessed genes homologous to the sialic acid genes implicated in the neurological disorder Guillain-Barré syndrome (GBS), and (ii) C. coli LOS classes were differentiated between bovine and poultry hosts, potentially aiding post infection source tracking. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Analysis of the ent-copalyl diphosphate synthase and ent-kaurene synthase (CPS & KS) gene family in cotton genome databases

    Institute of Scientific and Technical Information of China (English)

    赵亮; 狄佳春; 陈旭升

    2016-01-01

The study focused on the distribution in the cotton genome of ent-copalyl diphosphate synthase and ent-kaurene synthase (CPS & KS), the key enzymes for the biosynthesis of gibberellin (GA). Analysis of the amino acid sequences of the published Arabidopsis thaliana CPS & KS genes showed that this gene family has two conserved structural domains, Terpene_synth and Terpene_synth_C, whose seed files in the Pfam database are PF01397 and PF03936, respectively. Using the Hmmsearch program of HMMER, 72 sequences were retrieved from the Gossypium raimondii genome database; after alignment against the Arabidopsis CPS & KS sequences, five candidate CPS & KS genes were identified in the G. raimondii genome. These five genes are homologous to 10 genes in the At and Dt subgenomes of the tetraploid cotton TM-1 genome. Finally, by comparing transcriptome databases of a GA-sensitive ultra-dwarf cotton mutant before and after GA treatment, two candidate KS genes were found to be up-regulated.

  20. Healthy aging, memory subsystems and regional cerebral oxygen consumption.

    Science.gov (United States)

    Eustache, F; Rioux, P; Desgranges, B; Marchal, G; Petit-Taboué, M C; Dary, M; Lechevalier, B; Baron, J C

    1995-07-01

    The present study was designed to search for concomitant age-related changes in memory subsystems, defined according to current structural theories, and resting oxygen consumption in selected brain regions. We have investigated a sample of subjects between 20 and 68 years of age and strictly screened for their good health. We applied in the same subjects a battery of neuropsychological tests selected to investigate several memory subsystems, and high-resolution positron imaging with stereotaxic localization to study a purposely limited number of cerebral structures, selected on a priori hypotheses to match the different memory subsystems. Our results showed significant age-related changes in performance on some tests, consistent with the literature, including an increase in semantic memory and a decrease in both working memory (central executive system) and verbal episodic and explicit memory. There was also an age-related linear decrease in global brain oxygen consumption which regionally reached statistical significance for the neocortical areas and the left thalamus. There was a limited number of significant, age-independent correlations between the raw psychometric test scores and resting regional oxidative metabolism. Consistent with our present understanding of the functional anatomy of memory, the Associate Learning scores (verbal episodic and explicit memory) were positively correlated with left hippocampal and thalamic metabolism. The positive relationships found between right hippocampal metabolism and performance in the Associate Learning and the Brown-Peterson tests were less expected but would be consistent with findings from recent PET activation studies. The results from this investigation are discussed in the light of current knowledge concerning the neuropsychology and the neurobiology of both aging and memory.

  1. Information measuring subsystem oil pumping station “Parabel”

    Directory of Open Access Journals (Sweden)

    Nyashina Galina S.

    2014-01-01

The information-measurement subsystem of the oil pumping station (OPS) "Parabel", located on the "Alexandrov-Anzhero" section of the trunk pipeline (OJSC "AK Transneft"), is described. It was developed on the basis of modern microprocessor equipment and automation, as well as high-speed digital data channels. It is a simple solution that meets the requirements set out in the guidance document "Automation and remote control of trunk pipelines. General provisions" (RD-35.240.0000-KTN-207-08).

  2. A Higher Order Godunov Method for Radiation Hydrodynamics: Radiation Subsystem

    CERN Document Server

    Sekora, Michael

    2009-01-01

    A higher order Godunov method for the radiation subsystem of radiation hydrodynamics is presented. A key ingredient of the method is the direct coupling of stiff source term effects to the hyperbolic structure of the system of conservation laws; it is composed of a predictor step that is based on Duhamel's principle and a corrector step that is based on Picard iteration. The method is second order accurate in both time and space, unsplit, asymptotically preserving, and uniformly well behaved from the photon free streaming (hyperbolic) limit through the weak equilibrium diffusion (parabolic) limit and to the strong equilibrium diffusion (hyperbolic) limit. Numerical tests demonstrate second order convergence across various parameter regimes.
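The corrector step's Picard iteration can be illustrated on a scalar stiff relaxation source term, a deliberately simplified stand-in for the radiation subsystem's source coupling (in this naive form the iteration only converges when dt/tau is sufficiently small):

```python
def picard_corrector(u0, u_eq, tau, dt, iters=50):
    """Fixed-point (Picard) iteration for the implicit update
    u_new = u0 + dt * (-(u_new - u_eq) / tau) of a stiff relaxation
    source term du/dt = -(u - u_eq)/tau. Illustrative only; the
    paper couples this to the hyperbolic update via a Duhamel-based
    predictor."""
    u = u0
    for _ in range(iters):
        u = u0 - dt * (u - u_eq) / tau
    return u

# Relax u toward equilibrium 0 over one step; exact fixed point
# of the implicit update is u0 / (1 + dt/tau) = 2/3 here.
u = picard_corrector(u0=1.0, u_eq=0.0, tau=1.0, dt=0.5)
print(round(u, 6))  # → 0.666667
```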

  3. Initial diagnostics commissioning results for the APS injector subsystems

    Science.gov (United States)

    Lumpkin, A.; Chung, Y.; Kahana, E.; Patterson, D.; Sellyey, W.; Smith, T.; Wang, X.

    1995-05-01

    In recent months the first beams have been introduced into the various injector subsystems of the Advanced Photon Source (APS). An overview will be given of the diagnostics results on beam profiling, beam position monitors (BPMs), loss rate monitors (LRMs), current monitors (CMs), and photon monitors on the low energy transport lines, positron accumulator ring (PAR), and injector synchrotron (IS). Initial measurements have been done with electron beams at energies from 250 to 450 MeV and 50 to 400 pC per macrobunch. Operations in single turn and stored beam conditions were diagnosed in the PAR and IS.

  4. Recent developments for the Large Binocular Telescope Guiding Control Subsystem

    Science.gov (United States)

    Golota, T.; De La Peña, M. D.; Biddick, C.; Lesser, M.; Leibold, T.; Miller, D.; Meeks, R.; Hahn, T.; Storm, J.; Sargent, T.; Summers, D.; Hill, J.; Kraus, J.; Hooper, S.; Fisher, D.

    2014-07-01

    The Large Binocular Telescope (LBT) has eight Acquisition, Guiding, and wavefront Sensing Units (AGw units). They provide guiding and wavefront sensing capability at eight different locations at both direct and bent Gregorian focal stations. Recent additions of focal stations for PEPSI and MODS instruments doubled the number of focal stations in use including respective motion, camera controller server computers, and software infrastructure communicating with Guiding Control Subsystem (GCS). This paper describes the improvements made to the LBT GCS and explains how these changes have led to better maintainability and contributed to increased reliability. This paper also discusses the current GCS status and reviews potential upgrades to further improve its performance.

  5. Electric and hybrid vehicles environmental control subsystem study

    Science.gov (United States)

    1981-01-01

    An environmental control subsystem (ECS) for the passenger compartment of electric and hybrid vehicles is studied, along with various methods of obtaining the desired temperature control for the battery pack. The functional requirements of ECS equipment are defined. Following categorization by methodology, technology availability, and risk, all viable ECS concepts are evaluated. Each is assessed independently for benefits versus risk, as well as for its feasibility for short-, intermediate-, and long-term product development. Selection of the preferred concept is made against these requirements, as well as the study's major goal of providing safe, highly efficient, and thermally comfortable ECS equipment.

  6. Stability conditions of complex switched systems with unstable subsystems

    Institute of Scientific and Technical Information of China (English)

    肖扬

    2004-01-01

    New stability conditions for complex switched systems are presented. We propose the concepts of attractive region and semi-attractive region, which are used as tools for analyzing the stability of switched systems with unstable subsystems. Based on the attractive region, less conservative sufficient conditions for the stability of switched systems are established; there is no requirement that all members of the system set be stable. Since our results consider and utilize the decreasing span of the oscillating solutions of switched systems, they are more practical than previously presented stability results for switched systems, and they do not resort to multiple Lyapunov functions.

  7. Electric and hybrid vehicle environmental control subsystem study

    Science.gov (United States)

    Heitner, K. L.

    1980-01-01

    An environmental control subsystem (ECS) in electric and hybrid vehicles is studied. A combination of a combustion heater and gasoline engine (Otto cycle) driven vapor compression air conditioner is selected. The combustion heater, the small gasoline engine, and the vapor compression air conditioner are commercially available. These technologies have good cost and performance characteristics. The cost for this ECS is relatively close to the cost of current ECS's. Its effect on the vehicle's propulsion battery is minimal and the ECS size and weight do not have significant impact on the vehicle's range.

  8. The New York Public Library Automated Book Catalog Subsystem

    Directory of Open Access Journals (Sweden)

    S. Michael Malinconico

    1973-03-01

    Full Text Available A comprehensive automated bibliographic control system has been developed by the New York Public Library. This system is unique in its use of an automated authority system and highly sophisticated machine filing algorithms. The primary aim was the rigorous control of established forms and their cross-reference structure. The original impetus for creation of the system, and its most highly visible product, is a photocomposed book catalog. The book catalog subsystem supplies automatic punctuation of condensed entries and is able to produce cumulation/supplement book catalogs in installments without loss of control of the cross-referencing structure.

  9. Data Management Applications for the Service Preparation Subsystem

    Science.gov (United States)

    Luong, Ivy P.; Chang, George W.; Bui, Tung; Allen, Christopher; Malhotra, Shantanu; Chen, Fannie C.; Bui, Bach X.; Gutheinz, Sandy C.; Kim, Rachel Y.; Zendejas, Silvino C.

    2009-01-01

    These software applications provide intuitive User Interfaces (UIs) with a consistent look and feel for interaction with, and control of, the Service Preparation Subsystem (SPS). The elements of the UIs described here are the File Manager, Mission Manager, and Log Monitor applications. All UIs provide access to add/delete/update data entities in a complex database schema without requiring technical expertise on the part of the end users. These applications allow for safe, validated, catalogued input of data. Also, the software has been designed in multiple, coherent layers to promote ease of code maintenance and reuse in addition to reducing testing and accelerating maturity.

  10. Antenna and RF Subsystem Integration in Cellular Communications

    DEFF Research Database (Denmark)

    Herrero, Pablo; Bahramzy, Pevand; Svendsen, Simon

    2014-01-01

    We discuss in this article a number of techniques that can be used to improve the RF performance of a mobile device. All of these techniques rely on tight antenna and modem subsystem codesign. In a short introduction, the article outlines the need for these techniques, based on the advent of new wireless standards and demanding power and size requirements. The first technique is based on integrating the antenna as part of the RF filtering chain to relax the requirements of current duplex filters by up to 30 dB off-band. We also outline a discussion of the different approaches for adaptive antenna…

  11. Subseasonal prediction of the heat wave of December 2013 in Southern South America by the POAMA and BCC-CPS models

    Science.gov (United States)

    Osman, Marisol; Alvarez, Mariano S.

    2017-03-01

    The prediction skill of subseasonal forecast models is evaluated for a strong and long-lasting heat wave that occurred in December 2013 over Southern South America. Reforecasts from two models participating in the WCRP/WWRP Subseasonal to Seasonal project, the Bureau of Meteorology POAMA model and the Beijing Climate Center BCC-CPS model, were considered to evaluate their skill in forecasting temperature and circulation anomalies during that event. The POAMA reforecast, of 32-member ensemble size and initialized every five days, and the BCC-CPS reforecast, of 4-member ensemble size for the same dates as POAMA plus the previous 4 days, were considered. Weekly ensemble-mean forecasts were computed with lead times from 2 days up to 24 days, every 5 days. Weekly anomalies were calculated for observations from 13 to 31 December 2013. Anomalies for both observations and reforecasts were calculated with respect to their own climatologies. Results show that the ensemble-mean warm anomalies forecast for weeks 1 and 2 of the heat wave were more similar to the observations for the POAMA model, especially at longer leads. The BCC-CPS performed better for leads shorter than 7 (14) days for week 1 (2). For week 3 the BCC-CPS outperformed the POAMA model, particularly at shorter leads, locating the maxima of the anomalies more accurately. In a probabilistic approach, POAMA predicted the exceedance of the upper tercile of temperature anomalies with a higher chance than BCC-CPS for almost every week and lead time. The forecast circulation anomalies over South America could be used to explain the location of the highest temperature anomalies. In summary, for this case, the models' skill in forecasting surface temperature in the context of a heat wave was moderate at lead times longer than a fortnight. However, this study is limited to a model-to-model analysis, and a multi-model ensemble strategy might increase the skill.
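
    The weekly-anomaly computation described above can be sketched with a toy hindcast. The array names, sizes, and values below are invented for illustration and do not reflect the S2S archive layout.

```python
import numpy as np

# Hypothetical arrays: hindcast[year, day] daily temperatures and one real-time
# forecast; a model's own climatology is its hindcast mean, as in the study.
rng = np.random.default_rng(0)
hindcast = rng.normal(20.0, 2.0, size=(15, 21))   # 15 hindcast years, 21 days
forecast = hindcast.mean(axis=0) + 3.0            # a warm event, 3 K above climatology

def weekly_anomalies(daily, clim_daily):
    """Average daily anomalies into consecutive 7-day weeks."""
    anom = daily - clim_daily
    return anom.reshape(-1, 7).mean(axis=1)

clim = hindcast.mean(axis=0)         # model climatology from the reforecasts
weeks = weekly_anomalies(forecast, clim)
```

For the synthetic warm event above, each of the three weekly anomalies comes out at exactly 3 K, since the forecast was constructed as climatology plus a uniform offset.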

  12. Application of the CPS sulfur recovery process in a processing plant in the Tazhong area

    Institute of Scientific and Technical Information of China (English)

    曹强

    2014-01-01

    A processing plant in the Tazhong area recovers sulfur using a split-flow low-temperature Claus process: one conventional Claus stage followed by three low-temperature Claus stages completes the recovery of elemental sulfur and the treatment of the tail gas, ensuring a sulfur recovery rate above 99.25% and compliant exhaust emissions. The process features of the unit are briefly introduced, and the problems encountered during production, together with the corresponding countermeasures, are analyzed and explained, demonstrating that the CPS sulfur recovery process is well suited to the actual conditions of this processing plant.

  13. Forecasting of cyclone Viyaru and Phailin by NWP-based cyclone prediction system (CPS) of IMD – an evaluation

    Indian Academy of Sciences (India)

    S D Kotal; S K Bhattacharya; S K Roy Bhowmik; P K Kundu

    2014-10-01

    An objective NWP-based cyclone prediction system (CPS) was implemented for the operational cyclone forecasting work over the Indian seas. The method comprises five forecast components, namely (a) Cyclone Genesis Potential Parameter (GPP), (b) Multi-Model Ensemble (MME) technique for cyclone track prediction, (c) cyclone intensity prediction, (d) rapid intensification, and (e) prediction of decaying intensity after landfall. GPP is derived from dynamical and thermodynamical parameters in the model output of the IMD operational Global Forecast System. The MME technique for cyclone track prediction is based on multiple linear regression. The predictors selected for the MME are the forecast latitude and longitude positions of the cyclone at 12-hr intervals, up to 120-hour forecasts, from five NWP models, namely IMD-GFS, IMD-WRF, NCEP-GFS, UKMO, and JMA. A statistical cyclone intensity prediction (SCIP) model for predicting 12-hourly cyclone intensity (up to 72 hours) is developed by applying multiple linear regression. Various dynamical and thermodynamical predictors are derived from the model outputs of the IMD operational Global Forecast System, and these parameters are also used for the prediction of rapid intensification. For forecasting inland winds after the landfall of a cyclone, an empirical technique is developed. This paper briefly describes the forecast system CPS and evaluates its performance for two recent cyclones, Viyaru (non-intensifying) and Phailin (rapidly intensifying), opposite in nature in terms of track and intensity, which formed over the Bay of Bengal in 2013. The evaluation of performance shows that the GPP analysis at early stages of development of a low pressure system indicated the potential of the system for further intensification. The 12-hourly track forecasts by MME, intensity forecasts by the SCIP model, and rapid intensification forecasts are found to be consistent and very useful to the operational forecasters.
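
    As a toy illustration of the MME idea (multiple linear regression of observed positions on the positions forecast by several models), the following sketch fits regression weights to synthetic latitude forecasts. The data, biases, and weights are invented, not taken from the paper.

```python
import numpy as np

# Synthetic stand-in for the five track-prediction models (IMD-GFS, IMD-WRF,
# NCEP-GFS, UKMO, JMA): each column of X is one model's forecast latitude.
rng = np.random.default_rng(1)
n_cases, n_models = 40, 5
truth = rng.uniform(8.0, 20.0, n_cases)                         # observed latitudes
X = truth[:, None] + rng.normal(0.0, 0.8, (n_cases, n_models))  # noisy model tracks
X[:, 0] += 1.0                                                  # one model is biased

# Multiple linear regression: intercept plus one weight per model.
A = np.column_stack([np.ones(n_cases), X])
coef, *_ = np.linalg.lstsq(A, truth, rcond=None)

mme = A @ coef                                    # consensus (MME) forecast
raw_rmse = np.sqrt(np.mean((X.mean(axis=1) - truth) ** 2))
mme_rmse = np.sqrt(np.mean((mme - truth) ** 2))
```

Because the plain model average is itself one point in the space of linear combinations the regression searches, the fitted MME can never do worse than the average in-sample; with the biased model it does strictly better.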

  14. HEp-2 Cell Classification: The Role of Gaussian Scale Space Theory as A Pre-processing Approach

    OpenAIRE

    Qi, Xianbiao; Zhao, Guoying; Chen, Jie; Pietikäinen, Matti

    2015-01-01

    Indirect Immunofluorescence Imaging of Human Epithelial Type 2 (HEp-2) cells is an effective way to identify the presence of Anti-Nuclear Antibody (ANA). Most existing works on HEp-2 cell classification mainly focus on feature extraction, feature encoding and classifier design. Very few efforts have been devoted to study the importance of the pre-processing techniques. In this paper, we analyze the importance of the pre-processing, and investigate the role of Gaussian Scale Space (GS...

  15. Analysis of the Stereospecificity of the Diterpene Synthases IeCPS from Isodon eriocalyx

    Institute of Scientific and Technical Information of China (English)

    杜刚; 龚海艳; 高娟; 付小莉; 卢山; 曾英

    2015-01-01

    Ent-kaurane diterpenoids, isolated from the Chinese medicinal herbs Isodon L., are principal components showing potent antitumor and anti-autoimmune-inflammation bioactivities. Despite the large number of ent-kaurane diterpenoids isolated from Isodon plants, little is known about the enzymatic machinery involved in their biosynthesis. IeCPS1 and IeCPS2, encoding copalyl diphosphate synthases (CPS), were previously cloned from I. eriocalyx leaves and functionally characterized. Here we verified that IeCPS1 and IeCPS2 are ent-CPSs that produce CPP of ent-stereochemistry, by GC-MS analysis of the enzymatic products from coupled reactions with a known Arabidopsis ent-kaurene synthase stereospecific for ent-CPP.

  16. Multi-agent Based Charges subsystem for Supply Chain Logistics

    Directory of Open Access Journals (Sweden)

    Pankaj Rani

    2012-05-01

    Full Text Available The main objective of this paper is to design a charges subsystem using multi-agent technology, which deals with the calculation, accrual, and collection of the various charges levied on goods in supply chain logistics. Accrual of charges such as freight, demurrage, and wharfage takes place implicitly in the SC system at various events of different subsystems, and these charges are calculated and collected by software agents. Agent-based modeling is an approach based on the idea that a system is composed of decentralized individual ‘agents’ and that each agent interacts with other agents according to its localized knowledge. Our aim is to design a flexible architecture, based on a multi-agent architecture, that can deal with next-generation supply chain problems. In this article, a multi-agent system has been developed to calculate the charges levied at various stages on goods sheds. Each entity is modeled as one agent, and their coordination leads to controlled inventories and minimizes the total cost of the SC by sharing information and forecasting knowledge and by using a negotiation mechanism.
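
    The event-driven accrual described above can be sketched with two toy charge agents. The rates, free time, and event fields below are invented for illustration and are not taken from the paper's system.

```python
# Each charge "agent" inspects every event and accrues only the charges it
# is responsible for; hypothetical rates and rules for the sketch.
class FreightAgent:
    RATE_PER_TON_KM = 2.0

    def accrue(self, event):
        if event["type"] == "dispatch":
            return self.RATE_PER_TON_KM * event["tons"] * event["km"]
        return 0.0

class DemurrageAgent:
    FREE_HOURS = 9          # free time before demurrage starts
    RATE_PER_HOUR = 150.0

    def accrue(self, event):
        if event["type"] == "release":
            extra = max(0, event["hours_held"] - self.FREE_HOURS)
            return self.RATE_PER_HOUR * extra
        return 0.0

def total_charges(events, agents):
    # Charges accrue independently: every agent sees every event.
    return sum(agent.accrue(e) for e in events for agent in agents)
```

A dispatch of 10 tons over 100 km plus a release after 12 hours would accrue 2000.0 in freight and 450.0 in demurrage under these invented rates.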

  17. Optimisation study of a vehicle bumper subsystem with fuzzy parameters

    Science.gov (United States)

    Farkas, L.; Moens, D.; Donders, S.; Vandepitte, D.

    2012-10-01

    This paper deals with the design and optimisation for crashworthiness of a vehicle bumper subsystem, which is a key scenario for vehicle component design. The automotive manufacturers and suppliers have to find optimal design solutions for such subsystems that comply with the conflicting requirements of the regulatory bodies regarding functional performance (safety and repairability) and regarding the environmental impact (mass). For the bumper design challenge, an integrated methodology for multi-attribute design engineering of mechanical structures is set up. The integrated process captures the various tasks that are usually performed manually, this way facilitating the automated design iterations for optimisation. Subsequently, an optimisation process is applied that takes the effect of parametric uncertainties into account, such that the system level of failure possibility is acceptable. This optimisation process is referred to as possibility-based design optimisation and integrates the fuzzy FE analysis applied for the uncertainty treatment in crash simulations. This process is the counterpart of the reliability-based design optimisation used in a probabilistic context with statistically defined parameters (variabilities).

  18. Rotating field collector subsystem phase 1 study and evaluation

    Science.gov (United States)

    Jones, D.; Eibling, J. A.

    1982-10-01

    The rotating field collector system is an alternative concept in which all heliostats are mounted on a single large platform which rotates around a tower to track the azimuthal angle of the Sun. Each heliostat is mounted to the platform with appropriate pivots, linkage, and controls to provide the additional positioning required to properly direct the solar radiation onto the receiver. The results of the first phase of a study to investigate the technical and economic merits of a particular type of rotating field collector subsystem are presented. The large pie-shaped platform would revolve over an array of support pedestals by means of a roller at the top of each pedestal. Several heliostats were built to demonstrate their construction features, and the operation of both flat and amphitheater rotating fields was studied. Work included an analysis of the concepts, development of modifications and additions to make the system comply with design criteria, and cost estimates to be used for comparison with other heliostat subsystems. Because the cost estimates were considerably high, a large part of the study was directed toward developing lower cost designs of major components.

  19. Lunar Advanced Volatile Analysis Subsystem: Pressure Transducer Trade Study

    Science.gov (United States)

    Kang, Edward Shinuk

    2017-01-01

    In Situ Resource Utilization (ISRU) is a key factor in paving the way for the future of human space exploration. The ability to harvest resources on foreign astronomical objects to produce consumables and propellant offers potential reduction in mission cost and risk. Through previous missions, the existence of water ice at the poles of the moon has been identified, however the feasibility of water extraction for resources remains unanswered. The Resource Prospector (RP) mission is currently in development to provide ground truth, and will enable us to characterize the distribution of water at one of the lunar poles. Regolith & Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) is the primary payload on RP that will be used in conjunction with a rover. RESOLVE contains multiple instruments for systematically identifying the presence of water. The main process involves the use of two systems within RESOLVE: the Oxygen Volatile Extraction Node (OVEN) and Lunar Advanced Volatile Analysis (LAVA). Within the LAVA subsystem, there are multiple calculations that depend on accurate pressure readings. One of the most important instances where pressure transducers (PT) are used is for calculating the number of moles in a gas transfer from the OVEN subsystem. As a critical component of the main process, a mixture of custom and commercial off the shelf (COTS) PTs are currently being tested in the expected operating environment to eventually down select an option for integrated testing in the LAVA engineering test unit (ETU).

  20. Pre-Processing Effect on the Accuracy of Event-Based Activity Segmentation and Classification through Inertial Sensors

    Directory of Open Access Journals (Sweden)

    Benish Fida

    2015-09-01

    Full Text Available Inertial sensors are increasingly being used to recognize and classify physical activities in a variety of applications. For monitoring and fitness applications, it is crucial to develop methods able to segment each activity cycle, e.g., a gait cycle, so that the successive classification step may be more accurate. To increase detection accuracy, pre-processing is often used, with a concurrent increase in computational cost. In this paper, the effect of pre-processing operations on the detection and classification of locomotion activities was investigated, to check whether the presence of pre-processing significantly contributes to an increase in accuracy. The pre-processing stages evaluated in this study were inclination correction and de-noising. Level walking, step ascending, descending and running were monitored by using a shank-mounted inertial sensor. Raw and filtered segments, obtained from a modified version of a rule-based gait detection algorithm optimized for sequential processing, were processed to extract time and frequency-based features for physical activity classification through a support vector machine classifier. The proposed method accurately detected >99% gait cycles from raw data and produced >98% accuracy on these segmented gait cycles. Pre-processing did not substantially increase classification accuracy, thus highlighting the possibility of reducing the amount of pre-processing for real-time applications.
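
    The effect of the de-noising stage can be illustrated on a synthetic oscillatory signal. The moving-average filter, threshold, and signal below are invented stand-ins for the paper's actual pre-processing and rule-based detection.

```python
import numpy as np

# Toy shank angular-velocity signal: a ~1 Hz "gait" oscillation plus noise.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 1000)
signal = np.sin(2 * np.pi * 1.0 * t)
noisy = signal + rng.normal(0.0, 0.3, t.size)

def moving_average(x, w=15):
    """Simple de-noising: w-sample moving-average filter."""
    return np.convolve(x, np.ones(w) / w, mode="same")

def count_cycles(x, thresh=0.5):
    """Rule-based detection: count upward threshold crossings (one per cycle)."""
    above = x > thresh
    return int(np.sum(~above[:-1] & above[1:]))

filtered = moving_average(noisy)
```

On the noisy trace the threshold rule fires spuriously near each crossing, while on the filtered trace the count comes back close to the true ten cycles, which mirrors the paper's point that de-noising mainly helps the segmentation step.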

  1. Complex and magnitude-only preprocessing of 2D and 3D BOLD fMRI data at 7 T.

    Science.gov (United States)

    Barry, Robert L; Strother, Stephen C; Gore, John C

    2012-03-01

    A challenge to ultra high field functional magnetic resonance imaging is the predominance of noise associated with physiological processes unrelated to tasks of interest. This degradation in data quality may be partially reversed using a series of preprocessing algorithms designed to retrospectively estimate and remove the effects of these noise sources. However, such algorithms are routinely validated only in isolation, and thus consideration of their efficacies within realistic preprocessing pipelines and on different data sets is often overlooked. We investigate the application of eight possible combinations of three pseudo-complementary preprocessing algorithms - phase regression, Stockwell transform filtering, and retrospective image correction - to suppress physiological noise in 2D and 3D functional data at 7 T. The performance of each preprocessing pipeline was evaluated using data-driven metrics of reproducibility and prediction. The optimal preprocessing pipeline for both 2D and 3D functional data included phase regression, Stockwell transform filtering, and retrospective image correction. This result supports the hypothesis that a complex preprocessing pipeline is preferable to a magnitude-only pipeline, and suggests that functional magnetic resonance imaging studies should retain complex images and externally monitor subjects' respiratory and cardiac cycles so that these supplementary data may be used to retrospectively reduce noise and enhance overall data quality.
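
    A minimal sketch of phase regression, one of the three algorithms named above: the component of a voxel's magnitude time series that is linearly explained by its phase time series is estimated by least squares and removed. The signals, noise model, and scales below are invented for illustration.

```python
import numpy as np

# Synthetic voxel: the phase time series tracks physiological noise, so
# regressing it out of the magnitude suppresses the noise but not the task.
rng = np.random.default_rng(3)
n_t = 200
task = np.sin(np.linspace(0, 8 * np.pi, n_t))        # BOLD-like signal of interest
physio = rng.normal(0.0, 1.0, n_t).cumsum() / 10.0   # slow physiological drift
magnitude = task + 2.0 * physio
phase = physio + rng.normal(0.0, 0.01, n_t)          # phase mirrors the physio noise

# Least-squares fit of magnitude on [intercept, phase], then remove the fit.
A = np.column_stack([np.ones(n_t), phase])
beta, *_ = np.linalg.lstsq(A, magnitude, rcond=None)
cleaned = magnitude - A @ beta
```

After regression the cleaned series correlates with the task waveform much more strongly than the raw magnitude does, which is the sense in which such a step "retrospectively estimates and removes" physiological noise.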

  2. On the subsystem formulation of linear-response time-dependent DFT.

    Science.gov (United States)

    Pavanello, Michele

    2013-05-28

    A new and thorough derivation of linear-response subsystem time-dependent density functional theory (TD-DFT) is presented and analyzed in detail. Two equivalent derivations are presented and naturally yield self-consistent subsystem TD-DFT equations. One reduces to the subsystem TD-DFT formalism of Neugebauer [J. Chem. Phys. 126, 134116 (2007)]. The other yields Dyson type equations involving three types of subsystem response functions: coupled, uncoupled, and Kohn-Sham. The Dyson type equations for subsystem TD-DFT are derived here for the first time. The response function formalism reveals previously hidden qualities and complications of subsystem TD-DFT compared with the regular TD-DFT of the supersystem. For example, analysis of the pole structure of the subsystem response functions shows that each function contains information about the electronic spectrum of the entire supersystem. In addition, comparison of the subsystem and supersystem response functions shows that, while the correlated response is subsystem additive, the Kohn-Sham response is not. Comparison with the non-subjective partition DFT theory shows that this non-additivity is largely an artifact introduced by the subjective nature of the density partitioning in subsystem DFT.
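
    Schematically, the Dyson-type equations referred to above couple the response functions of the subsystems; a generic hedged form (the symbols and the partitioning of the kernel are illustrative, not the paper's exact notation) is:

```latex
% Coupled response \chi^{c}_{I} of subsystem I in terms of the uncoupled
% responses \chi^{u}_{J}; the kernel K_{IJ} collects Hartree,
% exchange-correlation, and non-additive kinetic contributions.
\chi^{c}_{I}(\omega) \;=\; \chi^{u}_{I}(\omega)
  \;+\; \sum_{J} \chi^{u}_{I}(\omega)\, K_{IJ}\, \chi^{c}_{J}(\omega)
```

The sum over all subsystems J is what gives each coupled subsystem response access to poles of the entire supersystem, as noted in the abstract.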

  3. THE ANALYSIS OF BEEF CATTLE SUBSYSTEM AGRIBUSINESS IMPLEMENTATION IN CENTRAL JAVA PROVINCE, INDONESIA

    Directory of Open Access Journals (Sweden)

    T. Ekowati

    2011-12-01

    Full Text Available The study aimed to analyze the implementation of subsystem agribusiness in beef cattle farming in Central Java. Five districts (Rembang, Blora, Grobogan, Boyolali and Wonogiri) were purposively chosen based on the value of the Location Quotient (LQ). Forty respondents from each district were chosen using quota sampling. Data were analyzed through a Structural Equation Model (SEM). The results showed that each agribusiness subsystem had an adequate potential score: 0.693, 0.721, 0.684, 0.626, and 0.691 for the up-stream subsystem, on-farm, down-stream subsystem, marketing, and supporting institutions, respectively. The results showed that the SEM model was feasible, with Chi-Square value = 0.952, RMSEA = 0.000, probability = 0.621, and TLI = 1.126. The significant Critical Ratio (CR) results were: up-stream subsystem to on-farm agribusiness; on-farm subsystem to down-stream agribusiness; down-stream subsystem to the farmer’s income; marketing subsystem to up-stream agribusiness; and supporting institutions to the marketing subsystem and down-stream agribusiness. The research concluded that the implementation of the beef cattle subsystem agribusiness had an adequate index and a positive effect on beef cattle agribusiness.

  4. The development of the intrinsic functional connectivity of default network subsystems from age 3 to 5.

    Science.gov (United States)

    Xiao, Yaqiong; Zhai, Hongchang; Friederici, Angela D; Jia, Fucang

    2016-03-01

    In recent years, research on human functional brain imaging using resting-state fMRI techniques has been increasingly prevalent. The term "default mode" was proposed to describe a baseline or default state of the brain during rest. Recent studies suggested that the default mode network (DMN) is comprised of two functionally distinct subsystems: a dorsal-medial prefrontal cortex (DMPFC) subsystem involved in self-oriented cognition (i.e., theory of mind) and a medial temporal lobe (MTL) subsystem engaged in memory and scene construction; both subsystems interact with the anterior medial prefrontal cortex (aMPFC) and posterior cingulate (PCC) as the core regions of DMN. The present study explored the development of DMN core regions and these two subsystems in both hemispheres from 3- to 5-year-old children. The analysis of the intrinsic activity showed strong developmental changes in both subsystems, and significant changes were specifically found in MTL subsystem, but not in DMPFC subsystem, implying distinct developmental trajectories for DMN subsystems. We found stronger interactions between the DMPFC and MTL subsystems in 5-year-olds, particularly in the left subsystems that support the development of environmental adaptation and relatively complex mental activities. These results also indicate that there is stronger right hemispheric lateralization at age 3, which then changes as bilateral development gradually increases through to age 5, suggesting in turn the hemispheric dominance in DMN subsystems changing with age. The present results provide primary evidence for the development of DMN subsystems in early life, which might be closely related to the development of social cognition in childhood.

  5. LytR-CpsA-Psr enzymes as determinants of Bacillus anthracis secondary cell wall polysaccharide assembly.

    Science.gov (United States)

    Liszewski Zilla, Megan; Chan, Yvonne G Y; Lunderberg, Justin Mark; Schneewind, Olaf; Missiakas, Dominique

    2015-01-01

    Bacillus anthracis, the causative agent of anthrax, replicates as chains of vegetative cells by regulating the separation of septal peptidoglycan. Surface (S)-layer proteins and associated proteins (BSLs) function as chain length determinants and bind to the secondary cell wall polysaccharide (SCWP). In this study, we identified the B. anthracis lcpD mutant, which displays increased chain length and S-layer assembly defects due to diminished SCWP attachment to peptidoglycan. In contrast, the B. anthracis lcpB3 variant displayed reduced cell size and chain length, which could be attributed to increased deposition of BSLs. In other bacteria, LytR-CpsA-Psr (LCP) proteins attach wall teichoic acid (WTA) and polysaccharide capsule to peptidoglycan. B. anthracis does not synthesize these polymers, yet its genome encodes six LCP homologues, which, when expressed in S. aureus, promote WTA attachment. We propose a model whereby B. anthracis LCPs promote attachment of SCWP precursors to discrete locations in the peptidoglycan, enabling BSL assembly and regulated separation of septal peptidoglycan.

  6. EARLINET Single Calculus Chain - technical - Part 1: Pre-processing of raw lidar data

    Science.gov (United States)

    D'Amico, Giuseppe; Amodeo, Aldo; Mattis, Ina; Freudenthaler, Volker; Pappalardo, Gelsomina

    2016-02-01

    In this paper we describe an automatic tool for the pre-processing of aerosol lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of ELPP, particular attention has been paid to making the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of ELPP is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of ELPP. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. ELPP has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
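
    Two of the corrections named above, dead-time correction and background subtraction, can be sketched as follows. The non-paralyzable dead-time model, the 4 ns dead time, and the profile values are illustrative assumptions, not ELPP's actual implementation.

```python
import numpy as np

def deadtime_correct(count_rate_mhz, tau_ns=4.0):
    """Non-paralyzable dead-time model: N_true = N_meas / (1 - N_meas * tau)."""
    tau_us = tau_ns * 1e-3              # MHz * microseconds is dimensionless
    return count_rate_mhz / (1.0 - count_rate_mhz * tau_us)

def subtract_background(profile, n_bg_bins=100):
    """Estimate the background from the far-range tail and subtract it."""
    bg = profile[-n_bg_bins:].mean()
    return profile - bg

# Toy near-range photon-counting samples, in MHz.
measured = np.array([10.0, 5.0, 2.0, 1.0])
corrected = deadtime_correct(measured)
```

With a 4 ns dead time, a measured 10 MHz count rate is corrected upward to 10 / 0.96 ≈ 10.42 MHz; a constant far-range background is removed exactly by construction.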

  7. AN ENHANCED PRE-PROCESSING RESEARCH FRAMEWORK FOR WEB LOG DATA USING A LEARNING ALGORITHM

    Directory of Open Access Journals (Sweden)

    V.V.R. Maheswara Rao

    2011-01-01

    Full Text Available With the continued growth and proliferation of Web services and Web-based information systems, the volumes of user data have reached astronomical proportions. Before such data can be analyzed with web mining techniques, the web log has to be pre-processed, integrated, and transformed. As the World Wide Web grows continuously and rapidly, web miners need intelligent tools to find, extract, filter, and evaluate the desired information. The data pre-processing stage is the most important phase for investigating web user usage behaviour, and it requires extracting only the human user accesses from the web log data, a critical and complex task. Because the web log is incremental in nature, conventional data pre-processing techniques have proved unsuitable, and an extensive learning algorithm is required to obtain the desired information. This paper introduces an extensive research framework capable of pre-processing web log data completely and efficiently. The learning algorithm of the proposed research framework separates human user and search engine accesses intelligently, in less time. To create suitable target data, the further essential pre-processing tasks of data cleansing, user identification, sessionization, and path completion are designed collectively. The framework reduces the error rate and significantly improves the learning performance of the algorithm. The work ensures the goodness of a split by using popular measures such as entropy and the Gini index. This framework helps to investigate web user usage behaviour efficiently. Experimental results supporting these claims are given in this paper.
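The entropy and Gini measures named in the abstract are standard impurity criteria; a minimal sketch of how a framework might score a split of web-log accesses (the labels and the split below are hypothetical, not the paper's data):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a label multiset, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gain(parent, children, impurity=entropy):
    """Impurity reduction achieved by splitting `parent` into `children`."""
    n = len(parent)
    return impurity(parent) - sum(len(ch) / n * impurity(ch) for ch in children)

# Hypothetical labels: separating human-user from search-engine accesses
accesses = ["human"] * 6 + ["robot"] * 6
split = [["human"] * 5 + ["robot"], ["robot"] * 5 + ["human"]]
gain = split_gain(accesses, split)             # entropy-based information gain
gini_gain = split_gain(accesses, split, gini)  # same split, Gini criterion
```

A perfectly mixed parent has entropy 1 bit; the mostly-clean split above recovers about a third of it.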

  8. EARLINET Single Calculus Chain – technical – Part 1: Pre-processing of raw lidar data

    Directory of Open Access Journals (Sweden)

    G. D'Amico

    2015-10-01

    Full Text Available In this paper we describe an automatic tool for the pre-processing of lidar data called ELPP (EARLINET Lidar Pre-Processor). It is one of two calculus modules of the EARLINET Single Calculus Chain (SCC), the automatic tool for the analysis of EARLINET data. The ELPP is an open source module that executes instrumental corrections and data handling of the raw lidar signals, making the lidar data ready to be processed by the optical retrieval algorithms. According to the specific lidar configuration, the ELPP automatically performs dead-time correction, atmospheric and electronic background subtraction, gluing of lidar signals, and trigger-delay correction. Moreover, the signal-to-noise ratio of the pre-processed signals can be improved by means of configurable time integration of the raw signals and/or spatial smoothing. The ELPP delivers the statistical uncertainties of the final products by means of error propagation or Monte Carlo simulations. During the development of the ELPP module, particular attention has been paid to make the tool flexible enough to handle all lidar configurations currently used within the EARLINET community. Moreover, it has been designed in a modular way to allow an easy extension to lidar configurations not yet implemented. The primary goal of the ELPP module is to enable the application of quality-assured procedures in the lidar data analysis starting from the raw lidar data. This provides the added value of full traceability of each delivered lidar product. Several tests have been performed to check the proper functioning of the ELPP module. The whole SCC has been tested with the same synthetic data sets, which were used for the EARLINET algorithm inter-comparison exercise. The ELPP module has been successfully employed for the automatic near-real-time pre-processing of the raw lidar data measured during several EARLINET inter-comparison campaigns as well as during intense field campaigns.
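The Monte Carlo route to the statistical uncertainties mentioned in the abstract can be sketched generically: perturb the raw signal according to its noise model, rerun the processing step, and take the spread of the outputs. This is an illustrative sketch, not the actual ELPP code; the toy background-subtraction step and the Gaussian noise model are assumptions:

```python
import random
import statistics

def background_correct(signal, bg_bins=3):
    """Toy processing step: subtract the mean background estimated
    from the last `bg_bins` range bins of the signal."""
    bg = statistics.fmean(signal[-bg_bins:])
    return [s - bg for s in signal]

def monte_carlo_uncertainty(signal, sigma, process, n_runs=500, seed=0):
    """Propagate per-bin noise `sigma` through `process` by resampling
    the raw signal many times and taking the spread of the outputs."""
    rng = random.Random(seed)
    runs = [process([s + rng.gauss(0.0, sg) for s, sg in zip(signal, sigma)])
            for _ in range(n_runs)]
    return [statistics.stdev(bin_values) for bin_values in zip(*runs)]

signal = [100.0, 80.0, 60.0, 5.0, 5.0, 5.0]  # made-up raw profile with background tail
sigma = [3.0] * len(signal)                  # assumed per-bin noise level
unc = monte_carlo_uncertainty(signal, sigma, background_correct)
```

For the first bin the Monte Carlo estimate should approach the analytic error-propagation value sqrt(3² + 3²/3) ≈ 3.46, since the subtracted background mean carries its own noise.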

  9. Classification-based comparison of pre-processing methods for interpretation of mass spectrometry generated clinical datasets

    Directory of Open Access Journals (Sweden)

    Hoefsloot Huub CJ

    2009-05-01

    Full Text Available Abstract Background Mass spectrometry is increasingly being used to discover proteins or protein profiles associated with disease. Experimental design of mass-spectrometry studies has come under close scrutiny and the importance of strict protocols for sample collection is now understood. However, the question of how best to process the large quantities of data generated is still unanswered. Main challenges for the analysis are the choice of proper pre-processing and classification methods. While these two issues have been investigated in isolation, we propose to use the classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Results Two in-house generated clinical SELDI-TOF MS datasets are used in this study as an example of high throughput mass-spectrometry data. We perform a systematic comparison of two commonly used pre-processing methods as implemented in Ciphergen ProteinChip Software and in the Cromwell package. With respect to reproducibility, Ciphergen and Cromwell pre-processing are largely comparable. We find that the overlap between peaks detected by either Ciphergen ProteinChip Software or Cromwell is large. This is especially the case for the more stringent peak detection settings. Moreover, similarity of the estimated intensities between matched peaks is high. We evaluate the pre-processing methods using five different classification methods. Classification is done in a double cross-validation protocol using repeated random sampling to obtain an unbiased estimate of classification accuracy. No pre-processing method significantly outperforms the other for all peak detection settings evaluated. Conclusion We use classification of patient samples as a clinically relevant benchmark for the evaluation of pre-processing methods. Both pre-processing methods lead to similar classification results on an ovarian cancer and a Gaucher disease dataset. 
However, the settings for pre-processing
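The double cross-validation protocol described above can be sketched generically: an inner split selects the model settings, an outer repeated random split estimates accuracy, so the outer test data never influence model selection. The k-NN classifier and the synthetic one-dimensional "patient" data below are illustrative assumptions, not the study's methods:

```python
import random

def knn_predict(train, x, k):
    """Majority vote among the k nearest training points (1-D features)."""
    nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
    return 1 if sum(label for _, label in nearest) > k // 2 else 0

def accuracy(train, test, k):
    return sum(knn_predict(train, x, k) == y for x, y in test) / len(test)

def double_cv(data, ks=(1, 3, 5), n_outer=20, seed=0):
    """Outer repeated random splits estimate accuracy; an inner split
    picks k, so the outer test data never influence model selection."""
    rng = random.Random(seed)
    scores = []
    for _ in range(n_outer):
        shuffled = data[:]
        rng.shuffle(shuffled)
        cut = int(0.7 * len(shuffled))
        train, test = shuffled[:cut], shuffled[cut:]
        inner = int(0.7 * len(train))
        fit, val = train[:inner], train[inner:]
        best_k = max(ks, key=lambda k: accuracy(fit, val, k))  # inner selection
        scores.append(accuracy(train, test, best_k))           # outer estimate
    return sum(scores) / len(scores)

# Hypothetical one-dimensional "patient" data: two overlapping classes
rng = random.Random(1)
data = ([(rng.gauss(0.0, 0.3), 0) for _ in range(40)]
        + [(rng.gauss(1.0, 0.3), 1) for _ in range(40)])
est = double_cv(data)
```

Averaging over many outer splits is what makes the accuracy estimate unbiased with respect to the hyperparameter choice.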

  10. Subsystem Quantum Mechanics and its Applications to Crystalline Systems

    Science.gov (United States)

    Zou, Pengfei

    This thesis reports results of the author's investigations along the theme that both dynamic and static properties of molecules and solids can be expressed in terms of their parts, from both theoretical and applied aspects. Specifically, the following four main results are obtained: (1) A topological analysis of the charge density in crystals has been developed. This is an extension of the theory of molecular structure to crystalline systems. Relationships between the bulk properties of a crystal and its topological structure have been established. A comparison of the topological properties of molecules and crystals has been made. (2) The theory of atoms in molecules has been extended to a crystal and yields a variational definition of a Wigner-Seitz cell. This definition maximizes the relation of the cell to the physical form exhibited by the charge density and the derived structure factors that account, in a natural way, for the observed intensities of scattered electrons and X-rays. It has been demonstrated that the theory of atoms in molecules and crystals can provide a way to model the behaviour of solids. This is done through the use of the fact that atomic properties are often transferable from one system to another. (3) The subsystem variational principle has been reformulated in terms of quantum field theoretical language, and the subsystem Feynman path integrals of electrons have been obtained using the coherent representation. This part contributes to the foundation of the theory of atoms in molecules and crystals. (4) Both dynamic and static quantum mechanical subspace techniques have been extensively investigated. A new variational method has been derived for embedding one system in another using the R-matrix formalism within the density functional approach. A formal subspace perturbation scheme has been proposed. These methods aim to obtain the charge distribution of a subsystem starting from known reference systems.
Before I came here I was confused about

  11. Internet use during childhood and the ecological techno-subsystem

    Directory of Open Access Journals (Sweden)

    Genevieve Marie Johnson

    2008-12-01

    Full Text Available Research findings suggest both positive and negative developmental consequences of Internet use during childhood (e.g., playing video games has been associated with enhanced visual skills as well as increased aggression). Several studies have concluded that environmental factors mediate the developmental impact of childhood online behaviour. From an ecological perspective, we propose the techno-subsystem, a dimension of the microsystem (i.e., immediate environments). The techno-subsystem includes child interaction with both living (e.g., peers) and nonliving (e.g., hardware) elements of communication, information, and recreation technologies in direct environments. By emphasizing the role of technology in child development, the ecological techno-subsystem encourages holistic exploration of the developmental consequences of Internet use (and future technological advances) during childhood.

  12. Extension of the statistical modal energy distribution analysis for estimating energy density in coupled subsystems

    Science.gov (United States)

    Totaro, N.; Guyader, J. L.

    2012-06-01

    The present article deals with an extension of the Statistical modal Energy distribution Analysis (SmEdA) method to estimate kinetic and potential energy density in coupled subsystems. The SmEdA method uses the modal bases of the uncoupled subsystems and focuses on modal energies rather than the global subsystem energies used in SEA (Statistical Energy Analysis). This method permits extending SEA to subsystems with low modal overlap or to localized excitations, as it does not assume the existence of modal energy equipartition. We demonstrate that by using the modal energies of subsystems computed by SmEdA, it is possible to estimate the energy distribution within subsystems. This approach has the same advantages as standard SEA, as it uses very short calculations to analyze damping effects. The estimation of energy distribution from SmEdA is applied to an academic case and an industrial example.
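For background, the classical two-subsystem SEA power balance that SmEdA refines can be sketched as a small linear solve. This is the plain SEA balance, not the modal SmEdA formulation, and the loss factors and injected powers below are hypothetical:

```python
from math import pi

def sea_energies(P1, P2, omega, eta1, eta2, eta12, eta21):
    """Solve the steady-state two-subsystem SEA power balance
        P1 = omega * ((eta1 + eta12) * E1 - eta21 * E2)
        P2 = omega * ((eta2 + eta21) * E2 - eta12 * E1)
    for the subsystem energies (E1, E2) via Cramer's rule."""
    a, b = omega * (eta1 + eta12), -omega * eta21
    c, d = -omega * eta12, omega * (eta2 + eta21)
    det = a * d - b * c
    return (P1 * d - b * P2) / det, (a * P2 - c * P1) / det

# Hypothetical loss factors; power injected into subsystem 1 only
omega = 2 * pi * 1000.0
E1, E2 = sea_energies(1.0, 0.0, omega, eta1=0.01, eta2=0.01,
                      eta12=0.002, eta21=0.002)
```

With power injected into subsystem 1 only, E1 exceeds E2, and the ratio is set by the coupling and damping loss factors.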

  13. A development and integration analysis of commercial and in-house control subsystems

    Energy Technology Data Exchange (ETDEWEB)

    Moore, D.M. [Westinghouse Savannah River Co., Aiken, SC (United States); Dalesio, L.R. [Los Alamos National Lab., NM (United States)

    1998-12-31

    The acquisition and integration of commercial automation and control subsystems in physics research is becoming more common. It is presumed that these systems present lower risk and less cost. This paper studies four subsystems used in the Accelerator Production of Tritium (APT) Low Energy Demonstration Accelerator (LEDA) at the Los Alamos National Laboratory (LANL). The radio frequency quadrupole (RFQ) resonance-control cooling subsystem (RCCS), the high-power RF subsystem and the RFQ vacuum subsystem were outsourced; the low-level RF (LLRF) subsystem was developed in-house. Based on the authors' experience, a careful evaluation is given of the costs and risks in acquisition, implementation, integration, and maintenance associated with these approaches.

  14. Understanding the requirements imposed by programming model middleware on a common communication subsystem.

    Energy Technology Data Exchange (ETDEWEB)

    Buntinas, D.; Gropp, W.

    2005-12-13

    In high-performance parallel computing, most programming-model middleware libraries and runtime systems use a communication subsystem to abstract the lower-level network layer. The functionality required of a communication subsystem depends largely on the programming model implemented by the middleware. In order to maximize performance, middleware libraries and runtime systems typically implement their own communication subsystems that are specially tuned for the middleware, rather than use an existing communication subsystem. This situation leads to duplicated effort and prevents different middleware libraries from being used by the same application in hybrid programming models. In this paper we describe features required by various middleware libraries as well as some desirable features that would make it easier to port a middleware library to the communication subsystem and allow the middleware to make use of high-performance features provided by some networking layers. We show that none of the communication subsystems that we evaluate support all of the features.

  15. Comparative Evaluation of Preprocessing Freeware on Chromatography/Mass Spectrometry Data for Signature Discovery

    Energy Technology Data Exchange (ETDEWEB)

    Coble, Jamie B.; Fraga, Carlos G.

    2014-07-07

    Preprocessing software is crucial for the discovery of chemical signatures in metabolomics, chemical forensics, and other signature-focused disciplines that involve analyzing large data sets from chemical instruments. Here, four freely available and published preprocessing tools known as metAlign, MZmine, SpectConnect, and XCMS were evaluated for impurity profiling using nominal mass GC/MS data and accurate mass LC/MS data. Both data sets were previously collected from the analysis of replicate samples from multiple stocks of a nerve-agent precursor. Each of the four tools had their parameters set for the untargeted detection of chromatographic peaks from impurities present in the stocks. The peak table generated by each preprocessing tool was analyzed to determine the number of impurity components detected in all replicate samples per stock. A cumulative set of impurity components was then generated using all available peak tables and used as a reference to calculate the percent of component detections for each tool, in which 100% indicated the detection of every component. For the nominal mass GC/MS data, metAlign performed the best followed by MZmine, SpectConnect, and XCMS with detection percentages of 83, 60, 47, and 42%, respectively. For the accurate mass LC/MS data, the order was metAlign, XCMS, and MZmine with detection percentages of 80, 45, and 35%, respectively. SpectConnect did not function for the accurate mass LC/MS data. Larger detection percentages were obtained by combining the top performer with at least one of the other tools such as 96% by combining metAlign with MZmine for the GC/MS data and 93% by combining metAlign with XCMS for the LC/MS data. In terms of quantitative performance, the reported peak intensities had average absolute biases of 41, 4.4, 1.3 and 1.3% for SpectConnect, metAlign, XCMS, and MZmine, respectively, for the GC/MS data. 
For the LC/MS data, the average absolute biases were 22, 4.5, and 3.1% for metAlign, MZmine, and XCMS
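The detection-percentage metric used in this comparison reduces to simple set arithmetic over the tools' peak tables; the tool names and component IDs below are placeholders, not the study's actual peaks:

```python
def detection_percentages(peak_tables):
    """peak_tables maps tool name -> set of detected component IDs.
    The union across tools is the cumulative reference set; each tool
    scores the share of reference components it detected."""
    reference = set().union(*peak_tables.values())
    return {tool: 100.0 * len(found) / len(reference)
            for tool, found in peak_tables.items()}

# Placeholder tools and component IDs
tables = {
    "toolA": {"c1", "c2", "c3", "c4"},
    "toolB": {"c1", "c3"},
    "toolC": {"c2", "c5"},
}
pct = detection_percentages(tables)  # reference set has 5 components
```

Combining tools, as the study does, amounts to taking the union of their detected sets before scoring.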

  16. A Multi-channel Pre-processing Circuit for Signals from Thermocouple/Thermister

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this paper, a newly developed multi-channel pre-processing circuit for signals from temperature sensors is briefly introduced. The circuit was developed to collect and amplify the signals from temperature sensors. It is a universal circuit: it can process signals from thermocouples as well as signals from thermistors. The circuit is mounted in a standard box (440 W × 405 D × 125 H mm) as an instrument.

  17. Experimental examination of similarity measures and preprocessing methods used for image registration

    Science.gov (United States)

    Svedlow, M.; Mcgillem, C. D.; Anuta, P. E.

    1976-01-01

    The criterion used to measure the similarity between images and thus find the position where the images are registered is examined. The three similarity measures considered are the correlation coefficient, the sum of the absolute differences, and the correlation function. Three basic types of preprocessing are then discussed: taking the magnitude of the gradient of the images, thresholding the images at their medians, and thresholding the magnitude of the gradient of the images at an arbitrary level to be determined experimentally. These multitemporal registration techniques are applied to remote imagery of agricultural areas.
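A one-dimensional sketch of two of the similarity measures discussed (the correlation coefficient, maximized; the sum of absolute differences, minimized) locating a template within a reference signal; the signals here are made up for illustration:

```python
from math import sqrt

def corr_coeff(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / sqrt(va * vb)

def sad(a, b):
    """Sum of absolute differences."""
    return sum(abs(x - y) for x, y in zip(a, b))

def best_shift(reference, window, measure, better):
    """Slide `window` across `reference`; return the offset the measure prefers."""
    scores = [(off, measure(reference[off:off + len(window)], window))
              for off in range(len(reference) - len(window) + 1)]
    return better(scores, key=lambda t: t[1])[0]

ref = [0, 0, 1, 3, 7, 3, 1, 0, 0, 0]
win = [1, 3, 7, 3, 1]            # the pattern sits at offset 2 in `ref`
off_corr = best_shift(ref, win, corr_coeff, max)  # maximize correlation
off_sad = best_shift(ref, win, sad, min)          # minimize absolute difference
```

Both criteria agree on this noise-free example; the measures differ mainly in their robustness to intensity changes between the two images.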

  18. Preprocessing for Optimization of Probabilistic-Logic Models for Sequence Analysis

    DEFF Research Database (Denmark)

    Christiansen, Henning; Lassen, Ole Torp

    2009-01-01

    …and approximation are needed. The first steps are taken towards a methodology for optimizing such models by approximations, using auxiliary models for preprocessing or splitting them into submodels. Evaluation of such approximating models is challenging, as authoritative test data may be sparse. On the other hand, the original complex models may be used for generating artificial evaluation data by efficient sampling, which can be used in the evaluation, although it does not constitute a foolproof test procedure. These models and evaluation processes are illustrated in the PRISM system developed by other authors, and we…

  19. Combined principal component preprocessing and n-tuple neural networks for improved classification

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Linneberg, Christian

    2000-01-01

    We present a combined principal component analysis/neural network scheme for classification. The data used to illustrate the method consist of spectral fluorescence recordings from seven different production facilities, and the task is to relate an unknown sample to one of these seven factories. The data are first preprocessed by performing an individual principal component analysis on each of the seven groups of data. The components found are then used for classifying the data, but instead of making a single multiclass classifier, we follow the ideas of turning a multiclass problem into a number…
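The per-group PCA idea can be sketched in two dimensions: fit a principal axis to each group, then assign a sample to the group whose axis reconstructs it best. The closed-form 2x2 eigendecomposition and the toy "factory" data are illustrative assumptions, not the paper's spectra:

```python
from math import sqrt

def principal_axis(points):
    """Mean and leading eigenvector of the 2x2 covariance (closed form)."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    a = sum((p[0] - mx) ** 2 for p in points) / n
    c = sum((p[1] - my) ** 2 for p in points) / n
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    lam = 0.5 * (a + c) + sqrt(0.25 * (a - c) ** 2 + b ** 2)  # largest eigenvalue
    vx, vy = (b, lam - a) if abs(b) > 1e-12 else ((1.0, 0.0) if a >= c else (0.0, 1.0))
    norm = sqrt(vx * vx + vy * vy)
    return (mx, my), (vx / norm, vy / norm)

def residual(point, model):
    """Distance from a point to the group's principal axis."""
    (mx, my), (vx, vy) = model
    dx, dy = point[0] - mx, point[1] - my
    t = dx * vx + dy * vy                  # projection onto the axis
    rx, ry = dx - t * vx, dy - t * vy
    return sqrt(rx * rx + ry * ry)

def classify(point, models):
    return min(models, key=lambda cls: residual(point, models[cls]))

# Two hypothetical "factories" with different spectral directions
factory1 = [(t, 2 * t) for t in range(-5, 6)]   # spread along y = 2x
factory2 = [(t, -t) for t in range(-5, 6)]      # spread along y = -x
models = {"factory1": principal_axis(factory1), "factory2": principal_axis(factory2)}
label = classify((2.0, 4.1), models)
```

The real scheme feeds the per-group components into n-tuple neural networks rather than using the residual directly, but the group-wise dimensionality reduction is the same.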

  20. Fast randomized point location without preprocessing in two- and three-dimensional Delaunay triangulations

    Energy Technology Data Exchange (ETDEWEB)

    Muecke, E.P.; Saias, I.; Zhu, B.

    1996-05-01

    This paper studies the point location problem in Delaunay triangulations without preprocessing and additional storage. The proposed procedure locates the query point simply by walking through the triangulation, after selecting a good starting point by random sampling. The analysis generalizes and extends a recent result for d = 2 dimensions by proving that this procedure takes expected time close to O(n^{1/(d+1)}) for point location in Delaunay triangulations of n random points in d = 3 dimensions. Empirical results in both two and three dimensions show that this procedure is efficient in practice.
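A sketch of the jump-and-walk idea on a simple grid triangulation (which is Delaunay): jump to the nearest of roughly sqrt(n) randomly sampled vertices, then perform a stochastic visibility walk toward the query, crossing any edge that has the query on its outer side. The grid construction is an illustrative stand-in for an arbitrary Delaunay triangulation, and this sketch omits the paper's optimal sample-size analysis:

```python
import random

def orient(p, q, r):
    """Twice the signed area of triangle pqr; positive when r lies left of p->q."""
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def grid_triangulation(k):
    """Triangulate the (k+1) x (k+1) integer grid: two CCW triangles per cell."""
    pts = [(i, j) for j in range(k + 1) for i in range(k + 1)]
    vid = lambda i, j: j * (k + 1) + i
    tris = []
    for j in range(k):
        for i in range(k):
            tris.append((vid(i, j), vid(i + 1, j), vid(i + 1, j + 1)))
            tris.append((vid(i, j), vid(i + 1, j + 1), vid(i, j + 1)))
    owner = {}                     # directed edge -> triangle using it (CCW)
    for t, (a, b, c) in enumerate(tris):
        for e in ((a, b), (b, c), (c, a)):
            owner[e] = t
    nbrs = [{e: owner.get((e[1], e[0])) for e in ((a, b), (b, c), (c, a))}
            for a, b, c in tris]   # neighbor across each edge (None on the hull)
    return pts, tris, nbrs

def jump_and_walk(pts, tris, nbrs, q, rng):
    """Jump: nearest of ~sqrt(n) random sample vertices. Walk: stochastic
    visibility walk crossing a random edge that has q on its outer side."""
    sample = rng.sample(range(len(pts)), max(1, int(len(pts) ** 0.5)))
    v0 = min(sample, key=lambda v: (pts[v][0] - q[0]) ** 2 + (pts[v][1] - q[1]) ** 2)
    t = next(i for i, tri in enumerate(tris) if v0 in tri)  # any incident triangle
    while True:
        a, b, c = tris[t]
        exits = [e for e in ((a, b), (b, c), (c, a))
                 if orient(pts[e[0]], pts[e[1]], q) < 0]
        if not exits:
            return t               # q is inside (or on the boundary of) tris[t]
        t = nbrs[t][rng.choice(exits)]

rng = random.Random(7)
pts, tris, nbrs = grid_triangulation(8)
q = (3.3, 5.7)
t = jump_and_walk(pts, tris, nbrs, q, rng)
```

The random choice among candidate exit edges is what guarantees termination; a real implementation would also keep a vertex-to-triangle map instead of the linear scan used here.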

  1. Interest rate prediction: a neuro-hybrid approach with data preprocessing

    Science.gov (United States)

    Mehdiyev, Nijat; Enke, David

    2014-07-01

    The following research implements a differential evolution-based fuzzy-type clustering method with a fuzzy inference neural network, after input preprocessing with regression analysis, in order to predict future interest rates, particularly 3-month T-bill rates. The empirical results of the proposed model are compared against nonparametric models, such as locally weighted regression and least squares support vector machines, along with two linear benchmark models, the autoregressive model and the random walk model. The root mean square error is reported for comparison.
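The two linear benchmarks and the RMSE criterion are easy to reproduce on synthetic data; the AR(1) series standing in for 3-month T-bill rates, and the benchmark coefficients, are assumptions for illustration only:

```python
import random
from math import sqrt

def rmse(pred, actual):
    """Root mean square error between forecasts and realized values."""
    return sqrt(sum((p - a) ** 2 for p, a in zip(pred, actual)) / len(actual))

def random_walk_forecast(series):
    """Naive benchmark: tomorrow's rate equals today's."""
    return series[:-1]

def ar1_forecast(series, phi, mu):
    """AR(1) benchmark with assumed coefficients: x[t+1] = mu + phi*(x[t] - mu)."""
    return [mu + phi * (x - mu) for x in series[:-1]]

# Synthetic stand-in for a 3-month T-bill series: AR(1) around 5 %
rng = random.Random(0)
rates = [5.0]
for _ in range(200):
    rates.append(5.0 + 0.8 * (rates[-1] - 5.0) + rng.gauss(0.0, 0.1))

actual = rates[1:]
rmse_rw = rmse(random_walk_forecast(rates), actual)
rmse_ar = rmse(ar1_forecast(rates, phi=0.8, mu=5.0), actual)
```

On a genuinely mean-reverting series the AR(1) forecast error approaches the innovation noise level, while the random walk carries an extra penalty from ignoring the pull toward the mean.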

  2. Reservoir computing with a slowly modulated mask signal for preprocessing using a mutually coupled optoelectronic system

    Science.gov (United States)

    Tezuka, Miwa; Kanno, Kazutaka; Bunsen, Masatoshi

    2016-08-01

    Reservoir computing is a machine-learning paradigm based on information processing in the human brain. We numerically demonstrate reservoir computing with a slowly modulated mask signal for preprocessing, using a mutually coupled optoelectronic system. The performance of our system is quantitatively evaluated by a chaotic time series prediction task. Our system can produce performance comparable to that of reservoir computing with a single feedback system and a fast modulated mask signal. We show that it is possible to slow down the modulation speed of the mask signal by using the mutually coupled system in reservoir computing.
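The masking step used as preprocessing in time-delay reservoir computing can be sketched as time-multiplexing: each input sample is held over the reservoir's virtual nodes and multiplied by a piecewise-constant mask. Slowing the mask here simply means holding each mask value for several steps; the mask length and hold factor are illustrative, and no reservoir dynamics are simulated:

```python
import math
import random

def apply_mask(inputs, mask, hold=1):
    """Time-multiplex each input sample across the virtual nodes:
    repeat it len(mask)*hold times, scaled by a piecewise-constant mask.
    hold > 1 slows the mask modulation relative to the node spacing."""
    signal = []
    for u in inputs:
        for m in mask:
            signal.extend([u * m] * hold)  # hold each mask value longer
    return signal

rng = random.Random(0)
mask = [rng.choice([-1.0, 1.0]) for _ in range(6)]  # binary mask, 6 virtual nodes
inputs = [math.sin(0.3 * t) for t in range(4)]      # toy input sequence

fast = apply_mask(inputs, mask, hold=1)  # conventional, fast mask
slow = apply_mask(inputs, mask, hold=3)  # slowly modulated mask
```

The slow version carries the same mask pattern, just stretched in time, which is what relaxes the bandwidth demands on the modulation hardware.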

  3. Comparative evaluation of preprocessing freeware on chromatography/mass spectrometry data for signature discovery.

    Science.gov (United States)

    Coble, Jamie B; Fraga, Carlos G

    2014-09-01

    Preprocessing software, which converts large instrumental data sets into a manageable format for data analysis, is crucial for the discovery of chemical signatures in metabolomics, chemical forensics, and other signature-focused disciplines. Here, four freely available and published preprocessing tools known as MetAlign, MZmine, SpectConnect, and XCMS were evaluated for impurity profiling using nominal mass GC/MS data and accurate mass LC/MS data. Both data sets were previously collected from the analysis of replicate samples from multiple stocks of a nerve-agent precursor and method blanks. Parameters were optimized for each of the four tools for the untargeted detection, matching, and cataloging of chromatographic peaks from impurities present in the stock samples. The peak table generated by each preprocessing tool was analyzed to determine the number of impurity components detected in all replicate samples per stock and absent in the method blanks. A cumulative set of impurity components was then generated using all available peak tables and used as a reference to calculate the percent of component detections for each tool, in which 100% indicated the detection of every known component present in a stock. For the nominal mass GC/MS data, MetAlign had the most component detections followed by MZmine, SpectConnect, and XCMS with detection percentages of 83, 60, 47, and 41%, respectively. For the accurate mass LC/MS data, the order was MetAlign, XCMS, and MZmine with detection percentages of 80, 45, and 35%, respectively. SpectConnect did not function for the accurate mass LC/MS data. Larger detection percentages were obtained by combining the top performer with at least one of the other tools such as 96% by combining MetAlign with MZmine for the GC/MS data and 93% by combining MetAlign with XCMS for the LC/MS data. 
In terms of quantitative performance, the reported peak intensities from each tool had averaged absolute biases (relative to peak intensities obtained

  4. Computer-assisted bone age assessment: image preprocessing and epiphyseal/metaphyseal ROI extraction.

    Science.gov (United States)

    Pietka, E; Gertych, A; Pospiech, S; Cao, F; Huang, H K; Gilsanz, V

    2001-08-01

    Clinical assessment of skeletal maturity is based on a visual comparison of a left-hand wrist radiograph with atlas patterns. Using a new digital hand atlas, an image analysis methodology is being developed to assist radiologists in bone age estimation. The analysis starts with a preprocessing function yielding epiphyseal/metaphyseal regions of interest (EMROIs). Then, these regions are subjected to a feature extraction function. Accuracy has been measured independently at three stages of the image analysis: detection of the phalangeal tip, extraction of the EMROIs, and location of the diameters and lower edge of the EMROIs. The extracted features describe the stage of skeletal development more objectively than visual comparison.

  5. Mapping of electrical potentials from the chest surface - preprocessing and visualization

    Directory of Open Access Journals (Sweden)

    Vaclav Chudacek

    2005-01-01

    Full Text Available The aim of this paper is to present current research activity in the area of computer-supported ECG processing. Analysis of the heart's electric field based on the standard 12-lead system is at present the most frequently used method of heart disease diagnostics. However, body surface potential mapping (BSPM), which measures electric potentials from several tens to hundreds of electrodes placed on the thorax surface, has in certain cases a higher diagnostic value, since it collects data in areas that are inaccessible to the standard 12-lead ECG. For preprocessing, the wavelet transform is used; it allows significant features of the ECG signal to be detected. Several types of maps are presented, namely immediate potential, integral, isochronous, and differential maps.

  6. Stochastic Mode-Reduction in Models with Conservative Fast Sub-Systems

    OpenAIRE

    Jain, Ankita; Timofeyev, Ilya; Vanden-Eijnden, Eric

    2014-01-01

    A stochastic mode reduction strategy is applied to multiscale models with a deterministic energy-conserving fast sub-system. Specifically, we consider situations where the slow variables are driven stochastically and interact with the fast sub-system in an energy-conserving fashion. Since the stochastic terms only affect the slow variables, the fast sub-system evolves deterministically on a sphere of constant energy. However, in the full model the radius of the sphere slowly changes due to the...

  7. The human operator transfer function: Identification of the limb mechanics subsystem

    Science.gov (United States)

    Jones, Lynette A.; Hunter, Ian W.

    1991-01-01

    The objective of our research is to decompose the performance of the human operator in terms of the subsystems that determine the operator's responses in order to establish how the dynamics of these component subsystems influence the operator's performance. In the present experiment, the dynamic stiffness of the human elbow joint was measured at rest and under different levels of biceps muscle activation; this work forms part of the analysis of the limb mechanics subsystem.

  8. Status of the Space Station water reclamation and management subsystem design concept

    Science.gov (United States)

    Bagdigian, R. M.; Mortazavi, P. L.

    1987-01-01

    A development status report is presented for the NASA Space Station's water reclamation and management (WRM) system, for which the candidate processing technologies, all employing phase change, are an air evaporation subsystem, a thermoelectric integrated membrane evaporation subsystem, and the vapor compression distillation subsystem. All of these WRM candidates use evaporation to separate water from contaminants, but they differ in how they control the vapor/liquid interface in zero gravity and in how they recover the latent heat of vaporization.

  9. Portable Life Support Subsystem Thermal Hydraulic Performance Analysis

    Science.gov (United States)

    Barnes, Bruce; Pinckney, John; Conger, Bruce

    2010-01-01

    This paper presents the current state of the thermal hydraulic modeling efforts being conducted for the Constellation Space Suit Element (CSSE) Portable Life Support Subsystem (PLSS). The goal of these efforts is to provide realistic simulations of the PLSS under various modes of operation. The PLSS thermal hydraulic model simulates the thermal, pressure, and flow characteristics of the PLSS, as well as the related human thermal comfort. This paper presents modeling approaches and assumptions as well as component model descriptions. Results from the models are presented that show PLSS operation at steady-state and transient conditions. Finally, conclusions and recommendations are offered that summarize the results, identify PLSS design weaknesses uncovered during review of the analysis results, and propose areas for improvement to increase model fidelity and accuracy.

  10. LACIE Phase 1 Classification and Mensuration Subsystem (CAMS) rework experiment

    Science.gov (United States)

    Chhikara, R. S.; Hsu, E. M.; Liszcz, C. J.

    1976-01-01

    An experiment was designed to test the ability of the Classification and Mensuration Subsystem rework operations to improve wheat proportion estimates for segments that had been processed previously. Sites selected for the experiment included three in Kansas and three in Texas, with the remaining five distributed across Montana and North and South Dakota. The acquisition dates were selected to be representative of the imagery available in actual operations. No more than one acquisition per biophase was used, and biophases were determined by actual crop calendars. All sites were worked by each of four Analyst-Interpreter/Data Processing Analyst teams, who reviewed the initial processing of each segment and accepted or reworked it for an estimate of the proportion of small grains in the segment. Classification results, acquisition and classification errors, and performance comparisons between regular CAMS processing and rework are tabulated.

  11. Energy efficiency of different bus subsystems in Belgrade public transport

    Directory of Open Access Journals (Sweden)

    Mišanović Slobodan M.

    2015-01-01

    Full Text Available The research in this paper comprised experimental determination of the energy efficiency of different bus subsystems (diesel bus, trolleybus, and fully electric bus) on a chosen public transport route in Belgrade. Experimental measurement of the energy efficiency of each bus type was based on analysis of the parameters of the vehicle driving cycles between stops. The results of this analysis were the basis for developing a theoretical simulation model of energy efficiency. The model was later compared with the results of a simulation done by the "Solaris Bus & Coach" company for the chosen electric bus route. Based on the demonstrated simulation, the characteristics of the electric bus batteries were defined, the method and dynamics of their recharging were suggested, and choices for the other drive-system aggregates and technical characteristics of the electric buses were proposed.

  12. Progress report for the scintillator plate calorimeter subsystem

    Energy Technology Data Exchange (ETDEWEB)

    1990-12-31

    This report covers the work completed in FY90 by ANL staff, and by those of Westinghouse STC and BICRON Corporation under subcontract to ANL, towards the design of a compensating calorimeter based on the use of scintillator plate as the sensitive medium. It is presented as five task sections dealing, respectively, with mechanical design; simulation studies; optical system design; electronics development; and development of rad-hard plastic scintillator and wavelength shifter; followed by a summary. The work carried out by the University of Tennessee under a subcontract from ANL is reported separately. Finally, as the principal institution responsible for the overall management of this subsystem effort, ANL reports in the summary the conclusions resulting from the work of the collaboration and their impact on our proposed direction of effort in FY91. This proposal, for obvious reasons, is given separately.

  13. Spread of entanglement for small subsystems in holographic CFTs

    CERN Document Server

    Kundu, Sandipan

    2016-01-01

    We develop an analytic perturbative expansion to study the propagation of entanglement entropy for small subsystems after a global quench, in the context of the AdS/CFT correspondence. In contrast to the large-interval limit, in this case the evolution of the system takes place at timescales that are short in comparison to the local equilibration scale, and thus different physical mechanisms govern the dynamics and subsequent thermalization. In particular, we show that the heuristic picture in terms of an "entanglement tsunami" does not apply in this regime. We find two crucial differences. First, the instantaneous rate of growth of the entanglement is not constrained by causality, but rather its time average is. And second, the approach to saturation is always continuous, regardless of the shape of the entangling surface. Our analytic expansion also enables us to verify some previous numerical results, namely, that the saturation time is non-monotonic with respect to the chemical potential. All of our resu...

  14. A model for the emergence of adaptive subsystems.

    Science.gov (United States)

    Dopazo, H; Gordon, M B; Perazzo, R; Risau-Gusman, S

    2003-01-01

    We investigate the interaction of learning and evolution in a changing environment. A stable learning capability is regarded as an emergent adaptive system evolved by natural selection of genetic variants. We consider the evolution of an asexual population. Each genotype can have 'fixed' and 'flexible' alleles. The former express themselves as synaptic connections that remain unchanged during ontogeny and the latter as synapses that can be adjusted through a learning algorithm. Evolution is modelled using genetic algorithms and the changing environment is represented by two optimal synaptic patterns that alternate a fixed number of times during the 'life' of the individuals. The amplitude of the change is related to the Hamming distance between the two optimal patterns and the rate of change to the frequency with which both exchange roles. This model is an extension of that of Hinton and Nowlan in which the fitness is given by a probabilistic measure of the Hamming distance to the optimum. We find that two types of evolutionary pathways are possible depending upon how difficult (costly) it is to cope with the changes of the environment. In one case the population loses the learning ability, and the individuals inherit fixed synapses that are optimal in only one of the environmental states. In the other case a flexible subsystem emerges that allows the individuals to adapt to the changes of the environment. The model helps us to understand how an adaptive subsystem can emerge as the result of the tradeoff between the exploitation of a congenital structure and the exploration of the adaptive capabilities practised by learning.

  15. Solar Pilot Plant, Phase I. Preliminary design report. Volume V. Thermal storage subsystem. CDRL item 2

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-05-01

    Design, specifications, and diagrams for the thermal storage subsystem for the 10-MW pilot tower focus power plant are presented in detail. The Honeywell thermal storage subsystem design features a sensible heat storage arrangement using proven equipment and materials. The subsystem consists of a main storage containing oil and rock, two buried superheater tanks containing inorganic salts (Hitec), and the necessary piping, instrumentation, controls, and safety devices. The subsystem can provide 7 MW(e) for three hours after twenty hours of hold. It can be charged in approximately four hours. Storage for the commercial-scale plant consists of the same elements appropriately scaled up. Performance analysis and tradeoff studies are included.

  16. HTS filter and front-end subsystem for GSM1800 wireless base station

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The first HTS front-end subsystem for a wireless base station in China was developed. This demonstration system, which aims at application in GSM1800 mobile communication base stations, consists of a single RF path, i.e. one filter and one LNA, integrated with a pulse tube cooler. The subsystem works in a pass band of 1710-1785 MHz with a gain of 18 dB at a temperature of 70 K. The accomplishment of such a demonstration subsystem can boost the development of commercial HTS subsystems.

  17. Functional Performance of an Enabling Atmosphere Revitalization Subsystem Architecture for Deep Space Exploration Missions

    Science.gov (United States)

    Perry, Jay L.; Abney, Morgan B.; Frederick, Kenneth R.; Greenwood, Zachary W.; Kayatin, Matthew J.; Newton, Robert L.; Parrish, Keith J.; Roman, Monsi C.; Takada, Kevin C.; Miller, Lee A.; Scott, Joseph P.; Stanley, Christine M.

    2013-01-01

    A subsystem architecture derived from the International Space Station's (ISS) Atmosphere Revitalization Subsystem (ARS) has been functionally demonstrated. This ISS-derived architecture features re-arranged unit operations for trace contaminant control and carbon dioxide removal functions, a methane purification component as a precursor to enhance resource recovery over ISS capability, operational modifications to a water electrolysis-based oxygen generation assembly, and an alternative major atmospheric constituent monitoring concept. Results from this functional demonstration are summarized and compared to the performance observed during ground-based testing conducted on an ISS-like subsystem architecture. Considerations for further subsystem architecture and process technology development are discussed.

  18. THE ANALYSIS OF BEEF CATTLE SUBSYSTEM AGRIBUSINESS IMPLEMENTATION IN CENTRAL JAVA PROVINCE, INDONESIA

    Directory of Open Access Journals (Sweden)

    T. Ekowati

    2014-10-01

    Full Text Available The study aimed to analyze the implementation of subsystem agribusiness on beef cattle farming in Central Java. Five districts (Rembang, Blora, Grobogan, Boyolali and Wonogiri) were purposively chosen based on the value of Location Quotient (LQ). The study was conducted using the quota sampling method; forty respondents from each district were chosen randomly. Data were analyzed through a Structural Equation Model (SEM). The results showed that each subsystem agribusiness had an adequate potential score: 0.693, 0.721, 0.684, 0.626, and 0.691 for the up-stream subsystem, on-farm, down-stream subsystem, marketing and supporting institution, respectively. The results showed that the SEM model was feasible with Chi-Square value=0.952; RMSEA=0.000; Probability=0.621 and TLI=1.126. The significant results of Critical Ratio (CR) were: up-stream subsystem to on-farm agribusiness; on-farm subsystem to down-stream agribusiness; down-stream subsystem to the farmer's income; marketing subsystem to up-stream agribusiness; and supporting institution to the marketing subsystem and down-stream agribusiness. The research concluded that the implementation of the beef cattle subsystem agribusiness had an adequate index and gives a positive effect to the beef cattle agribusiness.

  19. A Signaling Abnormal Handling Detection Method in IP Multimedia Subsystem Networks

    Institute of Scientific and Technical Information of China (English)

    谢晓龙; 季新生; 刘彩霞; 刘树新

    2012-01-01

    To address the problem that entities in an IP Multimedia Subsystem (IMS) network that have been compromised or hijacked by attackers may apply abnormal handling, such as malicious tampering, to signaling messages, this paper proposes a detection method for signaling exception handling in IMS networks based on signaling handling rules. Building on an established rule base, the method simulates an entity's normal handling of a signaling message and generates a pre-processed message; by checking whether the pre-processed message matches the message actually output by the entity, it detects whether signaling exception handling is present in the IMS network. Experimental results show that the method's detection rate for signaling exception handling reaches 100%.
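
The detection idea in this record (simulate an entity's legitimate handling from a rule base, then match the prediction against the observed output) can be sketched in a few lines. The message fields and the Max-Forwards rule below are illustrative stand-ins, not the paper's IMS rule base or the SIP message format:

```python
# Rule-based sketch: a rule base predicts how an entity should transform a
# signaling message; any mismatch with the observed output is flagged.

def apply_rules(message, rules):
    """Simulate normal handling: apply each rule to produce the expected message."""
    expected = dict(message)
    for field, transform in rules:
        expected[field] = transform(expected.get(field))
    return expected

def is_anomalous(observed, message, rules):
    """Flag handling as anomalous when it deviates from the rule prediction."""
    return apply_rules(message, rules) != observed

# Illustrative rule for a proxy that must decrement Max-Forwards
# and must not touch any other field.
rules = [("Max-Forwards", lambda v: v - 1)]
incoming = {"From": "alice@example.org", "Max-Forwards": 70}

legit = {"From": "alice@example.org", "Max-Forwards": 69}
tampered = {"From": "mallory@example.org", "Max-Forwards": 69}

print(is_anomalous(legit, incoming, rules))     # False
print(is_anomalous(tampered, incoming, rules))  # True
```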

  20. Security Architecture and Key Technologies for IoT/CPS

    Institute of Scientific and Technical Information of China (English)

    丁超; 杨立君; 吴蒙

    2011-01-01

    Internet of Things (IoT) and Cyber-Physical Systems (CPS) are core technologies of next generation networks and are the focus of research in both academia and industry. IoT/CPS has unique characteristics, including heterogeneous integration, collaborative autonomy, and open interconnection, that raise a number of issues for system security, among them the seamless connection of security protocols and the preservation of user privacy. Developing novel security models, key technologies, and approaches is therefore critical in the development of IoT/CPS. This paper proposes a hierarchical security architecture based on threat analysis and security requirements, and discusses key technologies associated with privacy preservation, secure control, and cross-network authentication.

  1. ITSG-Grace2016 data preprocessing methodologies revisited: impact of using Level-1A data products

    Science.gov (United States)

    Klinger, Beate; Mayer-Gürr, Torsten

    2017-04-01

    For the ITSG-Grace2016 release, the gravity field recovery is based on the use of official GRACE (Gravity Recovery and Climate Experiment) Level-1B data products, generated by the Jet Propulsion Laboratory (JPL). Before gravity field recovery, the Level-1B instrument data are preprocessed. This data preprocessing step includes the combination of Level-1B star camera (SCA1B) and angular acceleration (ACC1B) data for an improved attitude determination (sensor fusion), instrument data screening and ACC1B data calibration. Based on a Level-1A test dataset, provided for individual months throughout the GRACE period by the Center for Space Research at the University of Texas at Austin (UTCSR), the impact of using Level-1A instead of Level-1B data products within the ITSG-Grace2016 processing chain is analyzed. We discuss (1) the attitude determination through an optimal combination of SCA1A and ACC1A data using our sensor fusion approach, (2) the impact of the new attitude product on temporal gravity field solutions, and (3) possible benefits of using Level-1A data for instrument data screening and calibration. As the GRACE mission is currently reaching its end of life, the presented work aims not only at a better understanding of GRACE science data to reduce the impact of possible error sources on the gravity field recovery, but also at preparing Level-1A data handling capabilities for the GRACE Follow-On mission.
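
As a toy illustration of sensor fusion between a low-frequency-accurate but noisy attitude source (star camera) and a smooth but drifting one (derived from angular accelerations) — not the ITSG-Grace2016 algorithm, whose details the abstract does not give — a complementary filter can be sketched:

```python
# Complementary-filter sketch: trust the accelerometer-derived signal for
# high-frequency increments and the star camera for the low-frequency level.
# All signals, noise levels, and the blend factor alpha are invented.
import numpy as np

def complementary_fuse(att_sca, att_acc, alpha=0.98):
    """Fuse two 1-D attitude-angle series sample by sample."""
    fused = np.empty_like(att_sca)
    fused[0] = att_sca[0]
    for k in range(1, len(att_sca)):
        delta = att_acc[k] - att_acc[k - 1]      # high-frequency increment
        fused[k] = alpha * (fused[k - 1] + delta) + (1 - alpha) * att_sca[k]
    return fused

t = np.linspace(0, 10, 500)
truth = np.sin(t)
rng = np.random.default_rng(3)
sca = truth + 0.1 * rng.standard_normal(500)     # noisy star-camera angle
acc = truth + 0.002 * t                          # smooth but slowly drifting
fused = complementary_fuse(sca, acc)
# The fused series tracks the truth better than the raw star-camera data.
print(np.abs(fused - truth).mean() < np.abs(sca - truth).mean())  # True
```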

  2. Experimental evaluation of video preprocessing algorithms for automatic target hand-off

    Science.gov (United States)

    McIngvale, P. H.; Guyton, R. D.

    It is pointed out that the Automatic Target Hand-Off Correlator (ATHOC) hardware has been modified to permit operation in a nonreal-time mode as a programmable laboratory test unit using video recordings as inputs and allowing several preprocessing algorithms to be software programmable. In parallel with this hardware modification effort, an analysis and simulation effort has been underway to help determine which of the many available preprocessing algorithms should be implemented in the ATHOC software. It is noted that videotapes from a current technology airborne target acquisition system and an imaging infrared missile seeker were recorded and used in the laboratory experiments. These experiments are described and the results are presented. A set of standard parameters is found for each case. Consideration of the background in the target scene is found to be important. Analog filter cutoff frequencies of 2.5 MHz for low pass and 300 kHz for high pass are found to give best results. EPNC = 1 is found to be slightly better than EPNC = 0. It is also shown that trilevel gives better results than bilevel.

  3. Automated cleaning and pre-processing of immunoglobulin gene sequences from high-throughput sequencing

    Directory of Open Access Journals (Sweden)

    Miri eMichaeli

    2012-12-01

    Full Text Available High-throughput sequencing (HTS) yields tens of thousands to millions of sequences that require a large amount of pre-processing work to clean various artifacts. Such cleaning cannot be performed manually. Existing programs are not suitable for immunoglobulin (Ig) genes, which are variable and often highly mutated. This paper describes Ig-HTS-Cleaner (Ig High Throughput Sequencing Cleaner), a program containing a simple cleaning procedure that successfully deals with pre-processing of Ig sequences derived from HTS, and Ig-Indel-Identifier (Ig Insertion-Deletion Identifier), a program for identifying legitimate and artifact insertions and/or deletions (indels). Our programs were designed for analyzing Ig gene sequences obtained by 454 sequencing, but they are applicable to all types of sequences and sequencing platforms. Ig-HTS-Cleaner and Ig-Indel-Identifier have been implemented in Java and saved as executable JAR files, supported on Linux and MS Windows. No special requirements are needed in order to run the programs, except for correctly constructing the input files as explained in the text. The programs' performance has been tested and validated on real and simulated data sets.
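
As an illustration of the kind of cleaning such a pipeline performs (the real Ig-HTS-Cleaner rules are more elaborate), a minimal read filter might drop short reads, reads with too many ambiguous bases, and reads lacking an expected primer. Every threshold and the primer below are invented for the example:

```python
# Miniature read-cleaning sketch: length, ambiguous-base, and primer checks.

def clean_reads(reads, primer="ACGT", min_len=8, max_n_frac=0.05):
    """Keep only reads passing all three illustrative cleaning rules."""
    kept = []
    for r in reads:
        if len(r) < min_len:                     # too short
            continue
        if r.count("N") / len(r) > max_n_frac:   # too many ambiguous bases
            continue
        if not r.startswith(primer):             # missing expected primer
            continue
        kept.append(r)
    return kept

reads = ["ACGTGGCCTTAA",        # clean
         "ACGTNNNNNNNN",        # too many ambiguous bases
         "TTTTGGCCTTAA",        # wrong primer
         "ACGTGG"]              # too short
print(clean_reads(reads))  # → ['ACGTGGCCTTAA']
```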

  4. Preprocessing of A-scan GPR data based on energy features

    Science.gov (United States)

    Dogan, Mesut; Turhan-Sayan, Gonul

    2016-05-01

    There is an increasing demand for noninvasive real-time detection and classification of buried objects in various civil and military applications. The problem of detection and annihilation of landmines is particularly important due to strong safety concerns. The requirement for a fast real-time decision process is as important as the requirements for high detection rates and low false alarm rates. In this paper, we introduce and demonstrate a computationally simple, time-efficient, energy-based preprocessing approach that can be used in ground penetrating radar (GPR) applications to eliminate reflections from the air-ground boundary and to locate buried objects, simultaneously, in one easy step. The instantaneous power signals, the total energy values and the cumulative energy curves are extracted from the A-scan GPR data. The cumulative energy curves, in particular, are shown to be useful for detecting the presence and location of buried objects in a fast and simple way while preserving the spectral content of the original A-scan data for further steps of physics-based target classification. The proposed method is demonstrated using GPR data collected at outdoor test lanes at the facilities of IPA Defense, Ankara. Cylindrically shaped plastic containers were buried in fine-medium sand to simulate buried landmines. These plastic containers were half-filled with ammonium nitrate including metal pins. The results of this pilot study are highly promising and motivate further research on the use of energy-based preprocessing features in the landmine detection problem.
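
The energy features described above are simple to compute. The sketch below, on a synthetic 1-D A-scan (all amplitudes and sample positions are invented), derives the instantaneous power, total energy, and normalized cumulative energy curve, and reads off jump locations as candidate reflections:

```python
# Energy-based A-scan preprocessing sketch: squared amplitude gives the
# instantaneous power; its normalized running sum is the cumulative energy
# curve, whose sharp rises mark reflections.
import numpy as np

def cumulative_energy(a_scan):
    """Return instantaneous power, total energy, and normalized cumulative energy."""
    power = np.asarray(a_scan, dtype=float) ** 2
    total = power.sum()
    cumulative = np.cumsum(power) / total
    return power, total, cumulative

# Synthetic trace: strong air-ground bounce at sample 10, weaker echo at 60.
trace = np.zeros(100)
trace[10] = 5.0   # air-ground reflection
trace[60] = 2.0   # buried-target echo
power, total, cum = cumulative_energy(trace)

# Large steps in the cumulative curve locate the reflections.
jumps = np.where(np.diff(cum) > 0.1)[0] + 1
print(jumps.tolist())  # → [10, 60]
```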

  5. Selections of data preprocessing methods and similarity metrics for gene cluster analysis

    Institute of Scientific and Technical Information of China (English)

    YANG Chunmei; WAN Baikun; GAO Xiaofeng

    2006-01-01

    Clustering is one of the major exploratory techniques for gene expression data analysis. Only with suitable similarity metrics and when datasets are properly preprocessed, can results of high quality be obtained in cluster analysis. In this study, gene expression datasets with external evaluation criteria were preprocessed as normalization by line, normalization by column or logarithm transformation by base-2, and were subsequently clustered by hierarchical clustering, k-means clustering and self-organizing maps (SOMs) with Pearson correlation coefficient or Euclidean distance as similarity metric. Finally, the quality of clusters was evaluated by adjusted Rand index. The results illustrate that k-means clustering and SOMs have distinct advantages over hierarchical clustering in gene clustering, and SOMs are a bit better than k-means when randomly initialized. It also shows that hierarchical clustering prefers Pearson correlation coefficient as similarity metric and dataset normalized by line. Meanwhile, k-means clustering and SOMs can produce better clusters with Euclidean distance and logarithm transformed datasets. These results will afford valuable reference to the implementation of gene expression cluster analysis.
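
A minimal sketch of the preprocessing and similarity choices compared in this study, applied to a toy expression matrix; the functions are generic textbook versions (normalization by line, log2 transform, Pearson correlation, Euclidean distance), not the paper's code:

```python
# Preprocessing/similarity sketch for gene-expression clustering.
import numpy as np

def normalize_rows(x):
    """Normalization by line: zero mean, unit variance per gene (row)."""
    x = np.asarray(x, float)
    return (x - x.mean(axis=1, keepdims=True)) / x.std(axis=1, keepdims=True)

def log2_transform(x):
    """Logarithm transformation by base 2."""
    return np.log2(np.asarray(x, float))

def pearson(u, v):
    return float(np.corrcoef(u, v)[0, 1])

def euclidean(u, v):
    return float(np.linalg.norm(np.asarray(u, float) - np.asarray(v, float)))

expr = np.array([[1.0, 2.0, 4.0],
                 [2.0, 4.0, 8.0]])   # second gene = scaled copy of the first
# Pearson sees the shared profile shape; Euclidean agrees only after row
# normalization removes the scale difference.
print(round(pearson(expr[0], expr[1]), 3))                          # 1.0
print(euclidean(normalize_rows(expr)[0], normalize_rows(expr)[1]))  # 0.0
```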

  6. A Technical Review on Biomass Processing: Densification, Preprocessing, Modeling and Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Jaya Shankar Tumuluru; Christopher T. Wright

    2010-06-01

    It is now a well-acclaimed fact that burning fossil fuels and deforestation are major contributors to climate change. Biomass from plants can serve as an alternative renewable and carbon-neutral raw material for the production of bioenergy. Low densities of 40–60 kg/m3 for lignocellulosic and 200–400 kg/m3 for woody biomass limit their application for energy purposes. Prior to use in energy applications these materials need to be densified. The densified biomass can have bulk densities over 10 times that of the raw material, helping to significantly reduce technical limitations associated with storage, loading and transportation. Pelleting, briquetting, and extrusion processing are commonly used methods for densification. The aim of the present research is to develop a comprehensive review of biomass processing that includes densification, preprocessing, modeling and optimization. The specific objectives include carrying out a technical review of (a) mechanisms of particle bonding during densification; (b) methods of densification including extrusion, briquetting, pelleting, and agglomeration; (c) effects of process and feedstock variables and biomass biochemical composition on densification; (d) effects of preprocessing such as grinding, preheating, steam explosion, and torrefaction on biomass quality and binding characteristics; (e) models for understanding the compression characteristics; and (f) procedures for response surface modeling and optimization.

  7. [Research on preprocessing method of near-infrared spectroscopy detection of coal ash calorific value].

    Science.gov (United States)

    Zhang, Lin; Lu, Hui-Shan; Yan, Hong-Wei; Gao, Qiang; Wang, Fu-Jie

    2013-12-01

    The calorific value of coal ash is an important indicator for evaluating coal quality. In the experiment, the effect of preprocessing methods such as smoothing, differential processing, multiplicative scatter correction (MSC) and standard normal variate (SNV) in improving the signal-to-noise ratio of the near-infrared diffuse reflection spectrum was analyzed first; then partial least squares (PLS) and principal component regression (PCR) were used to establish calorific value models of coal ash for the spectra processed with each preprocessing method. It was found that model performance can be clearly improved with 5-point smoothing, MSC and SNV, of which 5-point smoothing has the best effect: the correlation coefficient, calibration standard deviation and prediction standard deviation are 0.9899, 0.00049 and 0.00052, respectively. When 25-point smoothing is adopted, over-smoothing occurs, which worsens model performance, while the model established with the spectra after differential preprocessing shows no obvious change and the influence on the model is not large.
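
Two of the preprocessing steps compared above, moving-average smoothing and SNV, can be sketched as follows; the window length and the synthetic spectrum are illustrative, and MSC and differential processing are omitted for brevity:

```python
# Spectral preprocessing sketch: 5-point smoothing followed by SNV.
import numpy as np

def smooth(spectrum, window=5):
    """Simple moving-average smoothing (the '5-point smoothing' above)."""
    kernel = np.ones(window) / window
    return np.convolve(spectrum, kernel, mode="same")

def snv(spectrum):
    """Standard normal variate: center and scale the whole spectrum."""
    s = np.asarray(spectrum, float)
    return (s - s.mean()) / s.std()

rng = np.random.default_rng(0)
raw = np.sin(np.linspace(0, 3, 200)) + 0.05 * rng.standard_normal(200)
z = snv(smooth(raw, 5))
# After SNV the spectrum has zero mean and unit standard deviation.
print(round(float(z.std()), 3))  # → 1.0
```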

  8. Satellite Dwarf Galaxies in a Hierarchical Universe: Infall Histories, Group Preprocessing, and Reionization

    CERN Document Server

    Wetzel, Andrew R; Garrison-Kimmel, Shea

    2015-01-01

    In the Local Group, almost all satellite dwarf galaxies that are within the virial radius of the Milky Way (MW) and M31 exhibit strong environmental influence. The orbital histories of these satellites provide the key to understanding the role of the MW/M31 halo, lower-mass groups, and cosmic reionization in the evolution of dwarf galaxies. We examine the virial-infall histories of satellites with M_star = 10^{3-9} M_sun using the ELVIS suite of cosmological zoom-in dissipationless simulations of 48 MW/M31-like halos. Satellites at z = 0 fell into the MW/M31 halos typically 5 - 8 Gyr ago at z = 0.5 - 1. However, they first fell into any host halo typically 7 - 10 Gyr ago at z = 0.7 - 1.5. This difference arises because many satellites experienced "group preprocessing" in another host halo, typically of M_vir ~ 10^{10-12} M_sun, before falling into the MW/M31 halos. Lower-mass satellites and/or those closer to the MW/M31 fell in earlier and are more likely to have experienced group preprocessing; ...

  9. Tactile on-chip pre-processing with techniques from artificial retinas

    Science.gov (United States)

    Maldonado-Lopez, R.; Vidal-Verdu, F.; Linan, G.; Roca, E.; Rodriguez-Vazquez, A.

    2005-06-01

    The interest in tactile sensors is increasing as their use in complex unstructured environments is demanded, as in telepresence, minimally invasive surgery, robotics, etc. The matrix of pressure data these devices provide can be processed with many image processing algorithms to extract the required information. However, as in the case of vision chips or artificial retinas, problems arise when the array size and the computation complexity increase. Looking at the skin, the information collected by every mechanoreceptor is not carried to the brain for processing; instead, some complex pre-processing is performed to fit the limited throughput of the nervous system. This is especially important for tasks demanding high bandwidth. Experimental works report that the neural response of skin mechanoreceptors encodes the change in local shape from an offset level rather than the absolute force or pressure distributions. This is also the behavior of the retina, which implements a spatio-temporal averaging. We propose the same strategy for tactile preprocessing, and we show preliminary results when it faces the detection of slip, which involves fast real-time processing.
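
The offset-coding strategy described above (report deviations from a spatio-temporal average, as the retina does) can be sketched on a synthetic taxel array; the kernel sizes and data are illustrative, not the chip's actual circuitry:

```python
# Retina-like tactile preprocessing sketch: subtract a spatio-temporal
# average so that only local changes (e.g. a new contact) stand out.
import numpy as np

def retina_preprocess(frames, spatial=3, temporal=4):
    """frames: (T, H, W) pressure maps -> offset-coded last frame."""
    recent = frames[-temporal:].mean(axis=0)      # temporal average
    pad = spatial // 2
    padded = np.pad(recent, pad, mode="edge")
    smoothed = np.empty_like(recent)
    H, W = recent.shape
    for i in range(H):                             # spatial average
        for j in range(W):
            smoothed[i, j] = padded[i:i + spatial, j:j + spatial].mean()
    return frames[-1] - smoothed                   # deviation from the offset

frames = np.zeros((6, 8, 8))
frames[-1, 4, 4] = 1.0            # sudden local contact in the last frame
out = retina_preprocess(frames)
# The new contact survives the offset subtraction almost intact.
print(out[4, 4] > 0.5)  # → True
```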

  10. Penggunaan Web Crawler Untuk Menghimpun Tweets dengan Metode Pre-Processing Text Mining

    Directory of Open Access Journals (Sweden)

    Bayu Rima Aditya

    2015-11-01

    Full Text Available The amount of data on social media is now very large, but much of it has not yet been exploited or processed into something of value; one example is tweets on the social medium Twitter. This paper describes the results of using a web crawler engine with a text mining pre-processing method. The web crawler engine itself aims to collect tweets through the Twitter API as unstructured text data, which is then represented again in web form. The pre-processing method aims to filter tweets through three stages: cleansing, case folding, and parsing. The application designed in this research uses the waterfall software development model and is implemented in the PHP programming language, while testing uses black-box testing to check whether the design works as expected. The result of this research is an application that can turn the collected tweets into data ready for further processing according to user needs, based on keywords and search dates. This was done because several related studies show that data on social media, especially Twitter, has become a target for companies and institutions seeking to understand public opinion.
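
The three pre-processing stages named above (cleansing, case folding, parsing) might look like this in outline; the regular expressions are illustrative guesses, not the paper's rules, and the example tweet is invented:

```python
# Tweet pre-processing sketch: cleansing -> case folding -> parsing.
import re

def cleanse(tweet):
    """Remove URLs, mentions, hashtags, and non-alphanumeric noise."""
    tweet = re.sub(r"https?://\S+", " ", tweet)
    tweet = re.sub(r"[@#]\w+", " ", tweet)
    return re.sub(r"[^A-Za-z0-9\s]", " ", tweet)

def case_fold(text):
    """Lowercase everything."""
    return text.lower()

def parse(text):
    """Tokenize into words."""
    return text.split()

raw = "Check THIS out @user #promo http://t.co/xyz!!!"
tokens = parse(case_fold(cleanse(raw)))
print(tokens)  # → ['check', 'this', 'out']
```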

  11. Review of Intelligent Techniques Applied for Classification and Preprocessing of Medical Image Data

    Directory of Open Access Journals (Sweden)

    H S Hota

    2013-01-01

    Full Text Available Medical image data such as ECG, EEG, MRI and CT-scan images are the most important means of diagnosing human disease precisely and are widely used by physicians. Problems can be clearly identified with the help of these medical images, and a robust model can classify the medical image data well. In this paper intelligent techniques such as neural networks and fuzzy logic are explored for MRI medical image data to identify tumors in the human brain, and the need for preprocessing of medical image data is discussed. Classification techniques have been used extensively in the field of medical imaging. The conventional method in medical science for medical image data classification is human inspection, which may result in misclassification; this kind of problem identification is impractical for large amounts of data and for noisy data. Noisy data may be produced by technical faults of the machine or by human errors and can lead to misclassification of medical image data. We have collected a number of papers based on neural networks and fuzzy logic, along with hybrid techniques, to explore the efficiency and robustness of the models for brain MRI data. It is concluded that an intelligent model combined with data preprocessing using principal component analysis (PCA) and segmentation may be the competitive model in this domain.
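
As a hedged sketch of the PCA preprocessing the review highlights, the snippet below projects synthetic feature vectors (standing in for flattened image features) onto their leading principal components via SVD; the data, sizes and helper name are invented:

```python
# PCA-based dimensionality reduction sketch for image feature vectors.
import numpy as np

def pca_reduce(X, k):
    """Project rows of X onto the first k principal components."""
    Xc = X - X.mean(axis=0)
    # SVD of the centered data: rows of Vt are the principal axes.
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(1)
X = rng.standard_normal((50, 20))   # 50 "images", 20 features each
Z = pca_reduce(X, 5)
print(Z.shape)  # → (50, 5)
```

The component scores come out ordered by explained variance, so downstream classifiers can work with the first few columns only.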

  12. Statistical Downscaling Output GCM Modeling with Continuum Regression and Pre-Processing PCA Approach

    Directory of Open Access Journals (Sweden)

    Sutikno Sutikno

    2010-08-01

    Full Text Available One of the climate models used to predict climatic conditions is the Global Circulation Model (GCM). A GCM is a computer-based model consisting of various numerical, deterministic equations that follow the rules of physics. GCMs are a main tool for predicting climate and weather, and a primary information source for reviewing the effects of climate change. The Statistical Downscaling (SD) technique is used to bridge the large-scale GCM with the small scale of the study area. GCM data are spatial and temporal, so spatial correlation between data on different grid points in a single domain is likely. Multicollinearity problems require pre-processing of the predictor data X. Continuum Regression (CR) with pre-processing by Principal Component Analysis (PCA) is an alternative for SD modelling. CR is a method developed by Stone and Brooks (1990); it is a generalization of the Ordinary Least Squares (OLS), Principal Component Regression (PCR) and Partial Least Squares (PLS) methods, used to overcome multicollinearity problems. Data processing for the stations in Ambon, Pontianak, Losarang, Indramayu and Yuntinyuat shows that the RMSEP and predictive R2 values in the 8x8 and 12x12 domains obtained with the CR method are better than those from PCR and PLS.

  13. Fast data preprocessing with Graphics Processing Units for inverse problem solving in light-scattering measurements

    Science.gov (United States)

    Derkachov, G.; Jakubczyk, T.; Jakubczyk, D.; Archer, J.; Woźniak, M.

    2017-07-01

    Utilising the Compute Unified Device Architecture (CUDA) platform for Graphics Processing Units (GPUs) enables a significant reduction of computation time at moderate cost by means of parallel computing. In the paper [Jakubczyk et al., Opto-Electron. Rev., 2016] we reported using a GPU for Mie scattering inverse problem solving (up to an 800-fold speed-up). Here we report the development of two subroutines utilising the GPU at the data preprocessing stages of the inversion procedure: (i) a subroutine, based on ray tracing, for finding the spherical aberration correction function; (ii) a subroutine performing the conversion of an image to a 1D distribution of light intensity versus azimuth angle (i.e. a scattering diagram), fed from a movie-reading CPU subroutine running in parallel. All subroutines are incorporated in the PikeReader application, which we make available in a GitHub repository. PikeReader returns a sequence of intensity distributions versus a common azimuth angle vector, corresponding to the recorded movie. We obtained an overall ~400-fold speed-up of calculations at the data preprocessing stages using CUDA code running on a GPU in comparison to single-thread MATLAB-only code running on a CPU.
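
The image-to-scattering-diagram conversion of subroutine (ii) amounts to azimuthal binning of pixel intensities around the image center. Below is a plain-CPU NumPy sketch of that idea (the paper's version is a CUDA kernel; the bin count and test image are illustrative):

```python
# Azimuthal binning sketch: average pixel intensity per azimuth-angle bin.
import numpy as np

def scattering_diagram(image, n_bins=360):
    """Convert a 2-D image to mean intensity versus azimuth angle."""
    h, w = image.shape
    y, x = np.mgrid[0:h, 0:w]
    theta = np.arctan2(y - h / 2.0, x - w / 2.0)            # (-pi, pi]
    bins = ((theta + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    sums = np.bincount(bins.ravel(), weights=image.ravel(), minlength=n_bins)
    counts = np.bincount(bins.ravel(), minlength=n_bins)
    return sums / np.maximum(counts, 1)                      # per-bin mean

img = np.ones((64, 64))                # uniform image -> flat diagram
profile = scattering_diagram(img)
print(profile.shape)       # → (360,)
print(float(profile.max()))  # → 1.0
```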

  14. Evaluation of preprocessing, mapping and postprocessing algorithms for analyzing whole genome bisulfite sequencing data.

    Science.gov (United States)

    Tsuji, Junko; Weng, Zhiping

    2016-11-01

    Cytosine methylation regulates many biological processes such as gene expression, chromatin structure and chromosome stability. The whole genome bisulfite sequencing (WGBS) technique measures the methylation level at each cytosine throughout the genome. There are an increasing number of publicly available pipelines for analyzing WGBS data, reflecting many choices of read mapping algorithms as well as preprocessing and postprocessing methods. We simulated single-end and paired-end reads based on three experimental data sets, and comprehensively evaluated 192 combinations of three preprocessing, five postprocessing and five widely used read mapping algorithms. We also compared paired-end data with single-end data at the same sequencing depth for performance of read mapping and methylation level estimation. Bismark and LAST were the most robust mapping algorithms. We found that Mott trimming and quality filtering individually improved the performance of both read mapping and methylation level estimation, but combining them did not lead to further improvement. Furthermore, we confirmed that paired-end sequencing reduced error rate and enhanced sensitivity for both read mapping and methylation level estimation, especially for short reads and in repetitive regions of the human genome.
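
A toy version of the read-preprocessing steps evaluated above (quality filtering and tail trimming): the Phred+33 encoding is the common FASTQ convention, but the thresholds are illustrative defaults, not the paper's settings:

```python
# FASTQ-style quality preprocessing sketch: trim low-quality 3' tails and
# filter reads by mean Phred score.

def phred(qual_string, offset=33):
    """Decode a Phred+33 quality string to integer scores."""
    return [ord(c) - offset for c in qual_string]

def trim_tail(seq, qual, min_q=20):
    """Trim from the 3' end while the quality stays below min_q."""
    scores = phred(qual)
    end = len(scores)
    while end > 0 and scores[end - 1] < min_q:
        end -= 1
    return seq[:end], qual[:end]

def passes_filter(qual, min_mean_q=25):
    """Keep the read only if its mean quality is high enough."""
    scores = phred(qual)
    return sum(scores) / len(scores) >= min_mean_q

seq, qual = trim_tail("ACGTACGT", "IIIIII##")   # '#' is Phred 2, 'I' is 40
print(seq)                  # → ACGTAC
print(passes_filter(qual))  # → True
```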

  15. Data Acquisition and Preprocessing in Studies on Humans: What Is Not Taught in Statistics Classes?

    Science.gov (United States)

    Zhu, Yeyi; Hernandez, Ladia M; Mueller, Peter; Dong, Yongquan; Forman, Michele R

    2013-01-01

    The aim of this paper is to address issues in research that may be missing from statistics classes and important for (bio-)statistics students. In the context of a case study, we discuss data acquisition and preprocessing steps that fill the gap between research questions posed by subject matter scientists and statistical methodology for formal inference. Issues include participant recruitment, data collection training and standardization, variable coding, data review and verification, data cleaning and editing, and documentation. Despite the critical importance of these details in research, most of these issues are rarely discussed in an applied statistics program. One reason for the lack of more formal training is the difficulty in addressing the many challenges that can possibly arise in the course of a study in a systematic way. This article can help to bridge this gap between research questions and formal statistical inference by using an illustrative case study for a discussion. We hope that reading and discussing this paper and practicing data preprocessing exercises will sensitize statistics students to these important issues and achieve optimal conduct, quality control, analysis, and interpretation of a study.

  16. A data preprocessing strategy for metabolomics to reduce the mask effect in data analysis.

    Science.gov (United States)

    Yang, Jun; Zhao, Xinjie; Lu, Xin; Lin, Xiaohui; Xu, Guowang

    2015-01-01

    Highlights: (i) a data preprocessing strategy was developed to cope with missing values and with the mask effect caused by the high variation of abundant metabolites; (ii) a new method, 'x-VAST', was developed to amend the enlargement of measurement deviation; (iii) applying this strategy, several masked low-abundance differential metabolites were rescued. Metabolomics is a booming research field. Its success relies heavily on the discovery of differential metabolites by comparing different data sets (for example, patients vs. controls). One of the challenges is that differences in low-abundance metabolites between groups are often masked by the high variation of abundant metabolites. To address this challenge, a novel three-step data preprocessing strategy was proposed in this study. In step 1, a 'modified 80% rule' was used to reduce the effect of missing values; in step 2, unit-variance and Pareto scaling methods were used to reduce the mask effect of the abundant metabolites; in step 3, to correct the adverse effect of scaling, stability information of the variables, deduced from intensity information and class information, was used to assign suitable weights to the variables. When applied to an LC/MS-based metabolomics data set from a chronic hepatitis B patient study and to two simulated data sets, the mask effect was found to be partially eliminated and several new low-abundance differential metabolites were rescued.
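    Steps 1 and 2 of the strategy can be sketched as follows. The group-wise detection threshold and the scaling formulas are the standard ones from the literature; the function names and array layout (samples in rows, metabolites in columns) are our assumptions.

```python
import numpy as np

def eighty_percent_rule(X, groups, threshold=0.8):
    """'Modified 80% rule' sketch: keep a metabolite (column of X) if it
    is detected (non-NaN) in at least `threshold` of the samples of ANY
    single group, so group-specific metabolites are not discarded."""
    keep = np.zeros(X.shape[1], dtype=bool)
    for g in np.unique(groups):
        sub = X[groups == g]
        keep |= np.mean(~np.isnan(sub), axis=0) >= threshold
    return keep

def pareto_scale(X):
    """Pareto scaling: centre each metabolite and divide by the square
    root of its standard deviation, damping abundant metabolites less
    aggressively than unit-variance scaling."""
    mu, sd = np.nanmean(X, axis=0), np.nanstd(X, axis=0)
    return (X - mu) / np.sqrt(sd)
```

    After Pareto scaling each column's standard deviation equals the square root of its original one, which is what softens the dominance of abundant metabolites.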

  17. Effective Preprocessing Procedures Virtually Eliminate Distance-Dependent Motion Artifacts in Resting State FMRI.

    Science.gov (United States)

    Jo, Hang Joon; Gotts, Stephen J; Reynolds, Richard C; Bandettini, Peter A; Martin, Alex; Cox, Robert W; Saad, Ziad S

    2013-05-21

    Artifactual sources of resting-state (RS) FMRI can originate from head motion, physiology, and hardware. Of these sources, motion has received considerable attention and was found to induce corrupting effects by differentially biasing correlations between regions depending on their distance. Numerous corrective approaches have relied on the identification and censoring of high-motion time points and the use of the brain-wide average time series as a nuisance regressor to which the data are orthogonalized (Global Signal Regression, GSReg). We first replicate the previously reported head-motion bias on correlation coefficients using data generously contributed by Power et al. (2012). We then show that while motion can be the source of artifact in correlations, the distance-dependent bias, taken to be a manifestation of the motion effect on correlation, is exacerbated by the use of GSReg. Put differently, correlation estimates obtained after GSReg are more susceptible to the presence of motion and, by extension, to the levels of censoring. More generally, the effect of motion on correlation estimates depends on the preprocessing steps leading to the correlation estimate, with certain approaches performing markedly worse than others. For this purpose, we consider various models for RS FMRI preprocessing and show that the WMeLOCAL denoising approach, a subset of ANATICOR discussed by Jo et al. (2010), results in minimal sensitivity to motion and thereby reduces the dependence of correlation results on censoring.
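    Global Signal Regression, the nuisance-regression step discussed above, amounts to orthogonalizing every regional time series against the brain-wide average. A minimal sketch using ordinary least squares with an intercept; the (time points, regions) array layout is our assumption:

```python
import numpy as np

def gsreg(data):
    """Global Signal Regression sketch: orthogonalise each region's time
    series against the brain-wide average signal (plus an intercept).
    `data` has shape (time points, regions); residuals are returned."""
    gs = data.mean(axis=1, keepdims=True)              # global signal
    X = np.hstack([np.ones_like(gs), gs])              # intercept + GS
    beta, *_ = np.linalg.lstsq(X, data, rcond=None)    # OLS fit per region
    return data - X @ beta                             # orthogonalised data
```

    By construction the residuals are orthogonal to the global signal, which is exactly the property that makes correlations computed afterwards sensitive to how the global signal was contaminated by motion.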

  18. Data preprocessing method for liquid chromatography-mass spectrometry based metabolomics.

    Science.gov (United States)

    Wei, Xiaoli; Shi, Xue; Kim, Seongho; Zhang, Li; Patrick, Jeffrey S; Binkley, Joe; McClain, Craig; Zhang, Xiang

    2012-09-18

    A set of data preprocessing algorithms for peak detection and peak list alignment is reported for the analysis of liquid chromatography-mass spectrometry (LC-MS)-based metabolomics data. For spectrum deconvolution, peak picking is performed at the extracted ion chromatogram (XIC) level. To estimate and remove the noise in XICs, each XIC is first segmented into several peak groups based on the continuity of scan number, and the noise level is estimated from all XIC signals except the regions potentially containing metabolite ion peaks. After noise removal, molecular ion peaks are detected using both the first and second derivatives, followed by an efficient exponentially modified Gaussian-based peak deconvolution method for peak fitting. A two-stage alignment algorithm is also developed, in which the retention times of all peaks are first transferred into the z-score domain and the peaks are then aligned based on their mixture scores after retention time correction using partial linear regression. Analysis of a set of spike-in LC-MS data from three groups of samples containing 16 metabolite standards mixed with metabolite extract from mouse livers demonstrates that the developed data preprocessing method performs better than two existing popular data analysis packages, MZmine2.6 and XCMS2, for peak picking, peak list alignment, and quantification.
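    The first alignment stage, transferring retention times into the z-score domain and matching peaks across runs, can be sketched as below. The greedy nearest-neighbour matching and the tolerance value are simplifications of the paper's mixture-score approach, not its actual procedure.

```python
import numpy as np

def zscore_rt(rts):
    """Stage one: map one run's peak retention times into the z-score
    domain so runs with different time scales become comparable."""
    rts = np.asarray(rts, dtype=float)
    return (rts - rts.mean()) / rts.std()

def match_peaks(rts_a, rts_b, tol=0.1):
    """Greedy illustration of stage two: pair each peak of run A with the
    nearest run-B peak in z-score space, accepted within tolerance `tol`."""
    za, zb = zscore_rt(rts_a), zscore_rt(rts_b)
    pairs = []
    for i, z in enumerate(za):
        j = int(np.argmin(np.abs(zb - z)))
        if abs(zb[j] - z) <= tol:
            pairs.append((i, j))
    return pairs
```

    Two runs whose retention times differ by a roughly linear drift land on nearly identical z-scores, so the pairing survives the drift.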

  19. A Lightweight Data Preprocessing Strategy with Fast Contradiction Analysis for Incremental Classifier Learning

    Directory of Open Access Journals (Sweden)

    Simon Fong

    2015-01-01

    Full Text Available A prime objective in constructing data stream mining models is to achieve good accuracy, fast learning, and robustness to noise. Although many techniques have been proposed in the past, efforts to improve the accuracy of classification models have been somewhat disparate. These techniques include, but are not limited to, feature selection, dimensionality reduction, and the removal of noise from training data. One limitation common to all of these techniques is the assumption that the full training dataset must be applied. Although this has been effective for traditional batch training, it may not be practical for incremental classifier learning, also known as data stream mining, where only a single pass over the data stream is seen at a time. Because data streams are potentially unbounded (the so-called big data phenomenon), preprocessing time must be kept to a minimum. This paper introduces a new data preprocessing strategy suitable for the progressive purging of noisy data from the training dataset without the need to process the whole dataset at once. A computer simulation shows that this strategy provides the significant benefit of allowing the dynamic removal of bad records from the incremental classifier learning process.
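    The abstract does not specify the fast contradiction analysis itself; one common window-wise reading, in which a record is purged when its label contradicts the majority of its nearest neighbours within the current window only, is sketched here as an assumption.

```python
import numpy as np

def purge_contradictions(X, y, k=3):
    """Window-wise contradiction-purging sketch for stream mining: a
    record is dropped when its label disagrees with the majority label
    of its k nearest neighbours inside the current window, so the full
    stream never needs to be held in memory.  Returns kept indices."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    keep = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        nn = np.argsort(d)[1:k + 1]            # skip the record itself
        labels, counts = np.unique(y[nn], return_counts=True)
        if labels[np.argmax(counts)] == y[i]:  # label agrees with majority
            keep.append(i)
    return keep
```

    Each arriving window is purged independently, which keeps the cost per window constant regardless of how long the stream runs.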

  20. Robust symmetrical number system preprocessing for minimizing encoding errors in photonic analog-to-digital converters

    Science.gov (United States)

    Arvizo, Mylene R.; Calusdian, James; Hollinger, Kenneth B.; Pace, Phillip E.

    2011-08-01

    A photonic analog-to-digital converter (ADC) preprocessing architecture based on the robust symmetrical number system (RSNS) is presented. The RSNS preprocessing architecture is a modular scheme in which a modulus number of comparators is used at the output of each Mach-Zehnder modulator channel. The number of comparators with a logic 1 in each channel represents the integer value within that RSNS modulus sequence. When considered together, the integers within each sequence change one at a time at the next code position, resulting in an integer Gray code property. The RSNS ADC has the feature that the maximum nonlinearity is less than a least significant bit (LSB). Although the observed dynamic range (the greatest length of combined sequences that contains no ambiguities) of the RSNS ADC is less than that of the optimum symmetrical number system ADC, the integer Gray code property makes it attractive for error control. A prototype is presented to demonstrate the feasibility of the concept and to show the important RSNS property that the largest nonlinearity is always less than an LSB. Practical considerations related to multi-gigahertz implementations are also discussed.
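    The RSNS folding and its integer Gray code property can be illustrated with the generic textbook construction, in which each integer is held for N samples and channel h is lagged by h samples; the moduli below are illustrative, not the prototype's design values.

```python
def rsns_code(n, moduli):
    """RSNS encoding sketch following the generic construction: channel h
    folds the input into a symmetric (triangle) sequence of period 2*m_h,
    with each integer held N samples and channel h lagged by h samples.
    The moduli and lags here are illustrative assumptions."""
    N = len(moduli)
    code = []
    for h, m in enumerate(moduli):
        k = ((n + h) // N) % (2 * m)           # position in the folded period
        code.append(k if k <= m else 2 * m - k)
    return code
```

    Stepping n by one changes exactly one channel's integer by exactly one, which is the integer Gray code property the abstract relies on for error control.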

  1. Effective Preprocessing Procedures Virtually Eliminate Distance-Dependent Motion Artifacts in Resting State FMRI

    Directory of Open Access Journals (Sweden)

    Hang Joon Jo

    2013-01-01

    Full Text Available Artifactual sources of resting-state (RS) FMRI can originate from head motion, physiology, and hardware. Of these sources, motion has received considerable attention and was found to induce corrupting effects by differentially biasing correlations between regions depending on their distance. Numerous corrective approaches have relied on the identification and censoring of high-motion time points and on the use of the brain-wide average time series as a nuisance regressor to which the data are orthogonalized (Global Signal Regression, GSReg). We replicate the previously reported head-motion bias on correlation coefficients and then show that while motion can be the source of artifact in correlations, the distance-dependent bias is exacerbated by GSReg. Put differently, correlation estimates obtained after GSReg are more susceptible to the presence of motion and, by extension, to the levels of censoring. More generally, the effect of motion on correlation estimates depends on the preprocessing steps leading to the correlation estimate, with certain approaches performing markedly worse than others. For this purpose, we consider various models for RS FMRI preprocessing and show that the local white matter regressor (WMeLOCAL), a subset of ANATICOR, results in minimal sensitivity to motion and thereby reduces the dependence of correlation results on censoring.

  2. MODIStsp: An R package for automatic preprocessing of MODIS Land Products time series

    Science.gov (United States)

    Busetto, L.; Ranghetti, L.

    2016-12-01

    MODIStsp is a new R package that automates the creation of raster time series derived from MODIS Land Products. It performs several preprocessing steps (e.g. download, mosaicking, reprojection and resizing) on MODIS products over a selected time period and area. All processing parameters can be set with a user-friendly GUI, allowing users to select which specific layers of the original MODIS HDF files have to be processed and which Quality Indicators have to be extracted from the aggregated MODIS Quality Assurance layers. Moreover, the tool allows on-the-fly computation of time series of spectral indices (either standard or custom-specified by the user through the GUI) from surface reflectance bands. Outputs are saved as single-band rasters corresponding to each available acquisition date and output layer. Virtual files can also be created, allowing easy access to the entire time series as a single file from common image processing/GIS software or from R scripts. Non-interactive execution within an R script and stand-alone execution outside an R environment exploiting a previously created Options File are also possible; the latter allows scheduling MODIStsp runs to automatically update a time series when a new image is available. The proposed software constitutes a very useful tool for the remote sensing community, since it performs all the main preprocessing steps required for the creation of MODIS time series within a common framework, without requiring any particular programming skills of its users.
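    The on-the-fly spectral-index computation reduces to simple band arithmetic, with NDVI as the standard example. MODIStsp itself is an R package; the sketch below is a language-neutral illustration of the arithmetic, not its implementation.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI from surface reflectance bands, the standard example of an
    on-the-fly spectral index: (NIR - red) / (NIR + red), with NaN where
    the denominator vanishes."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    denom = nir + red
    safe = np.where(denom == 0, 1.0, denom)    # avoid division by zero
    return np.where(denom == 0, np.nan, (nir - red) / safe)
```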

  3. Design of a CPS-Based Smart Campus Platform

    Institute of Scientific and Technical Information of China (English)

    陈华鹏; 林杰

    2014-01-01

    CPS, the Cyber-Physical System, denotes a new model of computation. Prototype cyber-physical systems already exist in everyday life. Using CPS technology, existing campus resources can be integrated into an intelligent campus service platform, on which a wide range of smart-campus applications can be built.

  4. CPS Architecture of Identifier-Based Universal Networks

    Institute of Scientific and Technical Information of China (English)

    申聪聪; 戴超凡

    2012-01-01

    This paper analyzes the architecture and identifier mechanism of the Cyber-Physical System (CPS), designs a system structure based on the resource port, and proposes a three-layer architecture model for identifier-based universal networks, with each layer analyzed and described in depth. The Port layer is designed as an important middleware layer for CPS resource addressing and location and for establishing communication connections. Finally, taking a smart-home prototype system as an example, the feasibility of user access to and control of CPS resources through the identifier-based universal network architecture is validated.

  5. A preprocessing tool for removing artifact from cardiac RR interval recordings using three-dimensional spatial distribution mapping.

    Science.gov (United States)

    Stapelberg, Nicolas J C; Neumann, David L; Shum, David H K; McConnell, Harry; Hamilton-Craig, Ian

    2016-04-01

    Artifact is common in cardiac RR interval data recorded for heart rate variability (HRV) analysis. A novel algorithm for artifact detection and interpolation in RR interval data is described. It is based on spatial distribution mapping of RR interval magnitude and of relationships to adjacent values in three dimensions. The characteristics of normal physiological RR intervals and artifact intervals were established using 20 technician-assessed 24-h human cardiac recordings. The algorithm was incorporated into a preprocessing tool and validated using 30 artificial RR (ARR) interval data files to which known quantities of artifact (0.5%, 1%, 2%, 3%, 5%, 7%, 10%) were added. The impact of preprocessing ARR files with 1% added artifact was also assessed using 10 time-domain and frequency-domain HRV metrics. The preprocessing tool was then used to preprocess 69 24-h human cardiac recordings. The tool was able to remove artifact from technician-assessed human cardiac recordings (sensitivity 0.84, SD = 0.09; specificity 1.00, SD = 0.01) and from the artificial data files. The removal of artifact had a low impact on time-domain and frequency-domain HRV metrics (0% to 2.5% change in values). This novel preprocessing tool can be used with 24-h human cardiac recordings to remove artifact while minimally affecting physiological data, and therefore has a low impact on HRV measures of that data.
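    A much-simplified stand-in for the three-dimensional spatial-distribution idea: embed each interval with its two neighbours as a 3-D point, flag points far from the cloud by a robust z-score, and linearly interpolate over flagged samples. The outlier statistic and threshold are our assumptions, not the validated algorithm.

```python
import numpy as np

def clean_rr(rr, z_thresh=4.0):
    """Sketch of 3-D spatial-distribution artifact removal: embed each
    interval as (RR[i-1], RR[i], RR[i+1]), flag points whose distance to
    the median point is a robust-z outlier (neighbouring triples of a
    spike get flagged too), and interpolate over flagged samples."""
    rr = np.array(rr, dtype=float)            # work on a copy
    pts = np.column_stack([rr[:-2], rr[1:-1], rr[2:]])
    centre = np.median(pts, axis=0)
    d = np.linalg.norm(pts - centre, axis=1)
    mad = np.median(np.abs(d - np.median(d))) or 1.0
    bad = np.zeros(len(rr), dtype=bool)
    bad[1:-1] = (d - np.median(d)) / (1.4826 * mad) > z_thresh
    good = ~bad
    rr[bad] = np.interp(np.flatnonzero(bad), np.flatnonzero(good), rr[good])
    return rr, bad
```

    A single 2000 ms spike in an 800 ms series is flagged and replaced by the interpolated physiological value.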

  6. Increasing conclusiveness of metabonomic studies by chem-informatic preprocessing of capillary electrophoretic data on urinary nucleoside profiles.

    Science.gov (United States)

    Szymańska, E; Markuszewski, M J; Capron, X; van Nederkassel, A-M; Heyden, Y Vander; Markuszewski, M; Krajka, K; Kaliszan, R

    2007-01-17

    Nowadays, bioinformatics offers advanced tools and procedures of data mining aimed at finding consistent patterns or systematic relationships between variables. Numerous metabolite concentrations can readily be determined in a given biological system by high-throughput analytical methods. However, such raw analytical data comprise noninformative components due to the many disturbances normally occurring in the analysis of biological samples. Advanced chemometric data preprocessing methods can help eliminate those unwanted components. Here, such methods are applied to electrophoretic nucleoside profiles in urine samples of cancer patients and healthy volunteers. The electrophoretic nucleoside profiles were obtained under the following conditions: 100 mM borate, 72.5 mM phosphate, 160 mM SDS, pH 6.7; 25 kV voltage, 30 degrees C temperature; untreated fused silica capillary of 70 cm effective length and 50 microm I.D. Several advanced preprocessing tools were applied for baseline correction, denoising and alignment of the electrophoretic data, and the approach was compared to the standard procedure of electrophoretic peak integration. The best preprocessing results were obtained after applying the so-called correlation optimized warping (COW) to align the data. Principal component analysis (PCA) of the preprocessed data shows a clearly better consistency of the nucleoside electrophoretic profiles with the health status of subjects than PCA of peak areas of the original data (without preprocessing).
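    Full COW warps segment lengths by dynamic programming; as a much simpler illustration of correlation-driven alignment, the sketch below finds only the single rigid shift that maximizes correlation with a reference profile.

```python
import numpy as np

def align_by_xcorr(profile, reference):
    """Rigid-shift alignment sketch: find the lag maximising the
    cross-correlation with the reference and undo it.  Full COW
    additionally warps segment lengths; this captures only the
    correlation-driven core of the idea."""
    p = profile - profile.mean()
    r = reference - reference.mean()
    xc = np.correlate(p, r, mode="full")
    shift = int(np.argmax(xc)) - (len(p) - 1)  # lag of best correlation
    return np.roll(profile, -shift), shift
```

    A Gaussian peak displaced by seven samples is recovered exactly, which is the behaviour one expects before moving on to the piecewise warping that COW performs.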

  7. Evaluating the reliability of different preprocessing steps to estimate graph theoretical measures in resting state fMRI data.

    Science.gov (United States)

    Aurich, Nathassia K; Alves Filho, José O; Marques da Silva, Ana M; Franco, Alexandre R

    2015-01-01

    With resting-state functional MRI (rs-fMRI) there are a variety of post-processing methods that can be used to quantify the human brain connectome. However, there is also a choice of which preprocessing steps will be used prior to calculating the functional connectivity of the brain. In this manuscript, we tested seven different preprocessing schemes and assessed the reliability between, and reproducibility within, the various strategies by means of graph theoretical measures. The different preprocessing schemes were tested on a publicly available dataset, which includes rs-fMRI data of healthy controls. The brain was parcellated into 190 nodes and four graph theoretical (GT) measures were calculated: global efficiency (GEFF), characteristic path length (CPL), average clustering coefficient (ACC), and average local efficiency (ALE). Our findings indicate that results can differ significantly depending on which preprocessing steps are selected. We also found dependence between motion and GT measurements in most preprocessing strategies. We conclude that using censoring based on outliers within the functional time series as a preprocessing step increases the reliability of GT measurements and reduces their dependency on head motion.
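    The four GT measures can be computed for a small, unweighted, connected graph with nothing more than breadth-first search. A self-contained sketch with the adjacency given as a dict of neighbour lists; real pipelines typically use dedicated graph libraries.

```python
from collections import deque

def bfs_dists(adj, s):
    """Hop distances from s in an unweighted graph (adjacency dict)."""
    dist = {s: 0}
    queue = deque([s])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def global_efficiency(adj):
    """GEFF: mean of 1/d over all ordered node pairs (0 if unreachable)."""
    n = len(adj)
    acc = sum(1.0 / d for s in adj for v, d in bfs_dists(adj, s).items() if v != s)
    return acc / (n * (n - 1))

def char_path_length(adj):
    """CPL: mean shortest-path length over all ordered node pairs."""
    n = len(adj)
    acc = sum(d for s in adj for v, d in bfs_dists(adj, s).items() if v != s)
    return acc / (n * (n - 1))

def avg_clustering(adj):
    """ACC: mean per-node clustering coefficient (0 for degree < 2)."""
    total = 0.0
    for u, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        total += 2.0 * links / (k * (k - 1))
    return total / len(adj)

def local_efficiency(adj):
    """ALE: mean over nodes of the global efficiency of each node's
    neighbourhood subgraph."""
    total = 0.0
    for u, nbrs in adj.items():
        if len(nbrs) < 2:
            continue
        sub = {v: [w for w in adj[v] if w in nbrs] for v in nbrs}
        total += global_efficiency(sub)
    return total / len(adj)
```

    On a complete graph all four measures equal 1.0, a handy sanity check before applying them to a 190-node parcellation.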

  8. A Method for Determining the Reliability Index of a Materiel Subsystem

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a method for determining the reliability index of a subsystem in the materiel demonstration phase: the AHP Failure Rate Method. It fully considers the various factors which influence a subsystem reliability index and solves a difficult problem that puzzles demonstration personnel.
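    The abstract gives no algorithmic detail; one common reading of an AHP-based failure-rate method, in which pairwise-comparison weights (row geometric mean) apportion a system-level failure-rate budget among subsystems, is sketched below purely as an assumption.

```python
import numpy as np

def ahp_weights(P):
    """AHP priority weights from a pairwise-comparison matrix P
    (P[i, j] = judged importance of criterion i over j), using the
    row geometric-mean approximation of the principal eigenvector."""
    g = np.prod(P, axis=1) ** (1.0 / P.shape[0])
    return g / g.sum()

def allocate_failure_rates(lambda_system, P):
    """Hypothetical AHP failure-rate allocation: split a system-level
    failure-rate budget among subsystems in proportion to their AHP
    weights (a larger weight here means the subsystem is allowed a
    larger share of the failure-rate budget)."""
    return lambda_system * ahp_weights(P)
```

    With a 2x2 comparison matrix judging subsystem 1 twice as failure-tolerant as subsystem 2, a budget of 3e-5 per hour splits 2:1.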

  9. The Organization of the Distance Teaching Sub-System in an Open University.

    Science.gov (United States)

    Chacon, Fabio J.

    The problem of finding an adequate organization for the distance teaching subsystem in the Open University of Venezuela (Universidad Nacional Abierta) is analyzed. Problems facing this subsystem concern: communications with the headquarters and within the learning centers network, interaction with the environment in order to create a favorable…

  10. Internet Use and Child Development: Validation of the Ecological Techno-Subsystem

    Science.gov (United States)

    Johnson, Genevieve Marie

    2010-01-01

    Johnson and Puplampu recently proposed the "ecological techno-subsystem", a refinement to Bronfenbrenner's theoretical organization of environmental influences on child development. The ecological techno-subsystem includes child interaction with both living (e.g., peers) and nonliving (e.g., hardware) elements of communication,…

  11. Owens-Illinois subsystem design package for the SEC-601 air-cooled solar collector

    Science.gov (United States)

    1979-01-01

    The subsystem design of the SEC-601 solar collector was evaluated. The collector is of modular design and is approximately 12 feet three inches wide and eight feet seven inches tall. It contains 72 collector tube elements and weighs approximately 300 pounds. Included in this report are the subsystem performance specifications and the assembly and installation drawings of the solar collectors and manifold.

  12. BEHAVE: fire behavior prediction and fuel modeling system-BURN Subsystem, part 1

    Science.gov (United States)

    Patricia L. Andrews

    1986-01-01

    Describes BURN Subsystem, Part 1, the operational fire behavior prediction subsystem of the BEHAVE fire behavior prediction and fuel modeling system. The manual covers operation of the computer program, assumptions of the mathematical models used in the calculations, and application of the predictions.

  13. Optimization of a thermoelectric generator subsystem for high temperature PEM fuel cell exhaust heat recovery

    DEFF Research Database (Denmark)

    Gao, Xin; Andreasen, Søren Juhl; Kær, Søren Knudsen

    2014-01-01

    In previous work, a thermoelectric (TE) exhaust heat recovery subsystem for a high temperature polymer electrolyte membrane (HT-PEM) fuel cell stack was developed and modeled. Numerical simulations were conducted and have identified an optimized subsystem configuration and 4 types of compact heat...

  14. Variants of Topology Editing Strategy in the Subsystem of Printed Circuit Boards Manufacturability Improvement

    OpenAIRE

    Roman Panchak; Konstantyn Kolesnyk; Marian Lobur

    2011-01-01

    This paper focuses on the variants of printed circuit board topology editing strategies implemented in the subsystem for automatic printed circuit board topology editing. Depending on the requirements for the printed circuit board topology, a subsystem user can create variants of editing strategies in order to minimize the number of technologically justified places with a minimum clearance between the elements of the topology.

  17. Selected Lessons Learned in Space Shuttle Orbiter Propulsion and Power Subsystems

    Science.gov (United States)

    Hernandez, Francisco J.; Martinez, Hugo; Ryan, Abigail; Westover, Shayne; Davies, Frank

    2011-01-01

    Over its 30 years of space flight history, plus nearly 10 years of design, development, test, and evaluation, the Space Shuttle Orbiter is full of lessons learned in all of its numerous and complex subsystems. In the current paper, only selected lessons learned in the areas of the Orbiter propulsion and power subsystems will be described. The particular Orbiter subsystems include: Auxiliary Power Unit (APU), Hydraulics and Water Spray Boiler (WSB), Mechanical Flight Controls, Main Propulsion System (MPS), Fuel Cells and Power Reactant Storage and Distribution (PRSD), Orbital Maneuvering System (OMS), Reaction Control System (RCS), Electrical Power Distribution and Control (EPDC), electrical wiring, and pyrotechnics. Given the complexity and extensive history of each of these subsystems, and the limited scope of this paper, it is impossible to include most of the lessons learned; instead the attempt will be to present a selected few key lessons, in the judgment of the authors. Each subsystem is presented separately, beginning with an overview of the hardware and its function, followed by a short description of a few historical problems and their lessons, and then a more comprehensive table listing the major subsystem problems and lessons. These tables serve as a quick reference for lessons learned in each subsystem. In addition, this paper establishes common lessons across subsystems and concentrates on those lessons deemed to have the highest applicability to future space flight programs.

  18. On image pre-processing for PIV of single- and two-phase flows over reflecting objects

    Energy Technology Data Exchange (ETDEWEB)

    Deen, Niels G.; Willems, Paul; Sint Annaland, Martin van; Kuipers, J.A.M.; Lammertink, Rob G.H.; Kemperman, Antoine J.B.; Wessling, Matthias; Meer, Walter G.J. van der [University of Twente, Faculty of Science and Technology, Institute of Mechanics, Processes and Control Twente (IMPACT), Enschede (Netherlands)

    2010-08-15

    A novel image pre-processing scheme for PIV of single- and two-phase flows over reflecting objects which does not require the use of additional hardware is discussed. The approach for single-phase flow consists of image normalization and intensity stretching followed by background subtraction. For two-phase flow, an additional masking step is added after the background subtraction. The effectiveness of the pre-processing scheme is shown for two examples: PIV of single-phase flow in spacer-filled channels and two-phase flow in these channels. The pre-processing scheme increased the displacement peak detectability significantly and produced high quality vector fields, without the use of additional hardware. (orig.)
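    The described chain (normalization, intensity stretching, background subtraction, and, for two-phase flow, masking) can be sketched as below. How the background and mask are obtained (e.g. a pixelwise minimum over frames, a reflecting-object mask) is assumed, not taken from the paper.

```python
import numpy as np

def preprocess_piv(img, background, mask=None):
    """PIV image pre-processing sketch: normalise and stretch intensities
    to [0, 1], subtract a given background estimate, and, for the
    two-phase case, zero out a reflecting-object mask."""
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    img = (img - lo) / (hi - lo if hi > lo else 1.0)  # intensity stretching
    img = np.clip(img - background, 0.0, None)        # background subtraction
    if mask is not None:
        img = np.where(mask, 0.0, img)                # masking (two-phase case)
    return img
```

    Suppressing the static reflections this way raises the displacement-peak detectability of the subsequent cross-correlation without any extra hardware, which is the paper's central point.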

  19. A simpler method of preprocessing MALDI-TOF MS data for differential biomarker analysis: stem cell and melanoma cancer studies

    Directory of Open Access Journals (Sweden)

    Tong Dong L

    2011-09-01

    Full Text Available Introduction: Raw spectral data from matrix-assisted laser desorption/ionisation time-of-flight (MALDI-TOF) MS profiling techniques usually contain complex information not readily providing biological insight into disease. Associating identified features within raw data with a known peptide is extremely difficult. Data preprocessing to remove uncertainty characteristics in the data is normally required before performing any further analysis. This study proposes an alternative yet simple solution for preprocessing raw MALDI-TOF-MS data for the identification of candidate marker ions. Two in-house MALDI-TOF-MS data sets from two different sample sources (melanoma serum and cord blood plasma) are used in our study. Method: Raw MS spectral profiles were preprocessed using the proposed approach to identify peak regions in the spectra. The preprocessed data were then analysed using bespoke machine learning algorithms for data reduction and ion selection. Using the selected ions, an ANN-based predictive model was constructed to examine the predictive power of these ions for classification. Results: Our model identified 10 candidate marker ions for both data sets. These ion panels achieved over 90% classification accuracy on blind validation data. Receiver operating characteristic analysis was performed; the area under the curve for the melanoma and cord blood classifiers was 0.991 and 0.986, respectively. Conclusion: The results suggest that our data preprocessing technique removes unwanted characteristics of the raw data while preserving its predictive components. Ion identification analysis can be carried out on MALDI-TOF-MS data with the proposed preprocessing technique coupled with bespoke algorithms for data reduction and ion selection.
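    The peak-region identification step can be illustrated with a simple smoothing-plus-robust-threshold pass; the window size and the MAD-based threshold are our assumptions, not the paper's procedure.

```python
import numpy as np

def peak_regions(spectrum, window=5, k=3.0):
    """Simplified peak-region sketch: smooth the raw spectrum with a
    moving average, then report contiguous runs rising more than k
    (scaled) median-absolute-deviations above the median intensity."""
    s = np.convolve(spectrum, np.ones(window) / window, mode="same")
    med = np.median(s)
    mad = np.median(np.abs(s - med)) or 1.0
    above = s > med + k * 1.4826 * mad
    regions, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            regions.append((start, i))
            start = None
    if start is not None:
        regions.append((start, len(above)))
    return regions
```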

  20. Preprocessing and Quality Control Strategies for Illumina DASL Assay-Based Brain Gene Expression Studies with Semi-Degraded Samples.

    Science.gov (United States)

    Chow, Maggie L; Winn, Mary E; Li, Hai-Ri; April, Craig; Wynshaw-Boris, Anthony; Fan, Jian-Bing; Fu, Xiang-Dong; Courchesne, Eric; Schork, Nicholas J

    2012-01-01

    Available statistical preprocessing or quality control analysis tools for gene expression microarray datasets are known to greatly affect downstream data analysis, especially when degraded samples, unique tissue samples, or novel expression assays are used. It is therefore important to assess the validity and impact of the assumptions built in to preprocessing schemes for a dataset. We developed and assessed a data preprocessing strategy for use with the Illumina DASL-based gene expression assay with partially degraded postmortem prefrontal cortex samples. The samples were obtained from individuals with autism as part of an investigation of the pathogenic factors contributing to autism. Using statistical analysis methods and metrics such as those associated with multivariate distance matrix regression and mean inter-array correlation, we developed a DASL-based assay gene expression preprocessing pipeline to accommodate and detect problems with microarray-based gene expression values obtained with degraded brain samples. Key steps in the pipeline included outlier exclusion, data transformation and normalization, and batch effect and covariate corrections. Our goal was to produce a clean dataset for subsequent downstream differential expression analysis. We ultimately settled on available transformation and normalization algorithms in the R/Bioconductor package lumi based on an assessment of their use in various combinations. A log2-transformed, quantile-normalized, and batch and seizure-corrected procedure was likely the most appropriate for our data. We empirically tested different components of our proposed preprocessing strategy and believe that our results suggest that a preprocessing strategy that effectively identifies outliers, normalizes the data, and corrects for batch effects can be applied to all studies, even those pursued with degraded samples.
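    The quantile-normalization step in the pipeline has a compact matrix formulation; the sketch below is a generic version (not lumi's implementation), with ties broken by row order for brevity where production implementations average tied ranks.

```python
import numpy as np

def quantile_normalize(X):
    """Quantile normalisation with samples in columns: every sample is
    forced onto the same empirical distribution, namely the across-sample
    mean of each sorted rank.  Apply after log2 transformation."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # rank of each value
    mean_sorted = np.sort(X, axis=0).mean(axis=1)      # shared target distribution
    return mean_sorted[ranks]
```

    After normalization every column contains exactly the same set of values, just permuted by each sample's original ordering.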

  1. A Subsystem Test Bed for Chinese Spectral Radioheliograph

    Science.gov (United States)

    Zhao, An; Yan, Yihua; Wang, Wei

    2014-11-01

    The Chinese Spectral Radioheliograph (CSRH) is a solar-dedicated radio interferometric array that will produce images of the Sun with high spatial, temporal, and spectral resolution simultaneously in the decimetre and centimetre wave range. Digital processing of the intermediate frequency (IF) signal is an important part of a radio telescope. This paper describes a flexible, high-speed digital down conversion (DDC) system for the CSRH that applies complex mixing, parallel filtering, and decimation algorithms to the IF signal, and incorporates canonical signed digit coding and a bit-plane method to improve program efficiency. The DDC system is intended to be a subsystem test bed for simulation and testing for the CSRH. Software algorithms for simulation and FPGA-based hardware description language algorithms were written that use few hardware resources while achieving high performance, such as processing a high-speed (1 GHz) data flow with 10 MHz spectral resolution. An experiment with the test bed is illustrated using geostationary satellite data observed on March 20, 2014. Because the algorithms on the FPGA are easily altered, the data can be recomputed with different digital signal processing algorithms in order to select the optimum one.
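    The DDC chain named above (complex mixing, low-pass filtering, decimation) can be sketched offline with NumPy; the filter length, window, and sample rates are illustrative, not the CSRH design values, and a real FPGA implementation would use fixed-point polyphase structures instead.

```python
import numpy as np

def ddc(signal, fs, f_center, decim, ntaps=101):
    """Digital down conversion sketch: complex-mix the band of interest
    to baseband, low-pass filter with a windowed-sinc FIR, then decimate."""
    n = np.arange(len(signal))
    mixed = signal * np.exp(-2j * np.pi * f_center * n / fs)  # complex mixing
    cutoff = 0.5 / decim                       # new Nyquist, cycles/sample
    t = np.arange(ntaps) - (ntaps - 1) / 2
    h = np.sinc(2 * cutoff * t) * np.hamming(ntaps)           # windowed sinc
    h /= h.sum()                               # unity gain at DC
    filtered = np.convolve(mixed, h, mode="same")             # FIR low-pass
    return filtered[::decim]                                  # decimation
```

    Mixing a pure tone at the centre frequency down to baseband yields a nearly constant complex amplitude after filtering, a quick check that the chain is working.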

  2. Custom electronic subsystems for the laboratory telerobotic manipulator

    Science.gov (United States)

    Glassell, R. L.; Butler, P. L.; Rowe, J. C.; Zimmermann, S. D.

    1990-01-01

    The National Aeronautics and Space Administration (NASA) Space Station Program presents new opportunities for the application of telerobotic and robotic systems. The Laboratory Telerobotic Manipulator (LTM) is a highly advanced 7-degree-of-freedom (DOF) telerobotic/robotic manipulator. It was developed and built for the Automation Technology Branch at NASA's Langley Research Center (LaRC) for research use and to demonstrate ground-based telerobotic manipulator hardware and software systems for future NASA applications in the hazardous environment of space. The LTM manipulator uses an embedded wiring design, with all electronics, motor power, and control and communication cables passing through the pitch-yaw differential joints. This design requires the number of cables passing through the pitch-yaw joint to be kept to a minimum. To eliminate the cables needed to carry each pitch-yaw joint's sensor data to the VME control computers, a custom embedded electronics package was developed for each manipulator joint. The electronics package collects the joint's sensor data and sends it to the VME control computers over a fiber optic cable. The electronics package consists of five individual subsystems: the VME Link Processor, the Joint Processor and the Joint Processor power supply in the joint module, the fiber optic communications system, and the electronics and motor power cabling.

  3. Trajectory Optimization of Electric Aircraft Subject to Subsystem Thermal Constraints

    Science.gov (United States)

    Falck, Robert D.; Chin, Jeffrey C.; Schnulo, Sydney L.; Burt, Jonathan M.; Gray, Justin S.

    2017-01-01

    Electric aircraft pose a unique design challenge in that they lack a simple way to reject waste heat from the power train. While conventional aircraft reject most of their excess heat in the exhaust stream, for electric aircraft this is not an option. To examine the implications of this challenge on electric aircraft design and performance, we developed a model of the electric subsystems for the NASA X-57 electric testbed aircraft. We then coupled this model with a model of simple 2D aircraft dynamics and used a Legendre-Gauss-Lobatto collocation optimal control approach to find optimal trajectories for the aircraft with and without thermal constraints. The results show that the X-57 heat rejection systems are well designed for maximum-range and maximum-efficiency flight, without the need to deviate from an optimal trajectory. Stressing the thermal constraints by reducing the cooling capacity or requiring faster flight has a minimal impact on performance, as the trajectory optimization technique is able to find flight paths which honor the thermal constraints with relatively minor deviations from the nominal optimal trajectory.
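    As a small illustration of the collocation machinery named above (not the authors' implementation), the Legendre-Gauss-Lobatto nodes at which the trajectory dynamics are enforced can be computed from the roots of the derivative of a Legendre polynomial:

```python
import numpy as np
from numpy.polynomial import legendre

def lgl_nodes(n):
    """Legendre-Gauss-Lobatto collocation nodes on [-1, 1]: the two
    endpoints plus the roots of P'_{n-1}, the derivative of the
    Legendre polynomial of degree n - 1."""
    if n < 2:
        raise ValueError("need at least two nodes")
    c = np.zeros(n)
    c[-1] = 1.0                    # P_{n-1} in the Legendre basis
    interior = legendre.legroots(legendre.legder(c))
    return np.concatenate(([-1.0], np.sort(interior), [1.0]))
```

    In a collocation transcription, each flight-phase segment is mapped onto [-1, 1] and the states and controls are discretized at these nodes.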

  4. Preliminary design report for OTEC stationkeeping subsystems (SKSS)

    Energy Technology Data Exchange (ETDEWEB)

    1979-12-12

    Lockheed Ocean Systems, with IMODCO, prepared these preliminary designs for OTEC Stationkeeping Subsystems (SKSS) under contract to NOAA in support of the Department of Energy OTEC program. The results of Tasks III, V, and VI are presented in this design report. The report consists of five sections: introduction, preliminary design for the multiple anchor leg (MAL), preliminary design for the tension anchor leg (TAL), costs and schedule, and conclusions. Extensive appendixes provide detailed descriptions of the design methodology and include backup calculations and data to support the results presented. The objective of this effort is to complete the preliminary designs for the barge-MAL and Spar-TAL SKSS. A set of drawings is provided for each, showing arrangements, configuration, component details, engineering description, and deployment plan. Loads analysis, performance assessment, and sensitivity to requirements are presented, together with the methodology employed to analyze the systems and derive the results. Life-cycle costs and schedules are prepared and compared on a common basis. Finally, recommendations for the Commercial Plant SKSS are presented for both platform types.

  5. Upgrade of ESO's FIERA CCD Controller and PULPO Subsystem

    Science.gov (United States)

    Reyes-Moreno, J.; Geimer, C.; Balestra, A.; Haddad, N.

    An overview of FIERA is presented, with emphasis on its recent upgrade to PCI. The PCI board hosts two DSPs, one for real-time control of the camera and another for on-the-fly processing of the incoming video data. In addition, the board is able to make DMA transfers, synchronize with other identical boards, be synchronized by a TIM bus, and control PULPO via RS232. The design is based on the IOP480 chip from PLX, for which we have developed a device driver for both Solaris and Linux. One computer is able to host more than one board and can therefore control an array of FIERA detector electronics. PULPO is a multifunctional subsystem widely used at ESO for the housekeeping of CCD cryostat heads and for shutter control. The upgrade of PULPO is based on an embedded PC running Linux. The upgraded PULPO is able to handle 29 temperature sensors, control 8 heaters and one shutter, read out one vacuum sensor, and log any combination of parameters.

  6. The RAST Server: Rapid Annotations using Subsystems Technology

    Directory of Open Access Journals (Sweden)

    Overbeek Ross A

    2008-02-01

    Full Text Available Abstract Background The number of prokaryotic genome sequences becoming available is growing steadily, and faster than our ability to accurately annotate them. Description We describe a fully automated service for annotating bacterial and archaeal genomes. The service identifies protein-encoding, rRNA, and tRNA genes, assigns functions to the genes, predicts which subsystems are represented in the genome, uses this information to reconstruct the metabolic network, and makes the output easily downloadable for the user. In addition, the annotated genome can be browsed in an environment that supports comparative analysis with the annotated genomes maintained in the SEED environment. The service normally makes the annotated genome available within 12–24 hours of submission, but ultimately the quality of such a service will be judged in terms of the accuracy, consistency, and completeness of the produced annotations. We summarize our attempts to address these issues and discuss plans for incrementally enhancing the service. Conclusion By providing accurate, rapid annotation freely to the community we have created an important community resource. The service has now been used by over 120 external users to annotate over 350 distinct genomes.

  7. Advanced Space Suit Portable Life Support Subsystem Packaging Design

    Science.gov (United States)

    Howe, Robert; Diep, Chuong; Barnett, Bob; Thomas, Gretchen; Rouen, Michael; Kobus, Jack

    2006-01-01

    This paper discusses the Portable Life Support Subsystem (PLSS) packaging design work done by NASA and Hamilton Sundstrand in support of three future space missions: lunar, Mars, and zero-g. The goal is to seek ways to reduce the weight of PLSS packaging and, at the same time, to develop a packaging scheme that would make PLSS technology changes less costly than with current packaging methods. This study builds on the results of NASA's in-house 1998 study, which resulted in the "Flex PLSS" concept. For this study, the present EMU schematic (low Earth orbit) was used so that the work team could concentrate on the packaging. The Flex PLSS packaging is required to protect, connect, and hold the PLSS and its components together internally and externally, while providing internal access to PLSS components for maintenance and for technology change without extensive redesign impact. The goal of this study was twofold: 1. Bring the advanced space suit integrated Flex PLSS concept from its current state of development to a preliminary design level and build a proof-of-concept mockup of the proposed design; and 2. "Design" a design process that accommodates both the initial Flex PLSS design and the package modifications required to accommodate new technology.

  8. From Voids to Coma: the prevalence of pre-processing in the local Universe

    CERN Document Server

    Cybulski, Ryan; Fazio, Giovanni G; Gutermuth, Robert A

    2014-01-01

    We examine the effects of pre-processing across the Coma Supercluster, including 3505 galaxies over 500 square degrees, by quantifying the degree to which star-forming (SF) activity is quenched as a function of environment. We characterise environment using the complementary techniques of Voronoi Tessellation, to measure the density field, and the Minimal Spanning Tree, to define continuous structures. We thus measure SF activity as a function of local density and of the type of environment (cluster, group, filament, and void), and quantify the degree to which environment contributes to the quenching of SF activity. Our sample covers over two orders of magnitude in stellar mass (10^8.5 to 10^11 Msun), and consequently we trace the effects of environment on SF activity for both dwarf and massive galaxies, distinguishing so-called "mass quenching" from "environment quenching". Environmentally-driven quenching of SF activity, measured relative to the void galaxies, occurs to progressively greater degrees in filaments, groups, and...
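    The two environment metrics named above can be sketched with SciPy. This is a generic 2D illustration on synthetic points, not the authors' pipeline: local density is estimated from the inverse Voronoi-cell area, and continuous structures are traced by the Minimal Spanning Tree of the positions.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def voronoi_density(points):
    """Local density estimate: inverse area of each point's Voronoi cell.
    Open cells at the survey edge are assigned zero density here."""
    vor = Voronoi(points)
    dens = np.zeros(len(points))
    for i, ridx in enumerate(vor.point_region):
        region = vor.regions[ridx]
        if len(region) == 0 or -1 in region:
            continue  # unbounded cell at the edge
        # In 2D, ConvexHull.volume is the polygon area.
        dens[i] = 1.0 / ConvexHull(vor.vertices[region]).volume
    return dens

def mst_edges(points):
    """Edge lengths of the Minimal Spanning Tree linking the points
    into a single continuous structure."""
    mst = minimum_spanning_tree(squareform(pdist(points)))
    return mst.data  # n-1 edge lengths for n points

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(50, 2))
dens = voronoi_density(pts)
edges = mst_edges(pts)
```

    In practice, structures such as filaments and groups are then defined by cutting the MST at a chosen maximum edge length.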

  9. Joint preprocessor-based detector for cooperative networks with limited hardware processing capability

    KAUST Repository

    Abuzaid, Abdulrahman I.

    2015-02-01

    In this letter, a joint detector is proposed for cooperative communication networks in which the destination has limited hardware processing capability. The transmitter sends its symbols with the help of L relays. Because the destination's hardware is limited, only U out of the L signals are processed and the energy of the remaining relays is lost. To address this problem, a joint preprocessing-based detector is proposed that operates on the principle of minimizing the symbol error rate (SER). For a realistic assessment, pilot-symbol-aided channel estimation is incorporated into the proposed detector. Our simulations show that the proposed detector achieves the same SER performance as the maximum-likelihood (ML) detector with all participating relays. Additionally, for the same U, our detector outperforms selection combining (SC), the channel-shortening (CS) scheme, and reduced-rank techniques. The proposed scheme has low computational complexity.
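    A toy simulation of the underlying constraint, processing only U of the L relayed signals: here the U strongest links are maximum-ratio combined, which is a simple stand-in for, not a reproduction of, the letter's SER-minimizing preprocessor. Channel model and SNR values are illustrative assumptions.

```python
import numpy as np

def detect_bpsk_subset(y, h, U):
    """Process only the U strongest of L relayed observations, then
    maximum-ratio combine and take a hard BPSK decision.
    y: received samples (L,); h: known channel gains (L,)."""
    idx = np.argsort(np.abs(h))[-U:]           # U best relay links
    combined = np.sum(np.conj(h[idx]) * y[idx])
    return 1.0 if combined.real >= 0 else -1.0

# Monte-Carlo SER over Rayleigh-faded relay links.
rng = np.random.default_rng(1)
L, U, trials, snr = 6, 3, 2000, 10.0
errors = 0
for _ in range(trials):
    s = rng.choice([-1.0, 1.0])                             # BPSK symbol
    h = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2)
    n = (rng.normal(size=L) + 1j * rng.normal(size=L)) / np.sqrt(2 * snr)
    y = h * s + n
    errors += detect_bpsk_subset(y, h, U) != s
ser = errors / trials
```

    Even this simple subset rule achieves a low SER at moderate SNR, which is the regime in which the letter compares its joint preprocessor against SC and CS.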

  10. DUAL CHANNEL SPEECH ENHANCEMENT USING HADAMARD-LMS ALGORITHM WITH DCT PREPROCESSING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    D.DEEPA,

    2010-09-01

    Full Text Available Speech enhancement and noise reduction have wide applications in speech processing and are often employed as a pre-processing stage. Two requirements are often considered in signal de-noising: eliminating the undesired noise to improve the signal-to-noise ratio (SNR), and preserving the shape and characteristics of the original signal. Background noise in a speech signal reduces speech intelligibility for people with hearing loss, especially for patients with sensorineural loss. The proposed algorithm combines the Hadamard least-mean-square (LMS) algorithm with a DCT pre-processing technique to improve the SNR and to reduce the mean square error (MSE). The DCT is separable and has an energy-compaction property. Although the DCT does not separate frequencies, it is a powerful signal decorrelator. It is a real-valued transform and thus can be used effectively in real-time operation.
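    A minimal sketch of an LMS adaptive filter with a DCT pre-processing (decorrelation) stage, illustrating the general transform-domain technique rather than the paper's exact Hadamard-LMS formulation; the identification example and step size below are made up.

```python
import numpy as np
from scipy.fft import dct

def dct_lms(x, d, order=8, mu=0.05):
    """LMS adaptive filter with a DCT pre-processing stage: each tap
    vector is passed through an orthonormal DCT (a decorrelating
    transform) before filtering and the weight update."""
    w = np.zeros(order)
    y = np.zeros(len(x))
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]  # [x[n], x[n-1], ..., x[n-order+1]]
        u_t = dct(u, norm="ortho")        # decorrelate the tap vector
        y[n] = w @ u_t                    # filter output
        e = d[n] - y[n]                   # error against the desired signal
        w += mu * e * u_t                 # LMS update in the transform domain
    return y, w

# Identify a short FIR system: d[n] = 0.5*x[n] + 0.3*x[n-1].
rng = np.random.default_rng(2)
x = rng.normal(size=5000)
d = 0.5 * x + 0.3 * np.concatenate(([0.0], x[:-1]))
y, w = dct_lms(x, d)
mse_tail = np.mean((d[-500:] - y[-500:]) ** 2)
```

    Because the orthonormal transform preserves signal energy, the usual LMS step-size bound applies unchanged, while the decorrelation speeds convergence for coloured inputs.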

  11. Synthetic aperture radar image correlation by use of preprocessing for enhancement of scattering centers.

    Science.gov (United States)

    Khoury, J; Gianino, P D; Woods, C L

    2000-10-15

    We demonstrate that a significant improvement can be obtained in the recognition of complicated synthetic aperture radar images taken from the Moving and Stationary Target Acquisition and Recognition database. These images typically have a low number of scattering centers and high noise. We first preprocess the images, and the templates formed from them, so that their scattering centers are enhanced. Our technique yields high-quality performance under several correlation criteria. For realistic automatic target recognition systems, our approach should make it easy to implement optical recognition systems with binarized data for many different types of correlation filter, and should greatly facilitate feeding data-compressed (binarized) information into either digital or optical processors.

  12. [Sample preprocessing method for residual quinolones in honey using immunoaffinity resin].

    Science.gov (United States)

    Ihara, Yoshiharu; Kato, Mihoko; Kodaira, Tsukasa; Itoh, Shinji; Terakawa, Mika; Horie, Masakazu; Saito, Koichi; Nakazawa, Hiroyuki

    2009-06-01

    A sample preparation method was developed for determination of quinolones in honey using immunoaffinity resin. For this purpose, an immunoaffinity resin for quinolones was prepared by coupling a quinolone-specific monoclonal antibody to agarose resin. Honey samples diluted with phosphate buffer were reacted with immunoaffinity resin. After the resin was washed, quinolones were eluted with glycine-HCl. Quinolones in the eluate were determined by HPLC with fluorescence detection. No interfering peak was found on the chromatograms of honey samples. The recoveries of quinolones from samples were over 70% at fortification levels of 20 ng/g (for norfloxacin, ciprofloxacin and enrofloxacin) and 10 ng/g (for danofloxacin). The quantification limits of quinolones were 2 ng/g. This sample preprocessing method using immunoaffinity resin was found to be effective and suitable for determining residual quinolones in honey.

  13. Base resolution methylome profiling: considerations in platform selection, data preprocessing and analysis.

    Science.gov (United States)

    Sun, Zhifu; Cunningham, Julie; Slager, Susan; Kocher, Jean-Pierre

    2015-08-01

    Bisulfite-treatment-based methylation microarrays (mainly the Illumina 450K Infinium array) and next-generation sequencing (reduced representation bisulfite sequencing, Agilent SureSelect Human Methyl-Seq, NimbleGen SeqCap Epi CpGiant, or whole-genome bisulfite sequencing) are commonly used for base-resolution DNA methylome research. Although multiple tools and methods have been developed and used for data preprocessing and analysis, confusion remains about these platforms, including how and whether the 450K array should be normalized, which platform best fits a researcher's needs, and which statistical models are most appropriate for differential methylation analysis. This review presents the commonly used platforms and compares the pros and cons of each in methylome profiling. We then discuss approaches to study design, data normalization, bias correction, and model selection for differentially methylated individual CpGs and regions.

  14. Feasibility investigation of integrated optics Fourier transform devices. [holographic subtraction for multichannel data preprocessing

    Science.gov (United States)

    Verber, C. M.; Vahey, D. W.; Wood, V. E.; Kenan, R. P.; Hartman, N. F.

    1977-01-01

    The possibility of producing an integrated-optics data-processing device based on Fourier transformations or other parallel-processing techniques, and the ways in which such techniques may be used to upgrade the performance of present and projected NASA systems, were investigated. Activities toward this goal included: (1) production of near-diffraction-limited geodesic lenses in glass waveguides; (2) development of grinding and polishing techniques for the production of geodesic lenses in LiNbO3 waveguides; (3) development of a characterization technique for waveguide lenses; and (4) development of a theory for corrected aspheric geodesic lenses. A holographic subtraction system was devised which should be capable of rapid on-board preprocessing of a large number of parallel data channels. The principle involved is validated in three demonstrations.

  15. Data preprocessing method for fluorescence molecular tomography using a priori information provided by CT.

    Science.gov (United States)

    Fu, Jianwei; Yang, Xiaoquan; Meng, Yuanzheng; Luo, Qingming; Gong, Hui

    2012-01-01

    The combined system of micro-CT and fluorescence molecular tomography (FMT) offers a new tool that provides anatomical and functional information on small animals in a single study. To take advantage of the combined system, a data preprocessing method is proposed that extracts the valid data for FMT reconstruction algorithms using a priori information provided by CT. The boundary information of the animal and the animal holder is extracted from the reconstructed CT volume data. A ray-tracing method is used to trace the path of the excitation beam, calculate the locations and directions of the candidate sources, and determine whether those sources are valid. To accurately calculate the projections of the detectors on the optical images and judge their validity, a combination of perspective projection and inverse ray tracing is adopted for optimal performance. The imaging performance of the combined system with the presented method is validated through experimental rat imaging.

  16. Analyzing ChIP-seq data: preprocessing, normalization, differential identification, and binding pattern characterization.

    Science.gov (United States)

    Taslim, Cenny; Huang, Kun; Huang, Tim; Lin, Shili

    2012-01-01

    Chromatin immunoprecipitation followed by sequencing (ChIP-seq) is a high-throughput antibody-based method to study genome-wide protein-DNA binding interactions. ChIP-seq technology allows scientists to obtain more accurate data, providing genome-wide coverage with less starting material and in a shorter time than older ChIP-chip experiments. Herein we describe a step-by-step guideline for analyzing ChIP-seq data, including data preprocessing, nonlinear normalization to enable comparison between different samples and experiments, a statistical method to identify differential binding sites using mixture modeling and local false discovery rates (fdrs), and binding-pattern characterization. In addition, we provide a sample analysis of ChIP-seq data following the steps in the guideline.

  17. Data pre-processing for quantification in tomography and radiography with a digital flat panel detector

    Science.gov (United States)

    Rinkel, Jean; Gerfault, Laurent; Estève, François; Dinten, Jean-Marc

    2006-03-01

    In order to obtain accurate quantitative results, flat-panel detectors require specific calibration and correction of the acquisitions. The main artefacts are due to bad pixels, variations in photodiode characteristics, and inhomogeneity of the X-ray sensitivity of the scintillator layer. Other limitations for quantification are the non-linearity of the detector, due to charge trapping in the transistors, and the scattering generated inside the detector, called detector scattering. Based on physical models of artefact generation, this paper presents a unified framework for the calibration and correction of these artefacts, together with specific correction algorithms. A new method for correcting deviations from linearity is based on the comparison between experimental and simulated data. Detector scattering is corrected in two steps: off-line characterization of the detector scattering, modelling its spatial distribution as a convolution, and on-line correction based on a deconvolution approach. Radiographic results on an anthropomorphic thorax phantom, imaged with a flat-panel detector that converts X-rays into visible light using a scintillator coupled to an amorphous-silicon transistor array for photon-to-electron conversion, demonstrate that experimental X-ray attenuation images are significantly improved, both qualitatively and quantitatively, by applying the non-linearity and detector-scattering corrections. Tomographic reconstructions from pre-processed acquisitions of the phantom agree very well with the attenuation coefficients obtained with a multi-slice CT scanner. This paper thus demonstrates the efficiency of the proposed pre-processing for accurate quantification in radiography and tomography.
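    The two-step detector-scattering correction, a convolution model characterized off-line and a deconvolution applied on-line, can be illustrated with a regularized Fourier-domain deconvolution. The kernel shape and regularization below are illustrative assumptions, not the paper's calibrated model.

```python
import numpy as np

def correct_detector_scatter(image, kernel, eps=1e-3):
    """Remove detector scattering modelled as a (circular) convolution
    with a known kernel, via regularized Fourier-domain deconvolution."""
    ker_f = np.fft.rfft2(kernel, s=image.shape)
    img_f = np.fft.rfft2(image)
    # Wiener-style regularization avoids dividing by near-zero frequencies.
    out_f = img_f * np.conj(ker_f) / (np.abs(ker_f) ** 2 + eps)
    return np.fft.irfft2(out_f, s=image.shape)

# Toy off-line characterization: a delta (direct signal) plus a broad,
# weak Gaussian scatter tail, built with wrapped distances so that the
# forward model is a circular convolution.
n = 64
yy, xx = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
r2 = np.minimum(yy, n - yy) ** 2 + np.minimum(xx, n - xx) ** 2
kernel = np.zeros((n, n))
kernel[0, 0] = 1.0
kernel += 0.002 * np.exp(-r2 / (2.0 * 8.0 ** 2))
kernel /= kernel.sum()

truth = np.zeros((n, n))
truth[20:40, 20:40] = 1.0
blurred = np.fft.irfft2(np.fft.rfft2(truth) * np.fft.rfft2(kernel),
                        s=truth.shape)
restored = correct_detector_scatter(blurred, kernel)
```

    The regularization constant trades residual scatter against noise amplification; in a calibrated system it would be chosen from the measured noise level.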

  18. Pre-processing ambient noise cross-correlations with equalizing the covariance matrix eigenspectrum

    Science.gov (United States)

    Seydoux, Léonard; de Rosny, Julien; Shapiro, Nikolai M.

    2017-09-01

    Passive imaging techniques based on ambient seismic noise require a nearly isotropic distribution of the noise sources in order to ensure reliable traveltime measurements between seismic stations. However, real ambient seismic noise often only partially fulfils this condition: it is generated in preferential areas (in the deep ocean or near continental shores), and highly coherent pulse-like signals, such as those generated by earthquakes, may be present in the data. Several pre-processing techniques have been developed to attenuate the directional and deterministic behaviour of real ambient noise. Most of them are applied to individual seismograms before the cross-correlations are computed; the most widely used are spectral whitening and temporal smoothing of the individual seismic traces. We here propose an additional pre-processing step, to be used together with the classical ones, based on spatial analysis of the seismic wavefield. We compute the cross-spectra between all available station pairs in the spectral domain, yielding the data covariance matrix, and apply a one-bit normalization to the covariance-matrix eigenspectrum before extracting the cross-correlations in the time domain. The efficiency of the method is shown with several numerical tests. We apply the method to data collected by the USArray around the time of the M8.8 Maule earthquake of 2010 February 27. Compared with classical equalization, the method clearly improves the attenuation of the highly energetic, coherent waves incoming from the earthquake, and allows reliable traveltime measurements even in the presence of the earthquake.
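    The eigenspectrum-equalization step can be sketched with NumPy. Note that flattening all eigenvalues of a full-rank matrix yields the identity; in practice the covariance estimate is effectively rank-deficient (few averaged sub-windows) and equalized matrices are averaged over successive time windows, so the sketch below, with an assumed rank, illustrates only the eigenvalue flattening itself.

```python
import numpy as np

def equalize_eigenspectrum(cov, rank):
    """One-bit normalization of the covariance-matrix eigenspectrum:
    keep the eigenvectors (the wavefield geometry) and flatten the
    eigenvalues of the signal subspace to unity, suppressing dominant
    coherent sources such as earthquakes."""
    vals, vecs = np.linalg.eigh(cov)   # ascending eigenvalues
    top = vecs[:, -rank:]              # signal-subspace eigenvectors
    return top @ top.conj().T          # eigenvalues flattened to 1

# Toy 8-station covariance: one very strong coherent arrival plus a
# weaker one; after equalization both contribute equally.
rng = np.random.default_rng(3)
a = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, 8))
b = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, 8))
cov = 100.0 * np.outer(a, a.conj()) + np.outer(b, b.conj())
eq = equalize_eigenspectrum(cov, rank=2)
```

    The equalized matrix is a projector onto the signal subspace: the factor-of-100 imbalance between the two arrivals is removed while their spatial signatures are preserved.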

  19. Image pre-processing method for near-wall PIV measurements over moving curved interfaces

    Science.gov (United States)

    Jia, L. C.; Zhu, Y. D.; Jia, Y. X.; Yuan, H. J.; Lee, C. B.

    2017-03-01

    PIV measurements near a moving interface are always difficult. This paper presents a PIV image pre-processing method that returns high-spatial-resolution velocity profiles near the interface. Instead of re-shaping or re-orienting the interrogation windows, interface tracking and an image transformation are used to stretch the particle-image strips near a curved interface into rectangles. Adaptive structured interrogation windows can then be arranged at specified distances from the interface. Synthetic particles are also added to the solid region to minimize interfacial effects and to confine particles to their respective sides of the interface. Since high spatial resolution is required only in regions of high velocity gradient, adaptive meshing and stretching of the image strips in the normal direction are used to improve the cross-correlation signal-to-noise ratio (SN) by reducing the velocity difference and the particle-image distortion within the interrogation window. A two-dimensional Gaussian fit is used to compensate for the effects of stretching the particle images. The working hypothesis is that fluid motion near the interface is 'quasi-tangential flow', which is reasonable in most fluid-structure interaction scenarios. The method was validated against the window-deformation iterative multi-grid scheme (WIDIM) using synthetic image pairs with different velocity profiles, and was tested on measurements of a supersonic turbulent boundary layer on a flat plate, near a rotating blade, and near a flexible flapping flag. The method provides higher spatial resolution than conventional WIDIM and good robustness for measuring velocity profiles near moving interfaces.

  20. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed, with the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct, and control of the data-processing procedure, with the following capabilities: • describe a data-processing task, including: 1) define input/output data, 2) define the data relationships, 3) define the sequence of tasks, 4) define the communication between tasks, 5) define mathematical formulas, 6) define the relationship between tasks and data; • process tasks automatically. Accordingly, describing a task is the key to the system's flexibility. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) the data relationships are established through a product tree; 2) the process model is constructed as a directed acyclic graph (DAG), in which a set of workflow constructs, including Sequence, Loop, Merge, and Fork, can be composed with one another; 3) to reduce the complexity of modelling mathematical formulas with a DAG, semantic modelling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
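    Automatic execution of tasks ordered by a DAG, as in the process model above, can be sketched with a topological sort (Kahn's algorithm); the task names and dependencies below are hypothetical, not the CEDPS pipeline.

```python
from collections import deque

def run_workflow(tasks, deps):
    """Run a DAG of processing tasks in dependency order using Kahn's
    topological sort. tasks maps name -> callable; deps maps
    name -> list of prerequisite task names."""
    indeg = {name: len(deps.get(name, [])) for name in tasks}
    children = {name: [] for name in tasks}
    for name, prereqs in deps.items():
        for p in prereqs:
            children[p].append(name)
    ready = deque(name for name, d in indeg.items() if d == 0)
    order = []
    while ready:
        name = ready.popleft()
        tasks[name]()                      # execute the task
        order.append(name)
        for child in children[name]:
            indeg[child] -= 1
            if indeg[child] == 0:
                ready.append(child)
    if len(order) != len(tasks):
        raise ValueError("cycle detected: the task graph is not a DAG")
    return order

# Hypothetical pipeline: ingest -> calibrate -> mosaic -> archive.
log = []
tasks = {name: (lambda n=name: log.append(n))
         for name in ["ingest", "calibrate", "mosaic", "archive"]}
deps = {"calibrate": ["ingest"],
        "mosaic": ["calibrate"],
        "archive": ["mosaic", "calibrate"]}
order = run_workflow(tasks, deps)
```

    Constructs such as Merge and Fork correspond to nodes with multiple prerequisites or multiple children; the same scheduler handles both.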