WorldWideScience

Sample records for based model-free analysis

  1. Model-free linkage analysis of a binary trait.

    Science.gov (United States)

    Xu, Wei; Bull, Shelley B; Mirea, Lucia; Greenwood, Celia M T

    2012-01-01

    Genetic linkage analysis aims to detect chromosomal regions containing genes that influence risk of specific inherited diseases. The presence of linkage is indicated when a disease or trait cosegregates through the families with genetic markers at a particular region of the genome. Two main types of genetic linkage analysis are in common use, namely model-based linkage analysis and model-free linkage analysis. In this chapter, we focus solely on the latter type and specifically on binary traits or phenotypes, such as the presence or absence of a specific disease. Model-free linkage analysis is based on allele-sharing, where patterns of genetic similarity among affected relatives are compared to chance expectations. Because the model-free methods do not require the specification of the inheritance parameters of a genetic model, they are preferred by many researchers at early stages in the study of a complex disease. We introduce the history of model-free linkage analysis in Subheading 1. Table 1 describes a standard model-free linkage analysis workflow. We describe three popular model-free linkage analysis methods, the nonparametric linkage (NPL) statistic, the affected sib-pair (ASP) likelihood ratio test, and a likelihood approach for pedigrees. The theory behind each linkage test is described in this section, together with a simple example of the relevant calculations. Table 4 provides a summary of popular genetic analysis software packages that implement model-free linkage models. In Subheading 2, we work through the methods on a rich example providing sample software code and output. Subheading 3 contains notes with additional details on various topics that may need further consideration during analysis.
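    To make the allele-sharing idea concrete, the sketch below implements the classical affected sib-pair (ASP) mean test, in which the average proportion of alleles shared identical-by-descent across pairs is compared with its null expectation of 0.5. This is a generic Python illustration, not code from the chapter; the toy data and function name are assumptions.

```python
# Hypothetical illustration of the affected sib-pair (ASP) "mean test":
# under no linkage, an ASP is expected to share 50% of alleles IBD at a locus.
import math

def asp_mean_test(ibd_proportions):
    """ibd_proportions: estimated IBD-sharing proportion (0, 0.5 or 1)
    for each affected sib pair at the test locus."""
    n = len(ibd_proportions)
    mean_share = sum(ibd_proportions) / n
    # Variance of the sharing proportion per pair under the null is 1/8
    # (a pair shares 0, 1 or 2 alleles with probabilities 1/4, 1/2, 1/4).
    se = math.sqrt((1 / 8) / n)
    z = (mean_share - 0.5) / se
    return mean_share, z

pairs = [0.5, 1.0, 0.5, 1.0, 1.0, 0.5, 0.0, 1.0]  # toy data
print(asp_mean_test(pairs))
```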

  2. XPC Ala499Val and XPG Asp1104His polymorphisms and digestive system cancer risk: a meta-analysis based on model-free approach.

    Science.gov (United States)

    Yu, Guangsheng; Wang, Jianlu; Dong, Jiahong; Liu, Jun

    2015-01-01

    Many studies have reported the association between the XPC Ala499Val and XPG Asp1104His polymorphisms and digestive system cancer susceptibility, but the results have been inconclusive. We performed a meta-analysis, using a comprehensive strategy based on the allele model and a model-free approach, to derive a more precise estimate of the relationship between the XPC Ala499Val and XPG Asp1104His polymorphisms and digestive system cancer risk. For XPC Ala499Val, no significant cancer risk was found in the allele model (OR = 0.98, 95% CI: 0.86-1.11) or with the model-free approach (ORG = 0.97, 95% CI: 0.83-1.13). For XPG Asp1104His, there was also no association between this polymorphism and cancer risk in the allele model (OR = 1.03, 95% CI: 0.96-1.11) or with the model-free approach (ORG = 1.04, 95% CI: 0.95-1.14). This meta-analysis therefore suggests that the XPC Ala499Val and XPG Asp1104His polymorphisms are not associated with digestive system cancer risk. Further large and well-designed studies are needed to confirm these findings.
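    For readers who want to reproduce this kind of pooled estimate, the following is a minimal sketch of fixed-effect, inverse-variance pooling of allele-model odds ratios. It is a generic illustration, not the authors' code, and the toy counts are invented.

```python
# Hedged sketch (not the authors' code): fixed-effect inverse-variance pooling
# of per-study allele-model odds ratios, as used in a typical meta-analysis.
import math

def pooled_or(studies):
    """studies: list of (case_variant, case_wild, control_variant, control_wild)
    allele counts; returns the pooled OR and its 95% CI."""
    num, den = 0.0, 0.0
    for a, b, c, d in studies:
        log_or = math.log((a * d) / (b * c))
        var = 1 / a + 1 / b + 1 / c + 1 / d          # Woolf variance of log OR
        w = 1 / var
        num += w * log_or
        den += w
    pooled = num / den
    se = math.sqrt(1 / den)
    ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
    return math.exp(pooled), ci

print(pooled_or([(120, 280, 100, 300), (90, 210, 95, 205)]))  # toy counts
```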

  3. Model-free data analysis for source separation based on Non-Negative Matrix Factorization and k-means clustering (NMFk)

    Science.gov (United States)

    Vesselinov, V. V.; Alexandrov, B.

    2014-12-01

    The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. The source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones are recording the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones records a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site. We identify the dominant sources as barometric-pressure and water-supply pumping effects and estimate their impacts. We also estimate the...
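    A rough sketch of an NMFk-style workflow is given below: repeated NMF runs for each candidate number of sources, k-means clustering of the recovered source signatures, and selection of the source count whose clusters are most stable. This is an illustration assuming scikit-learn and a silhouette-based stability proxy; it is not the LANL implementation.

```python
# Illustrative sketch of an NMFk-style workflow (not the LANL implementation):
# run NMF many times per candidate source count r, cluster the recovered
# source signatures with k-means, and prefer the r with the most stable clusters.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def nmfk(X, r_values, n_runs=20, seed=0):
    """X: (observations x time) non-negative mixed signals;
    r_values: candidate numbers of sources (each >= 2)."""
    rng = np.random.RandomState(seed)
    scores = {}
    for r in r_values:
        sources = []
        for _ in range(n_runs):
            model = NMF(n_components=r, init="random",
                        random_state=rng.randint(10 ** 6), max_iter=500)
            W = model.fit_transform(X)          # mixing matrix (unused here)
            H = model.components_               # candidate source signals
            sources.append(H / np.linalg.norm(H, axis=1, keepdims=True))
        S = np.vstack(sources)                  # (n_runs*r, time)
        labels = KMeans(n_clusters=r, n_init=10,
                        random_state=seed).fit_predict(S)
        scores[r] = silhouette_score(S, labels) # cluster-stability proxy
    return max(scores, key=scores.get), scores
```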

  4. A novel model-free data analysis technique based on clustering in a mutual information space: application to resting-state fMRI

    Directory of Open Access Journals (Sweden)

    Simon Benjaminsson

    2010-08-01

    Full Text Available Non-parametric data-driven analysis techniques can be used to study datasets with few assumptions about the data and underlying experiment. Variations of Independent Component Analysis (ICA) have been the methods mostly used on fMRI data, e.g. in finding resting-state networks thought to reflect the connectivity of the brain. Here we present a novel data analysis technique and demonstrate it on resting-state fMRI data. It is a generic method with few underlying assumptions about the data. The results are built from the statistical relations between all input voxels, resulting in a whole-brain analysis on a voxel level. It has good scalability properties and the parallel implementation is capable of handling large datasets and databases. From the mutual information between the activities of the voxels over time, a distance matrix is created for all voxels in the input space. Multidimensional scaling is used to put the voxels in a lower-dimensional space reflecting the dependency relations based on the distance matrix. By performing clustering in this space we can find the strong statistical regularities in the data, which for the resting-state data turn out to be the resting-state networks. The decomposition is performed in the last step of the algorithm and is computationally simple. This opens up rapid analysis and visualization of the data on different spatial levels, as well as automatically finding a suitable number of decomposition components.
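    The pipeline described above (mutual information between voxel time series, a derived distance matrix, multidimensional scaling, then clustering) can be sketched as follows. The binning, embedding dimension and cluster count are illustrative assumptions rather than the authors' settings.

```python
# A minimal sketch of the pipeline described above (mutual information ->
# distance matrix -> multidimensional scaling -> clustering); binning choice
# and cluster count are illustrative assumptions, not the authors' settings.
import numpy as np
from sklearn.metrics import normalized_mutual_info_score
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

def cluster_voxels(ts, n_bins=16, n_dims=5, n_clusters=8, seed=0):
    """ts: (n_voxels x n_timepoints) array of voxel time series."""
    binned = np.array([np.digitize(x, np.histogram_bin_edges(x, n_bins))
                       for x in ts])
    n = len(binned)
    dist = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            nmi = normalized_mutual_info_score(binned[i], binned[j])
            dist[i, j] = dist[j, i] = 1.0 - nmi       # dependency -> distance
    embedding = MDS(n_components=n_dims, dissimilarity="precomputed",
                    random_state=seed).fit_transform(dist)
    return KMeans(n_clusters=n_clusters, n_init=10,
                  random_state=seed).fit_predict(embedding)
```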

  5. The effect of polymer matrices on the thermal hazard properties of RDX-based PBXs by using model-free and combined kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Qi-Long, E-mail: terry.well@163.com [Institute of Energetic Materials, Faculty of Chemical Technology, University of Pardubice, 53210 Pardubice (Czech Republic); Instituto de Ciencia de Materiales de Sevilla, CSIC-Universidad de Sevilla, C. Américo Vespucio No. 49, 41092 Sevilla (Spain); Zeman, Svatopluk, E-mail: svatopluk.zeman@upce.cz [Institute of Energetic Materials, Faculty of Chemical Technology, University of Pardubice, 53210 Pardubice (Czech Republic); Sánchez Jiménez, P.E. [Instituto de Ciencia de Materiales de Sevilla, CSIC-Universidad de Sevilla, C. Américo Vespucio No. 49, 41092 Sevilla (Spain); Zhao, Feng-Qi [Science and Technology on Combustion and Explosion Laboratory, Xi’an Modern Chemistry Research Institute, 710065 Xi’an (China); Pérez-Maqueda, L.A. [Instituto de Ciencia de Materiales de Sevilla, CSIC-Universidad de Sevilla, C. Américo Vespucio No. 49, 41092 Sevilla (Spain); Málek, Jiří [Department of Physical Chemistry, Faculty of Chemical Technology, University of Pardubice, 53210 Pardubice (Czech Republic)

    2014-04-01

    Highlights: • Nonisothermal decomposition kinetics of RDX and its PBXs have been investigated. • The kinetic models are determined by both master plot and combined kinetic analysis methods. • The constant rate temperature profiles and isothermal curves are predicted from the obtained kinetic triplets. • The storage safety parameters are simulated based on thermal explosion theory. - Abstract: In this paper, the decomposition reaction models and thermal hazard properties of 1,3,5-trinitro-1,3,5-triazinane (RDX) and its PBXs bonded by Formex P1, Semtex 1A, C4, Viton A and Fluorel polymer matrices have been investigated based on isoconversional and combined kinetic analysis methods. The established kinetic triplets are used to predict the constant decomposition rate temperature profiles, the critical radius for thermal explosion and the isothermal behavior at a temperature of 82 °C. It has been found that the effect of the polymer matrices on the decomposition mechanism of RDX is significant, resulting in very different reaction models. Formex P1, Semtex and C4 cause the decomposition process of RDX to follow a phase-boundary-controlled reaction mechanism, whereas Viton A and Fluorel shift its reaction model to a two-dimensional Avrami–Erofeev nucleation and growth model. According to the isothermal simulations, the threshold cook-off time until loss of functionality at 82 °C for RDX-C4 and RDX-FM is less than 500 days, while it is more than 700 days for the others. Unlike the simulated isothermal curves, when considering the charge properties and heat of decomposition, RDX-FM and RDX-C4 are better than RDX-SE in storage safety at an arbitrary surrounding temperature.
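    As a reference for the isoconversional (model-free) step mentioned in this and the following record, here is a minimal sketch of the Friedman method: at each fixed conversion, ln(dα/dt) is regressed against 1/T across heating rates and the slope gives -Ea/R. The data layout is an assumption of this sketch, not the authors' code.

```python
# Hedged illustration of the Friedman isoconversional (model-free) method:
# at each fixed conversion, regress ln(d(alpha)/dt) against 1/T across heating
# rates; the slope equals -Ea/R. The run layout is an assumption of this sketch.
import numpy as np

R = 8.314  # J/(mol K)

def friedman_ea(runs, alphas):
    """runs: list of dicts with keys 'alpha' (ascending), 'T' (K) and
    'rate' (d(alpha)/dt, 1/s), one dict per heating rate.
    alphas: conversion levels at which Ea is evaluated."""
    ea = []
    for a in alphas:
        inv_T, ln_rate = [], []
        for run in runs:
            T = np.interp(a, run["alpha"], run["T"])
            r = np.interp(a, run["alpha"], run["rate"])
            inv_T.append(1.0 / T)
            ln_rate.append(np.log(r))
        slope, _ = np.polyfit(inv_T, ln_rate, 1)
        ea.append(-slope * R)        # activation energy in J/mol
    return np.array(ea)
```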

  6. The effect of polymer matrices on the thermal hazard properties of RDX-based PBXs by using model-free and combined kinetic analysis.

    Science.gov (United States)

    Yan, Qi-Long; Zeman, Svatopluk; Sánchez Jiménez, P E; Zhao, Feng-Qi; Pérez-Maqueda, L A; Málek, Jiří

    2014-04-30

    In this paper, the decomposition reaction models and thermal hazard properties of 1,3,5-trinitro-1,3,5-triazinane (RDX) and its PBXs bonded by Formex P1, Semtex 1A, C4, Viton A and Fluorel polymer matrices have been investigated based on isoconversional and combined kinetic analysis methods. The established kinetic triplets are used to predict the constant decomposition rate temperature profiles, the critical radius for thermal explosion and the isothermal behavior at a temperature of 82°C. It has been found that the effect of the polymer matrices on the decomposition mechanism of RDX is significant, resulting in very different reaction models. Formex P1, Semtex and C4 cause the decomposition process of RDX to follow a phase-boundary-controlled reaction mechanism, whereas Viton A and Fluorel shift its reaction model to a two-dimensional Avrami-Erofeev nucleation and growth model. According to the isothermal simulations, the threshold cook-off time until loss of functionality at 82°C for RDX-C4 and RDX-FM is less than 500 days, while it is more than 700 days for the others. Unlike the simulated isothermal curves, when considering the charge properties and heat of decomposition, RDX-FM and RDX-C4 are better than RDX-SE in storage safety at an arbitrary surrounding temperature. PMID:24657941

  7. Pitch control of wind turbines using model free adaptive control based on wind turbine code

    DEFF Research Database (Denmark)

    Zhang, Yunqian; Chen, Zhe; Cheng, Ming;

    2011-01-01

    As the wind turbine is a nonlinear high-order system, to achieve good pitch control performance, a model free adaptive control (MFAC) approach, which doesn't need the mathematical model of the wind turbine, is adopted in the pitch control system in this paper. A pseudo gradient vector whose estimation value is only based on I/O data of the wind turbine is identified, and then the wind turbine system is replaced by a dynamic linear time-varying model. In order to verify the correctness and robustness of the proposed model free adaptive pitch controller, the wind turbine code FAST, which can predict the...
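    For orientation, a compact-form MFAC update of the kind referred to above can be sketched as follows: the pseudo gradient is re-estimated from input/output increments at every step and then used in the control law. The gains and reset logic are illustrative assumptions, not the controller tuned against FAST in the paper.

```python
# Rough sketch of a compact-form model-free adaptive control (MFAC) update,
# as commonly written in the MFAC literature; gains and thresholds below are
# illustrative assumptions, not the values reported in the paper.
def mfac_step(y, y_prev, y_ref_next, u_prev, du_prev, phi_prev,
              eta=0.5, mu=1.0, rho=0.5, lam=1.0, eps=1e-5):
    """One control step: update the pseudo-gradient estimate phi from I/O data,
    then compute the new control input u."""
    dy = y - y_prev
    # Projection-type estimation of the pseudo gradient (time-varying "model")
    phi = phi_prev + eta * du_prev / (mu + du_prev ** 2) * (dy - phi_prev * du_prev)
    if abs(phi) < eps:                # reset to keep the estimate well-behaved
        phi = eps
    # Control law from the dynamic linearized model y(k+1) = y(k) + phi*du(k)
    u = u_prev + rho * phi / (lam + phi ** 2) * (y_ref_next - y)
    return u, phi
```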

  8. Neural computations underlying arbitration between model-based and model-free learning.

    Science.gov (United States)

    Lee, Sang Wan; Shimojo, Shinsuke; O'Doherty, John P

    2014-02-01

    There is accumulating neural evidence to support the existence of two distinct systems for guiding action selection, a deliberative "model-based" and a reflexive "model-free" system. However, little is known about how the brain determines which of these systems controls behavior at one moment in time. We provide evidence for an arbitration mechanism that allocates the degree of control over behavior by model-based and model-free systems as a function of the reliability of their respective predictions. We show that the inferior lateral prefrontal and frontopolar cortex encode both reliability signals and the output of a comparison between those signals, implicating these regions in the arbitration process. Moreover, connectivity between these regions and model-free valuation areas is negatively modulated by the degree of model-based control in the arbitrator, suggesting that arbitration may work through modulation of the model-free valuation system when the arbitrator deems that the model-based system should drive behavior.

  9. Gaze data reveal distinct choice processes underlying model-based and model-free reinforcement learning.

    Science.gov (United States)

    Konovalov, Arkady; Krajbich, Ian

    2016-01-01

    Organisms appear to learn and make decisions using different strategies known as model-free and model-based learning; the former is mere reinforcement of previously rewarded actions and the latter is a forward-looking strategy that involves evaluation of action-state transition probabilities. Prior work has used neural data to argue that both model-based and model-free learners implement a value comparison process at trial onset, but model-based learners assign more weight to forward-looking computations. Here using eye-tracking, we report evidence for a different interpretation of prior results: model-based subjects make their choices prior to trial onset. In contrast, model-free subjects tend to ignore model-based aspects of the task and instead seem to treat the decision problem as a simple comparison process between two differentially valued items, consistent with previous work on sequential-sampling models of decision making. These findings illustrate a problem with assuming that experimental subjects make their decisions at the same prescribed time. PMID:27511383

  10. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    Directory of Open Access Journals (Sweden)

    Anya eSkatova

    2013-09-01

    Full Text Available Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, versus another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans’ scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes.
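    The balance between the two strategies in such sequential two-stage tasks is commonly quantified with a hybrid learner whose first-stage values mix model-based and model-free estimates through a weight w. The toy sketch below illustrates that idea; the transition matrix, parameters and function names are assumptions, not the fitted model from this study.

```python
# Toy sketch of a hybrid learner of the kind typically fit to two-stage tasks:
# first-stage values mix a model-based estimate, computed from assumed
# transition probabilities, with a model-free (TD) estimate via a weight w.
import numpy as np

P = np.array([[0.7, 0.3],     # assumed P(second-stage state | first-stage action)
              [0.3, 0.7]])

def first_stage_values(q_mf1, q2, w):
    """q_mf1: model-free values of the two first-stage actions;
    q2: (2 states x 2 actions) second-stage values; w: model-based weight."""
    q_mb1 = P @ q2.max(axis=1)          # model-based: expected best outcome
    return w * q_mb1 + (1 - w) * q_mf1  # hybrid value used for choice

def td_update(q, reward, alpha=0.1):
    """Simple model-free update of a value toward the received reward."""
    return q + alpha * (reward - q)
```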

  11. Model free approach to kinetic analysis of real-time hyperpolarized 13C magnetic resonance spectroscopy data.

    Directory of Open Access Journals (Sweden)

    Deborah K Hill

    Full Text Available Real-time detection of the rates of metabolic flux, or exchange rates of endogenous enzymatic reactions, is now feasible in biological systems using Dynamic Nuclear Polarization Magnetic Resonance. Derivation of reaction rate kinetics from this technique typically requires multi-compartmental modeling of dynamic data, and results are therefore model-dependent and prone to misinterpretation. We present a model-free formulism based on the ratio of total areas under the curve (AUC) of the injected and product metabolite, for example pyruvate and lactate. A theoretical framework to support this novel analysis approach is described, and demonstrates that the AUC ratio is proportional to the forward rate constant k. We show that the model-free approach strongly correlates with k for whole cell in vitro experiments across a range of cancer cell lines, and detects response in cells treated with the pan-class I PI3K inhibitor GDC-0941 with comparable or greater sensitivity. The same result is seen in vivo with tumor xenograft-bearing mice, in control tumors and following drug treatment with dichloroacetate. An important finding is that the area under the curve is independent of both the input function and of any other metabolic pathways arising from the injected metabolite. This model-free approach provides a robust and clinically relevant alternative to kinetic model-based rate measurements in the clinical translation of hyperpolarized (13)C metabolic imaging in humans, where measurement of the input function can be problematic.
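    The AUC-ratio readout itself reduces to two numerical integrations, as in the sketch below; the variable names and the use of trapezoidal integration are assumptions of this illustration, not the authors' code.

```python
# Minimal sketch of the model-free AUC-ratio readout described above: the
# ratio of the product (e.g. lactate) to substrate (e.g. pyruvate) signal
# areas, integrated over the acquisition window. Variable names are assumptions.
import numpy as np

def auc_ratio(t, product_signal, substrate_signal):
    """t: acquisition times; signals: dynamic metabolite intensities."""
    auc_product = np.trapz(product_signal, t)
    auc_substrate = np.trapz(substrate_signal, t)
    return auc_product / auc_substrate   # proportional to the forward rate constant
```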

  12. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution

  13. Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems

    Directory of Open Access Journals (Sweden)

    Zhe Dong

    2016-01-01

    Full Text Available The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to its small size, the MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the multi-modular operation scheme, the inherent safety feature of the MHTGRs can be applicable to large nuclear plants of any desired power rating. The MHTGR-based nuclear steam supplying system (NSSS) is constituted by an MHTGR, a side-by-side arranged helical-coil once-through steam generator (OTSG) and some connecting pipes. Due to the side-by-side arrangement, there is a tight coupling effect between the MHTGR and the OTSG. Moreover, parameter perturbations of the NSSS are always present. Thus, it is meaningful to study the model-free coordinated control of MHTGR-based NSSSs for safe, stable, robust and efficient operation. In this paper, a new model-free coordinated control strategy that regulates the nuclear power, MHTGR outlet helium temperature and OTSG outlet overheated steam temperature by properly adjusting the control rod position, helium flowrate and feed-water flowrate is established for the MHTGR-based NSSSs. Sufficient conditions for global asymptotic closed-loop stability are given. Finally, numerical simulation results in the cases of large range power decrease and increase illustrate the satisfactory performance of this newly-developed model-free coordinated NSSS control law.

  14. Model-free prediction and regression a transformation-based approach to inference

    CERN Document Server

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...

  15. Model-free methods of analyzing domain motions in proteins from simulation : A comparison of normal mode analysis and molecular dynamics simulation of lysozyme

    NARCIS (Netherlands)

    Hayward, S; Kitao, A; Berendsen, HJC

    1997-01-01

    Model-free methods are introduced to determine quantities pertaining to protein domain motions from normal mode analyses and molecular dynamics simulations. For the normal mode analysis, the methods are based on the assumption that in low frequency modes, domain motions can be well approximated by m

  16. Model-free control

    OpenAIRE

    Fliess, Michel; Join, Cédric

    2013-01-01

    International audience; "Model-free control" and the corresponding "intelligent" PID controllers (iPIDs), which already had many successful concrete applications, are presented here for the first time in a unified manner, where the new advances are taken into account. The basics of model-free control now employ some old functional analysis and some elementary differential algebra. The estimation techniques become quite straightforward via a recent online parameter identification ap...

  17. Model-based learning and the contribution of the orbitofrontal cortex to the model-free world.

    Science.gov (United States)

    McDannald, Michael A; Takahashi, Yuji K; Lopatina, Nina; Pietras, Brad W; Jones, Josh L; Schoenbaum, Geoffrey

    2012-04-01

    Learning is proposed to occur when there is a discrepancy between reward prediction and reward receipt. At least two separate systems are thought to exist: one in which predictions are proposed to be based on model-free or cached values; and another in which predictions are model-based. A basic neural circuit for model-free reinforcement learning has already been described. In the model-free circuit the ventral striatum (VS) is thought to supply a common-currency reward prediction to midbrain dopamine neurons that compute prediction errors and drive learning. In a model-based system, predictions can include more information about an expected reward, such as its sensory attributes or current, unique value. This detailed prediction allows for both behavioral flexibility and learning driven by changes in sensory features of rewards alone. Recent evidence from animal learning and human imaging suggests that, in addition to model-free information, the VS also signals model-based information. Further, there is evidence that the orbitofrontal cortex (OFC) signals model-based information. Here we review these data and suggest that the OFC provides model-based information to this traditional model-free circuitry and offer possibilities as to how this interaction might occur.

  18. Integrating cortico-limbic-basal ganglia architectures for learning model-based and model-free navigation strategies

    Directory of Open Access Journals (Sweden)

    Mehdi eKhamassi

    2012-11-01

    Full Text Available Behaviour in spatial navigation is often organised into map-based (place-driven) versus map-free (cue-driven) strategies; behaviour in operant conditioning research is often organised into goal-directed versus habitual strategies. Here we attempt to unify the two. We review one powerful theory for distinct forms of learning during instrumental conditioning, namely model-based (maintaining a representation of the world) and model-free (reacting to immediate stimuli) learning algorithms. We extend these lines of argument to propose an alternative taxonomy for spatial navigation, showing how various previously identified strategies can be distinguished as model-based or model-free depending on the usage of information and not on the type of information (e.g. cue vs. place). We argue that identifying model-free learning with dorsolateral striatum and model-based learning with dorsomedial striatum could reconcile numerous conflicting results in the spatial navigation literature. From this perspective, we further propose that the ventral striatum plays key roles in the model-building process. We propose that the core of the ventral striatum is positioned to learn the probability of action selection for every transition between states of the world. We further review suggestions that the ventral striatal core and shell are positioned to act as critics contributing to the computation of a reward prediction error for model-free and model-based systems, respectively.

  19. Model-free functional MRI analysis for detecting low-frequency functional connectivity in the human brain

    Science.gov (United States)

    Wismueller, Axel; Lange, Oliver; Auer, Dorothee; Leinsinger, Gerda

    2010-03-01

    Slowly varying temporally correlated activity fluctuations between functionally related brain areas have been identified by functional magnetic resonance imaging (fMRI) research in recent years. These low-frequency oscillations of less than 0.08 Hz appear to play a major role in various dynamic functional brain networks, such as the so-called 'default mode' network. They also have been observed as a property of symmetric cortices, and they are known to be present in the motor cortex among others. These low-frequency data are difficult to detect and quantify in fMRI. Traditionally, user-defined regions of interest (ROIs) or 'seed clusters' have been the primary analysis method. In this paper, we propose unsupervised clustering algorithms based on various distance measures to detect functional connectivity in resting state fMRI. The achieved results are evaluated quantitatively for different distance measures. The Euclidean metric implemented by standard unsupervised clustering approaches is compared with a non-metric topographic mapping of proximities based on the mutual prediction error between pixel-specific signal dynamics time-series. It is shown that functional connectivity in the motor cortex of the human brain can be detected based on such model-free analysis methods for resting state fMRI.

  20. A Model-free Approach to Fault Detection of Continuous-time Systems Based on Time Domain Data

    Institute of Scientific and Technical Information of China (English)

    Ping Zhang; Steven X. Ding

    2007-01-01

    In this paper, a model-free approach is presented to design an observer-based fault detection system of linear continuous-time systems based on input and output data in the time domain. The core of the approach is to directly identify parameters of the observer-based residual generator based on a numerically reliable data equation obtained by filtering and sampling the input and output signals.

  1. Landmark-based model-free 3D face shape reconstruction from video sequences

    NARCIS (Netherlands)

    Dam, van Chris; Veldhuis, Raymond; Spreeuwers, Luuk; Broemme, A.; Busch, C.

    2013-01-01

    In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence potentially useful video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm b

  2. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    OpenAIRE

    Skatova, Anya; Chan, Patricia A.; Daw, Nathaniel D.

    2013-01-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, vs....

  3. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    OpenAIRE

    Anya eSkatova; Patricia Angie Chan; Daw, Nathaniel D.

    2013-01-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, ver...

  4. Thermokinetics analysis of biomass based on model-free different heating rate method

    Institute of Scientific and Technical Information of China (English)

    田宜水; 王茹

    2016-01-01

    To study the thermokinetics of typical biomass feedstocks, identify the reaction mechanisms, and obtain the kinetic rate parameters, thermogravimetric analysis was used to investigate the pyrolysis characteristics of corn straw, wheat straw, cotton stalk, pine sawdust, peanut shell and sweet sorghum residue at different heating rates under a nitrogen atmosphere. The activation energies were calculated with the Friedman and Flynn-Wall-Ozawa methods, the most probable mechanism function was determined with the Malek method, a thermo-analytical kinetic model of the biomass was established, and the differences among the biomass types were discussed. The results show that biomass pyrolysis comprises three main stages: a drying/preheating stage, a devolatilization stage and a charring stage. The activation energy of typical biomass increases with increasing conversion; in the devolatilization stage the pyrolysis activation energy lies between 144.61 and 167.34 kJ/mol. The reaction kinetic mechanisms all conform to the Avrami-Erofeev function, although the reaction orders differ somewhat, and the pre-exponential factors lie between 26.66 and 33.97 s-1. This provides a theoretical basis for optimizing the process conditions and for engineering scale-up of biomass thermochemical conversion. English abstract: Thermokinetics analysis can test the relationship between physical and chemical properties of a material and temperature through controlled heating rates. Through thermokinetics analysis, we can study the combustion, pyrolysis and gasification reaction kinetics of biomass, decide the reaction kinetics model and calculate the reaction kinetics parameters, such as activation energy and pre-exponential factor. In this article, we chose 6 kinds of biomass raw materials, including corn straw, wheat straw, cotton stalk, pine sawdust, peanut shell, and residue of sweet sorghum. The thermal gravity analysis (TG) experiments were carried out, and 8 mass loss curves were obtained under non-isothermal conditions at linear heating rates of 5, 10, 20 and 30℃/min. 99.99% nitrogen was passed continuously and the temperature rose from room temperature to 600℃. The initial sample weight was always within the range of 3-4 mg. The method of different heating rates was applied to the non-isothermal data. The Friedman method and the...

  5. Kinetics of the Thermal Degradation of Granulated Scrap Tyres: a Model-free Analysis

    Directory of Open Access Journals (Sweden)

    Félix A. LÓPEZ

    2013-12-01

    Full Text Available Pyrolysis is a technology with a promising future in the recycling of scrap tyres. This paper determines the thermal decomposition behaviour and kinetics of granulated scrap tyres (GST) by examining the thermogravimetric/derivative thermogravimetric (TGA/DTG) data obtained during their pyrolysis in an inert atmosphere at different heating rates. The model-free methods of Friedman, Flynn-Wall-Ozawa and Coats-Redfern were used to determine the reaction kinetics from the DTG data. The apparent activation energy and pre-exponential factor for the degradation of GST were calculated. A comparison with the results obtained by other authors was made. DOI: http://dx.doi.org/10.5755/j01.ms.19.4.2947
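    As a companion to the Friedman sketch given earlier in this listing, the Flynn-Wall-Ozawa calculation can be illustrated as follows: for each conversion level, ln(heating rate) is regressed on 1/T across runs and, with Doyle's approximation, the slope equals -1.052 Ea/R. The input layout is an assumption of this sketch, not the authors' code.

```python
# Illustrative Flynn-Wall-Ozawa (model-free) calculation: for each conversion,
# regress ln(heating rate) on 1/T across runs; with Doyle's approximation the
# slope equals -1.052*Ea/R. The input layout is an assumption of this sketch.
import numpy as np

R = 8.314  # J/(mol K)

def fwo_ea(betas, T_at_alpha):
    """betas: heating rates (K/min); T_at_alpha: (n_alphas x n_betas) matrix of
    temperatures (K) at which each conversion level is reached in each run."""
    ln_beta = np.log(betas)
    ea = []
    for T_row in T_at_alpha:
        slope, _ = np.polyfit(1.0 / T_row, ln_beta, 1)
        ea.append(-slope * R / 1.052)    # activation energy in J/mol
    return np.array(ea)
```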

  6. Model-free control

    Science.gov (United States)

    Fliess, Michel; Join, Cédric

    2013-12-01

    'Model-free control' and the corresponding 'intelligent' PID controllers (iPIDs), which already had many successful concrete applications, are presented here for the first time in a unified manner, where the new advances are taken into account. The basics of model-free control now employ some old functional analysis and some elementary differential algebra. The estimation techniques become quite straightforward via a recent online parameter identification approach. The importance of iPIs and especially of iPs is deduced from the presence of friction. The strange industrial ubiquity of classic PIDs and the great difficulty of tuning them in complex situations are deduced, via an elementary sampling, from their connections with iPIDs. Several numerical simulations are presented which include some infinite-dimensional systems. They demonstrate not only the power of our intelligent controllers but also the great simplicity of tuning them.
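    The flavor of such an 'intelligent' controller can be conveyed with a small sketch of an intelligent proportional (iP) loop built on an ultra-local model dy/dt ≈ F + alpha*u, where F is re-estimated from recent data and cancelled in the control law. The gains, alpha and the crude derivative estimate are illustrative assumptions, not values from the paper.

```python
# Very small sketch of an "intelligent proportional" (iP) controller in the
# spirit of model-free control: an ultra-local model dy/dt ~ F + alpha*u is
# assumed, F is re-estimated from recent data each step, and that estimate is
# cancelled in the control law. Gains and the derivative estimate are
# illustrative assumptions only.
def ip_controller(y, y_prev, u_prev, y_ref, y_ref_dot, dt, alpha=1.0, kp=2.0):
    y_dot = (y - y_prev) / dt          # crude derivative estimate of the output
    F_hat = y_dot - alpha * u_prev     # estimate of the lumped unknown dynamics
    e = y - y_ref                      # tracking error
    u = (-F_hat + y_ref_dot - kp * e) / alpha
    return u
```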

  7. Model-free analysis of quadruply imaged gravitationally lensed systems and substructured galaxies

    CERN Document Server

    Woldesenbet, Addishiwot Girma

    2015-01-01

    Multiple image gravitational lens systems, and especially quads are invaluable in determining the amount and distribution of mass in galaxies. This is usually done by mass modeling using parametric or free-form methods. An alternative way of extracting information about lens mass distribution is to use lensing degeneracies and invariants. Where applicable, they allow one to make conclusions about whole classes of lenses without model fitting. Here, we use approximate, but observationally useful invariants formed by the three relative polar angles of quad images around the lens center to show that many smooth elliptical+shear lenses can reproduce the same set of quad image angles within observational error. This result allows us to show in a model-free way what the general class of smooth elliptical+shear lenses looks like in the three dimensional (3D) space of image relative angles, and that this distribution does not match that of the observed quads. We conclude that, even though smooth elliptical+shear lens...

  8. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, both the model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) with model-based, reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turned to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by the efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS).

  9. Vision-Based Autonomous Underwater Vehicle Navigation in Poor Visibility Conditions Using a Model-Free Robust Control

    Directory of Open Access Journals (Sweden)

    Ricardo Pérez-Alcocer

    2016-01-01

    Full Text Available This paper presents a vision-based navigation system for an autonomous underwater vehicle in semistructured environments with poor visibility. In terrestrial and aerial applications, the use of visual systems mounted on robotic platforms as control sensor feedback is commonplace. However, robotic vision-based tasks for underwater applications are still not widely considered, as the images captured in this type of environment tend to be blurred and/or color depleted. To tackle this problem, we have adapted the lαβ color space to identify features of interest in underwater images even in extreme visibility conditions. To guarantee the stability of the vehicle at all times, a model-free robust control is used. We have validated the performance of our visual navigation system in real environments, showing the feasibility of our approach.

  10. Language acquisition is model-based rather than model-free.

    Science.gov (United States)

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  11. Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems

    OpenAIRE

    Zhe Dong

    2016-01-01

    The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to its small size, the MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the mult...

  12. Model-free fuzzy control of a magnetorheological elastomer vibration isolation system: analysis and experimental evaluation

    Science.gov (United States)

    Fu, Jie; Li, Peidong; Wang, Yuan; Liao, Guanyao; Yu, Miao

    2016-03-01

    This paper addresses the problem of micro-vibration control of a precision vibration isolation system with a magnetorheological elastomer (MRE) isolator and a fuzzy control strategy. Firstly, a polyurethane matrix MRE isolator working in the shear-compression mixed mode is introduced. The dynamic characteristic is experimentally tested, and the range of the frequency shift and the model parameters of the MRE isolator are obtained from the experimental results. Secondly, a new semi-active control law is proposed, which uses the isolation structure displacement and the relative displacement between the isolation structure and the base as the inputs. Considering the nonlinearity of the MRE isolator and the excitation uncertainty of the isolation system, the designed semi-active fuzzy logic controller (FLC) is independent of a system model and is robust. Finally, numerical simulations and experiments are conducted to evaluate the performance of the FLC with single-frequency and multiple-frequency excitation, respectively, and the experimental results show that the acceleration transmissibility is reduced by up to 54.04%, which verifies the effectiveness of the designed semi-active FLC. Moreover, the advantages of the approach are demonstrated in comparison to passive control and ON-OFF control.

  13. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data.

    Science.gov (United States)

    Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian

    2016-01-01

    Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus
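    One example of such a model-free feature extraction is estimating the preferred direction as a response-weighted circular mean over the tested directions, without fitting any tuning-curve model. The sketch below is a generic illustration under that assumption, not the authors' pipeline.

```python
# Hedged example of extracting a tuning-curve feature without fitting a model:
# the preferred direction estimated as the circular (vector-average) mean of
# responses over the tested motion directions. Inputs are illustrative.
import numpy as np

def preferred_direction(directions_deg, responses):
    """directions_deg: tested stimulus directions; responses: mean firing rates."""
    theta = np.deg2rad(np.asarray(directions_deg, float))
    resp = np.asarray(responses, float)
    weights = resp - resp.min()                     # keep weights non-negative
    vec = np.sum(weights * np.exp(1j * theta))      # resultant response vector
    return np.rad2deg(np.angle(vec)) % 360          # model-free preferred direction
```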

  14. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data.

    Science.gov (United States)

    Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian

    2016-01-01

    Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus

  15. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data.

    Directory of Open Access Journals (Sweden)

    Markus Helmer

    Full Text Available Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even

  16. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data

    Science.gov (United States)

    Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian

    2016-01-01

    Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus

  17. Dynamics of GCN4 facilitate DNA interaction: a model-free analysis of an intrinsically disordered region.

    Science.gov (United States)

    Gill, Michelle L; Byrd, R Andrew; Palmer III, Arthur G

    2016-02-17

    Intrinsically disordered proteins (IDPs) and proteins with intrinsically disordered regions (IDRs) are known to play important roles in regulatory and signaling pathways. A critical aspect of these functions is the ability of IDP/IDRs to form highly specific complexes with target molecules. However, elucidation of the contributions of conformational dynamics to function has been limited by challenges associated with structural heterogeneity of IDP/IDRs. Using NMR spin relaxation parameters ((15)N R1, (15)N R2, and {(1)H}-(15)N heteronuclear NOE) collected at four static magnetic fields ranging from 14.1 to 21.1 T, we have analyzed the backbone dynamics of the basic leucine-zipper (bZip) domain of the Saccharomyces cerevisiae transcription factor GCN4, whose DNA binding domain is intrinsically disordered in the absence of DNA substrate. We demonstrate that the extended model-free analysis can be applied to proteins with IDRs such as apo GCN4 and that these results significantly extend previous NMR studies of GCN4 dynamics performed using a single static magnetic field of 11.74 T [Bracken, et al., J. Mol. Biol., 1999, 285, 2133-2146] and correlate well with molecular dynamics simulations [Robustelli, et al., J. Chem. Theory Comput., 2013, 9, 5190-5200]. In contrast to the earlier work, data at multiple static fields allows the time scales of internal dynamics of GCN4 to be reliably quantified. Large amplitude dynamic fluctuations in the DNA-binding region have correlation times (τs ≈ 1.4-2.5 ns) consistent with a two-step mechanism in which partially ordered bZip conformations of GCN4 form initial encounter complexes with DNA and then rapidly rearrange to the high affinity state with fully formed basic region recognition helices. PMID:26661739

  18. Re-evaluation of the model-free analysis of fast internal motion in proteins using NMR relaxation.

    Science.gov (United States)

    Frederick, Kendra King; Sharp, Kim A; Warischalk, Nicholas; Wand, A Joshua

    2008-09-25

    NMR spin relaxation retains a central role in the characterization of the fast internal motion of proteins and their complexes. Knowledge of the distribution and amplitude of the motion of amino acid side chains is critical for the interpretation of the dynamical proxy for the residual conformational entropy of proteins, which can potentially significantly contribute to the entropy of protein function. A popular treatment of NMR relaxation phenomena in macromolecules dissolved in liquids is the so-called model-free approach of Lipari and Szabo. The robustness of the model-free approach has recently been strongly criticized and the remarkable range and structural context of the internal motion of proteins, characterized by such NMR relaxation techniques, attributed to artifacts arising from the model-free treatment, particularly with respect to the symmetry of the underlying motion. We develop an objective quantification of both spatial and temporal asymmetry of motion and re-examine the foundation of the model-free treatment. Concerns regarding the robustness of the model-free approach to asymmetric motion appear to be generally unwarranted. The generalized order parameter is robustly recovered. The sensitivity of the model-free treatment to asymmetric motion is restricted to the effective correlation time, which is by definition a normalized quantity and not a true time constant and therefore of much less interest in this context. With renewed confidence in the model-free approach, we then examine the microscopic distribution of side chain motion in the complex between calcium-saturated calmodulin and the calmodulin-binding domain of the endothelial nitric oxide synthase. Deuterium relaxation is used to characterize the motion of methyl groups in the complex. A remarkable range of Lipari-Szabo model-free generalized order parameters are seen with little correlation with basic structural parameters such as the depth of burial. These results are contrasted with the
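    For reference, the quantity at the heart of these analyses is the Lipari-Szabo model-free spectral density; a sketch of its original single-time-scale form is given below (the extended form used for disordered regions adds a second internal term). This is a standard textbook expression, not code from either study.

```python
# Sketch of the original (single-time-scale) Lipari-Szabo model-free spectral
# density underlying analyses like those above; the extended two-time-scale
# form adds a further internal term with its own order parameter.
def lipari_szabo_J(omega, S2, tau_m, tau_e):
    """omega: angular frequency (rad/s); S2: generalized order parameter;
    tau_m: overall tumbling time (s); tau_e: effective internal correlation time (s)."""
    tau = 1.0 / (1.0 / tau_m + 1.0 / tau_e)
    return 0.4 * (S2 * tau_m / (1 + (omega * tau_m) ** 2)
                  + (1 - S2) * tau / (1 + (omega * tau) ** 2))
```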

  19. Respective Advantages and Disadvantages of Model-based and Model-free Reinforcement Learning in a Robotics Neuro-inspired Cognitive Architecture

    OpenAIRE

    Renaudo, Erwan; Girard, Benoît; Chatila, Raja; Khamassi, Mehdi

    2015-01-01

    International audience Combining model-based and model-free reinforcement learning systems in robotic cognitive architectures appears as a promising direction to endow artificial agents with flexibility and decisional autonomy close to mammals. In particular, it could enable robots to build an internal model of the environment, plan within it in response to detected environmental changes, and avoid the cost and time of planning when the stability of the environment is recognized as enablin...

  20. Connectivity concordance mapping: a new tool for model-free analysis of fMRI data of the human brain

    Directory of Open Access Journals (Sweden)

    Gabriele eLohmann

    2012-03-01

    Full Text Available Functional magnetic resonance data acquired in a task-absent condition ("resting state") require new data analysis techniques that do not depend on an activation model. Here, we propose a new analysis method called "Connectivity Concordance Mapping" (CCM). The main idea is to assign a label to each voxel based on the reproducibility of its whole-brain pattern of connectivity. Specifically, we compute the correlations across measurements of each voxel's correlation-based functional connectivity map, resulting in a voxelwise map of concordance values. Regions of high interscan concordance can be assumed to be functionally consistent, and may thus be of specific interest for further analysis. Here we present two fMRI studies to test the algorithm. The first is an eyes-open/eyes-closed paradigm designed to highlight the potential of the method in a relatively simple state-dependent domain. The second study is a longitudinal repeated measurement of a patient following stroke. Longitudinal clinical studies such as this may represent the most interesting domain of applications for this algorithm, as it provides an exploratory means to identify changes in connectivity, such as those during post-stroke recovery.

  1. Connectivity Concordance Mapping: A New Tool for Model-Free Analysis of fMRI Data of the Human Brain

    Science.gov (United States)

    Lohmann, Gabriele; Ovadia-Caro, Smadar; Jungehülsing, Gerhard Jan; Margulies, Daniel S.; Villringer, Arno; Turner, Robert

    2011-01-01

    Functional magnetic resonance data acquired in a task-absent condition (“resting state”) require new data analysis techniques that do not depend on an activation model. Here, we propose a new analysis method called Connectivity Concordance Mapping (CCM). The main idea is to assign a label to each voxel based on the reproducibility of its whole-brain pattern of connectivity. Specifically, we compute the correlations of time courses of each voxel with every other voxel for each measurement. Voxels whose correlation pattern is consistent across measurements receive high values. The result of a CCM analysis is thus a voxel-wise map of concordance values. Regions of high inter-subject concordance can be assumed to be functionally consistent, and may thus be of specific interest for further analysis. Here we present two fMRI studies to demonstrate the possible applications of the algorithm. The first is an eyes-open/eyes-closed paradigm designed to highlight the potential of the method in a relatively simple domain. The second study is a longitudinal repeated measurement of a patient following stroke. Longitudinal clinical studies such as this may represent the most interesting domain of applications for this algorithm. PMID:22470320
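
    The concordance computation described in these two records reduces to a compact recipe: build a whole-brain connectivity map per voxel and per scan, then correlate those maps across scans (or subjects) voxel by voxel. The snippet below is a minimal NumPy illustration of that recipe under assumed array shapes; it is not the authors' implementation.

```python
import numpy as np

def concordance_map(scans):
    """Connectivity Concordance Mapping sketch.
    scans: sequence of (n_timepoints, n_voxels) arrays, one per measurement."""
    conn = []
    for ts in scans:
        z = (ts - ts.mean(0)) / (ts.std(0) + 1e-12)   # standardise each voxel's time course
        conn.append((z.T @ z) / len(ts))              # (n_voxels, n_voxels) connectivity map
    conn = np.stack(conn)                             # (n_scans, n_voxels, n_voxels)

    n_scans, n_vox, _ = conn.shape
    ccm = np.empty(n_vox)
    for v in range(n_vox):
        maps = conn[:, v, :]                          # voxel v's connectivity map in every scan
        r = np.corrcoef(maps)                         # scan-by-scan correlation of those maps
        ccm[v] = r[np.triu_indices(n_scans, 1)].mean()
    return ccm                                        # high value = reproducible connectivity
```

    Regions with high values in the returned map are the functionally consistent regions the abstracts refer to.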

  2. Whole-brain, time-locked activation with simple tasks revealed using massive averaging and model-free analysis

    Science.gov (United States)

    Gonzalez-Castillo, Javier; Saad, Ziad S.; Handwerker, Daniel A.; Inati, Souheil J.; Brenowitz, Noah; Bandettini, Peter A.

    2012-01-01

    The brain is the body's largest energy consumer, even in the absence of demanding tasks. Electrophysiologists report on-going neuronal firing during stimulation or task in regions beyond those of primary relationship to the perturbation. Although the biological origin of consciousness remains elusive, it is argued that it emerges from complex, continuous whole-brain neuronal collaboration. Despite converging evidence suggesting the whole brain is continuously working and adapting to anticipate and actuate in response to the environment, over the last 20 y, task-based functional MRI (fMRI) studies have emphasized a localizationist view of brain function, with fMRI showing only a handful of activated regions in response to task/stimulation. Here, we challenge that view with evidence that under optimal noise conditions, fMRI activations extend well beyond areas of primary relationship to the task; and blood-oxygen level-dependent signal changes correlated with task-timing appear in over 95% of the brain for a simple visual stimulation plus attention control task. Moreover, we show that response shape varies substantially across regions, and that whole-brain parcellations based on those differences produce distributed clusters that are anatomically and functionally meaningful, symmetrical across hemispheres, and reproducible across subjects. These findings highlight the exquisite detail lying in fMRI signals beyond what is normally examined, and emphasize both the pervasiveness of false negatives, and how the sparseness of fMRI maps is not a result of localized brain function, but a consequence of high noise and overly strict predictive response models. PMID:22431587

  3. Neuron model-free PID control

    Science.gov (United States)

    Wang, Ning; Zhang, Li; Wang, Shuqing

    2001-09-01

    Based on a neuron model and learning strategy, a neuron-based intelligent PID control system is set up in this paper, and a neuron model-free PID control method is proposed. Simulation tests are made on an example of a hydraulic turbine generator unit. The results show that good control performance is obtained. This new intelligent controller is very simple and has strong adaptability and robustness, and it can be used directly in practice.
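
    Single-neuron "model-free" PID schemes of this kind typically feed the proportional, integral and derivative error terms into one neuron and adapt its weights online from the tracking error alone, with no plant model. The sketch below illustrates that general idea; the supervised-Hebbian update rule, gains and learning rates are assumptions for illustration, not the controller described in the record.

```python
import numpy as np

class NeuronPID:
    """Single-neuron adaptive PID: weights on (e, sum(e), delta(e)) adapted online."""
    def __init__(self, K=0.5, eta=(0.2, 0.1, 0.2)):
        self.K = K                                  # overall neuron gain
        self.eta = np.asarray(eta, dtype=float)     # learning rates for the three weights
        self.w = np.ones(3) / 3.0                   # initial P, I, D weights
        self.e_prev = 0.0
        self.e_sum = 0.0
        self.u = 0.0

    def step(self, ref, y):
        e = ref - y
        self.e_sum += e
        x = np.array([e, self.e_sum, e - self.e_prev])    # neuron inputs (P, I, D terms)
        self.w += self.eta * e * self.u * x               # supervised Hebbian-style update
        w_norm = np.abs(self.w) / np.abs(self.w).sum()    # normalised weights
        self.u = self.K * float(w_norm @ x)               # control output
        self.e_prev = e
        return self.u
```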

  4. Model-free distributed learning

    Science.gov (United States)

    Dembo, Amir; Kailath, Thomas

    1990-01-01

    Model-free learning for synchronous and asynchronous quasi-static networks is presented. The network weights are continuously perturbed, while the time-varying performance index is measured and correlated with the perturbation signals; the correlation output determines the changes in the weights. The perturbation may be either via noise sources or orthogonal signals. The invariance to detailed network structure mitigates large variability between supposedly identical networks as well as implementation defects. This local, regular, and completely distributed mechanism requires no central control and involves only a few global signals. Thus it allows for integrated on-chip learning in large analog and optical networks.
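
    The perturb, measure and correlate loop described here amounts to stochastic gradient estimation by weight perturbation. A minimal sketch of that idea follows; the quadratic toy objective, step sizes and noise level are assumptions chosen only to make the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturbation_learning(loss, w, sigma=0.01, lr=0.05, steps=2000):
    """Model-free update: correlate the measured change in the performance index
    with the random perturbation applied to the weights."""
    for _ in range(steps):
        dw = sigma * rng.standard_normal(w.shape)   # perturb all weights simultaneously
        delta = loss(w + dw) - loss(w)              # measured change in performance
        w = w - lr * delta * dw / sigma**2          # correlation output drives the update
    return w

# toy usage: minimise a quadratic performance index without ever using its gradient
quadratic = lambda w: float(np.sum((w - 1.0) ** 2))
print(perturbation_learning(quadratic, np.zeros(5)))   # approaches a vector of ones
```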

  5. Can model-free reinforcement learning explain deontological moral judgments?

    Science.gov (United States)

    Ayars, Alisabeth

    2016-05-01

    Dual-systems frameworks propose that moral judgments are derived from both an immediate emotional response, and controlled/rational cognition. Recently Cushman (2013) proposed a new dual-system theory based on model-free and model-based reinforcement learning. Model-free learning attaches values to actions based on their history of reward and punishment, and explains some deontological, non-utilitarian judgments. Model-based learning involves the construction of a causal model of the world and allows for far-sighted planning; this form of learning fits well with utilitarian considerations that seek to maximize certain kinds of outcomes. I present three concerns regarding the use of model-free reinforcement learning to explain deontological moral judgment. First, many actions that humans find aversive from model-free learning are not judged to be morally wrong. Moral judgment must require something in addition to model-free learning. Second, there is a dearth of evidence for central predictions of the reinforcement account-e.g., that people with different reinforcement histories will, all else equal, make different moral judgments. Finally, to account for the effect of intention within the framework requires certain assumptions which lack support. These challenges are reasonable foci for future empirical/theoretical work on the model-free/model-based framework. PMID:26918742
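
    In this dual-system framing, "model-free" means that action values are cached from reward history rather than computed from a model of outcomes. A minimal sketch of such a cached-value update, included only to make the distinction concrete (the states, actions and rewards are invented for illustration):

```python
def model_free_update(q, state, action, reward, alpha=0.1):
    """Cache an action value from reward history alone; no outcome model is consulted."""
    key = (state, action)
    q[key] = q.get(key, 0.0) + alpha * (reward - q.get(key, 0.0))
    return q

# an action that has repeatedly been punished acquires a negative cached value,
# even in contexts where a model-based evaluation of consequences might favour it
q = {}
for _ in range(50):
    q = model_free_update(q, "footbridge", "push", reward=-1.0)
print(q)
```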

  7. Is there any correlation between model-based perfusion parameters and model-free parameters of time-signal intensity curve on dynamic contrast enhanced MRI in breast cancer patients?

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)

    2014-05-15

    To find out whether there is any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and to evaluate correlations between perfusion parameters and histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analysing the time-signal intensity curve, without the complicated acquisition process required for the model-based parameters. (orig.)
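
    The model-based parameters named here (Ktrans, Kep, Ve) are normally defined through a Tofts-type tracer-kinetic model; the relations below are the standard textbook form and are included only as context, not quoted from the record.

```latex
% Standard (Tofts) tracer-kinetic relations for DCE-MRI
C_t(t) = K^{\mathrm{trans}} \int_0^{t} C_p(\tau)\, e^{-k_{ep}(t-\tau)}\, d\tau,
\qquad
k_{ep} = \frac{K^{\mathrm{trans}}}{v_e}
```

    Here C_t is the tissue tracer concentration, C_p the arterial input function and v_e (Ve) the extravascular extracellular volume fraction; the model-free quantities discussed above (time to peak, washout slope) are instead read directly from the time-signal intensity curve, which is what makes the reported correlations practically useful.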

  8. Internal motions in yeast phenylalanine transfer RNA from 13C NMR relaxation rates of modified base methyl groups: a model-free approach

    International Nuclear Information System (INIS)

    Internal motions at specific locations through yeast phenylalanine tRNA were measured by using nucleic acid biosynthetically enriched in 13C at modified base methyl groups. Carbon NMR spectra of isotopically enriched tRNA(Phe) reveal 12 individual peaks for 13 of the 14 methyl groups known to be present. The two methyls of N2,N2-dimethylguanosine (m22G-26) have indistinguishable resonances, whereas the fourteenth methyl, bound to ring carbon-11 of the hypermodified nucleoside 3' adjacent to the anticodon, wyosine (Y-37), does not come from the [methyl-13C]methionine substrate. Assignments to individual nucleosides within the tRNA were made on the basis of chemical shifts of the mononucleosides and correlation of 13C resonances with proton NMR chemical shifts via two-dimensional heteronuclear proton-carbon correlation spectroscopy. Values of 13C longitudinal relaxation (T1) and the nuclear Overhauser enhancements (NOE) were determined at 22.5, 75.5, and 118 MHz for tRNA(Phe) in a physiological buffer solution with 10 mM MgCl2, at 22°C. These data were used to extract two physical parameters that define the system with regard to fast internal motion: the generalized order parameters (S2) and effective correlation times (τe) for internal motion of the C-H internuclear vectors. For all methyl groups the generalized order parameter varied from 0.057 to 0.108, compared with the value of 0.111 predicted for a rapidly spinning methyl group rigidly mounted on a spherical macromolecule. Values of τe ranged from 4 to 16 ps, generally shorter times than measured in other work for amino acid methyl groups in several proteins. Somewhat surprising was the finding that the two methyl esters terminating the Y-37 side chain have order parameters similar to those of other methyls in tRNA and only 25% less than that for a methyl directly bonded to the base

  9. Model-free learning from demonstration

    OpenAIRE

    Billing, Erik; Hellström, Thomas; Janlert, Lars Erik

    2010-01-01

    A novel robot learning algorithm called Predictive Sequence Learning (PSL) is presented and evaluated. PSL is a model-free prediction algorithm inspired by the dynamic temporal difference algorithm S-Learning. While S-Learning has previously been applied as a reinforcement learning algorithm for robots, PSL is here applied to a Learning from Demonstration problem. The proposed algorithm is evaluated on four tasks using a Khepera II robot. PSL builds a model from demonstrated data which is use...

  10. Dynamic analysis of parallel mechanism and its model-free intelligent control

    Institute of Scientific and Technical Information of China (English)

    高国琴; 宋庆; 夏文娟

    2011-01-01

    To deal with the high nonlinearity, strong coupling and complex mathematical model of parallel robots, an intelligent control method is presented for a 2-DOF parallel robot driven by AC servo motors, the GPM-200 parallel mechanism. Under model-free conditions, and without solving the forward kinematics of the manipulator, a fuzzy PID controller is designed in which the fuzzy logic tunes the PID parameters in real time, and its performance is compared with a linear PID controller. Dynamic simulation results based on MATLAB demonstrate that the fuzzy PID controller has better performance in trajectory tracking and in stability under load than the linear PID controller, and can meet the parallel robot's requirements for high-precision, real-time control.

  11. Model-Free Adaptive Control Algorithm with Data Dropout Compensation

    Directory of Open Access Journals (Sweden)

    Xuhui Bu

    2012-01-01

    Full Text Available The convergence of the model-free adaptive control (MFAC) algorithm can be guaranteed when the system is subject to measurement data dropout, but the convergence of the system output becomes slower as the dropout rate increases. This paper proposes an MFAC algorithm with data compensation. The missing data are first estimated using the dynamical linearization method, and the estimated values are then used to update the control input. The convergence analysis of the proposed MFAC algorithm is given, and its effectiveness is also validated by simulations. It is shown that the proposed algorithm can compensate for the effect of the data dropout and that better output performance can be obtained.
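
    Compact-form MFAC schemes of this type estimate a pseudo-partial-derivative (PPD) online from input/output increments and, when a measurement is dropped, substitute a one-step prediction from the same dynamic linearization. The sketch below illustrates that idea; the update laws follow the usual compact-form MFAC template, and the tuning constants, dropout model and plant function are assumptions, not the paper's exact algorithm.

```python
import numpy as np

def mfac_with_dropout(plant, ref, eta=0.8, mu=1.0, rho=0.6, lam=1.0, p_drop=0.2, seed=0):
    """Compact-form MFAC with one-step compensation of dropped measurements.
    plant(y, u) -> next output; it is only sampled, never modelled, by the controller."""
    rng = np.random.default_rng(seed)
    y_true = 0.0          # actual plant output (hidden from the controller on dropouts)
    y_ctrl = 0.0          # output value the controller works with
    u = u_prev = 0.0
    phi = 1.0             # pseudo-partial-derivative (PPD) estimate
    outputs = []
    for r in ref:
        y_next_true = plant(y_true, u)
        if rng.random() < p_drop:
            y_next_ctrl = y_ctrl + phi * (u - u_prev)   # dropout: dynamic-linearization estimate
        else:
            y_next_ctrl = y_next_true                   # measurement received
        du = u - u_prev
        phi += eta * du / (mu + du**2) * (y_next_ctrl - y_ctrl - phi * du)  # PPD update
        u_prev = u
        u = u + rho * phi / (lam + phi**2) * (r - y_next_ctrl)              # control update
        y_true, y_ctrl = y_next_true, y_next_ctrl
        outputs.append(y_next_true)
    return np.array(outputs)

# illustrative use: mfac_with_dropout(lambda y, u: 0.6 * y + 0.5 * u, np.ones(200))
```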

  12. Model-free constrained data-driven iterative reference input tuning algorithm with experimental validation

    Science.gov (United States)

    Radac, Mircea-Bogdan; Precup, Radu-Emil

    2016-05-01

    This paper presents the design and experimental validation of a new model-free data-driven iterative reference input tuning (IRIT) algorithm that solves a reference trajectory tracking problem as an optimization problem with control signal saturation constraints and control signal rate constraints. The IRIT algorithm design employs an experiment-based stochastic search algorithm to use the advantages of iterative learning control. The experimental results validate the IRIT algorithm applied to a non-linear aerodynamic position control system. The results prove that the IRIT algorithm offers significant control system performance improvement with few iterations and experiments conducted on the real-world process, together with model-free parameter tuning.

  13. Model-free 3D face shape reconstruction from video sequences

    NARCIS (Netherlands)

    Dam, van Chris; Veldhuis, Raymond; Spreeuwers, Luuk

    2013-01-01

    In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence much video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm based on 2D lan

  14. Totally Model-Free Learned Skillful Coping

    Science.gov (United States)

    Dreyfus, Stuart E.

    2004-01-01

    The author proposes a neural-network-based explanation of how a brain might acquire intuitive expertise. The explanation is intended merely to be suggestive and lacks many complexities found in even lower animal brains. Yet significantly, even this simplified brain model is capable of explaining the acquisition of simple skills without developing…

  15. A novel model-free approach for reconstruction of time-delayed gene regulatory networks

    Institute of Scientific and Technical Information of China (English)

    JIANG Wei; LI Xia; GUO Zheng; LI Chuanxing; WANG Lihong

    2006-01-01

    Reconstruction of genetic networks is one of the key scientific challenges in functional genomics. This paper describes a novel approach for addressing the regulatory dependencies between genes whose activities can be delayed by multiple units of time. The aim of the proposed approach, termed TdGRN (time-delayed gene regulatory networking), is to reverse-engineer the dynamic mechanisms of gene regulation, which is realized by identifying the time-delayed gene regulations through supervised decision-tree analysis of the newly designed time-delayed gene expression matrix, derived from the original time-series microarray data. A permutation technique is used to determine the statistical classification threshold of a tree, from which a gene regulatory rule(s) is extracted. The proposed TdGRN is a model-free approach that attempts to learn the underlying regulatory rules without relying on any model assumptions. Compared with model-based approaches, it has several significant advantages: it requires neither any arbitrary threshold for discretization of gene transcriptional values nor the definition of the number of regulators (k). We have applied this novel method to the publicly available data for budding yeast cell cycling. The numerical results demonstrate that most of the identified time-delayed gene regulations are supported by current biological knowledge.
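
    The core data transformation described above can be sketched compactly: shift each candidate regulator's expression by one or more time units and use the shifted values as features to classify the target gene's state with a decision tree. The snippet below is an illustrative reconstruction with scikit-learn, not the authors' TdGRN code; the on/off discretisation of the target and the delay range are assumptions, and the permutation test used to set the classification threshold is omitted.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def time_delayed_tree(expr, target_idx, max_delay=3):
    """expr: (n_timepoints, n_genes) time-series expression matrix.
    Classify the target gene's state from delayed expression of all genes."""
    n_t, _ = expr.shape
    X, y = [], []
    for t in range(max_delay, n_t):
        # features: expression of every gene at delays 1..max_delay
        X.append(np.concatenate([expr[t - d] for d in range(1, max_delay + 1)]))
        # simple on/off discretisation of the target at time t (illustrative only)
        y.append(int(expr[t, target_idx] > np.median(expr[:, target_idx])))
    tree = DecisionTreeClassifier(max_depth=3).fit(np.array(X), np.array(y))
    return tree   # paths through the fitted tree are candidate time-delayed regulatory rules
```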

  16. Model-free control techniques for Stop & Go systems

    OpenAIRE

    Villagra, Jorge; Milanés, Vicente; Pérez Rastelli, Joshué; González, Carlos

    2010-01-01

    International audience; This paper presents a comparison of Stop & Go control algorithms, which deal with car following scenarios in urban environments. Since many vehicle/road interaction factors (road slope, aerodynamic forces) and actuator dynamics are very poorly known, two robust control strategies are proposed: an intelligent PID controller and a fuzzy controller. Both model-free techniques will be implemented and compared in simulation to show their suitability for demanding scenarios.

  17. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...

  18. Model-free adaptive control of advanced power plants

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  19. A Model-Free Method for Structural Change Detection in Multivariate Nonlinear Time Series

    Institute of Scientific and Technical Information of China (English)

    孙青华; 张世英; 梁雄健

    2003-01-01

    In this paper, we apply the recursive genetic programming (RGP) approach to the cognition of a system, and then proceed to the detection procedure for structural changes in a system whose components are of long memory. This approach is adaptive and model-free and can simulate the individual activities of the system's participants; therefore, it has a strong ability to recognize the operating mechanism of the system. Based on the previous cognition of the system, a test statistic is developed for the detection of structural changes in the system. Furthermore, an example is presented to illustrate the validity and practical value of the proposed method.

  20. Model-Free Trajectory Optimisation for Unmanned Aircraft Serving as Data Ferries for Widespread Sensors

    Directory of Open Access Journals (Sweden)

    Ben Pearre

    2012-10-01

    Full Text Available Given multiple widespread stationary data sources such as ground-based sensors, an unmanned aircraft can fly over the sensors and gather the data via a wireless link. Performance criteria for such a network may incorporate costs such as trajectory length for the aircraft or the energy required by the sensors for radio transmission. Planning is hampered by the complex vehicle and communication dynamics and by uncertainty in the locations of sensors, so we develop a technique based on model-free learning. We present a stochastic optimisation method that allows the data-ferrying aircraft to optimise data collection trajectories through an unknown environment in situ, obviating the need for system identification. We compare two trajectory representations, one that learns near-optimal trajectories at low data requirements but that fails at high requirements, and one that gives up some performance in exchange for a data collection guarantee. With either encoding the ferry is able to learn significantly improved trajectories compared with alternative heuristics. To demonstrate the versatility of the model-free learning approach, we also learn a policy to minimise the radio transmission energy required by the sensor nodes, allowing prolonged network lifetime.

  1. Policy improvement by a model-free Dyna architecture.

    Science.gov (United States)

    Hwang, Kao-Shing; Lo, Chia-Yue

    2013-05-01

    The objective of this paper is to accelerate the process of policy improvement in reinforcement learning. The proposed Dyna-style system combines two learning schemes, one of which utilizes a temporal difference method for direct learning; the other uses relative values for indirect learning in planning between two successive direct learning cycles. Instead of establishing a complicated world model, the approach introduces a simple predictor of average rewards to the actor-critic architecture in the simulation (planning) mode. The relative value of a state, defined as the accumulated differences between immediate reward and average reward, is used to steer the improvement process in the right direction. The proposed learning scheme is applied to control a pendulum system for tracking a desired trajectory to demonstrate its adaptability and robustness. Through reinforcement signals from the environment, the system takes the appropriate action to drive an unknown dynamic system to track desired outputs in a few learning cycles. Comparisons are made between the proposed model-free method, a connectionist adaptive heuristic critic, and an advanced method of Dyna-Q learning in the experiments of labyrinth exploration. The proposed method outperforms its counterparts in terms of elapsed time and convergence rate. PMID:24808427
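
    The "relative value" used for planning here is simply the running sum of (immediate reward minus average reward) for each state. A minimal sketch of that bookkeeping, separate from the actor-critic details (the step size and data structures are illustrative assumptions):

```python
def update_relative_value(rel_v, avg_r, state, reward, beta=0.05):
    """Track the average reward and a state's relative value, defined as the
    accumulated differences between immediate reward and average reward."""
    avg_r += beta * (reward - avg_r)                        # running average-reward estimate
    rel_v[state] = rel_v.get(state, 0.0) + (reward - avg_r)
    return rel_v, avg_r
```

    States with persistently above-average reward accumulate positive relative value, which the planning phase can then use to steer policy improvement without a full world model.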

  2. Model-free Estimation of Recent Genetic Relatedness

    Science.gov (United States)

    Conomos, Matthew P.; Reiner, Alexander P.; Weir, Bruce S.; Thornton, Timothy A.

    2016-01-01

    Genealogical inference from genetic data is essential for a variety of applications in human genetics. In genome-wide and sequencing association studies, for example, accurate inference on both recent genetic relatedness, such as family structure, and more distant genetic relatedness, such as population structure, is necessary for protection against spurious associations. Distinguishing familial relatedness from population structure with genotype data, however, is difficult because both manifest as genetic similarity through the sharing of alleles. Existing approaches for inference on recent genetic relatedness have limitations in the presence of population structure, where they either (1) make strong and simplifying assumptions about population structure, which are often untenable, or (2) require correct specification of and appropriate reference population panels for the ancestries in the sample, which might be unknown or not well defined. Here, we propose PC-Relate, a model-free approach for estimating commonly used measures of recent genetic relatedness, such as kinship coefficients and IBD sharing probabilities, in the presence of unspecified structure. PC-Relate uses principal components calculated from genome-screen data to partition genetic correlations among sampled individuals due to the sharing of recent ancestors and more distant common ancestry into two separate components, without requiring specification of the ancestral populations or reference population panels. In simulation studies with population structure, including admixture, we demonstrate that PC-Relate provides accurate estimates of genetic relatedness and improved relationship classification over widely used approaches. We further demonstrate the utility of PC-Relate in applications to three ancestrally diverse samples that vary in both size and genealogical complexity. PMID:26748516

  3. Revisiting some practical issues in the implementation of model-free control

    OpenAIRE

    Fliess, Michel; Join, Cédric; Riachy, Samer

    2011-01-01

    International audience; This paper simplifies several aspects of the practical implementation of the newly introduced model-free control and of the corresponding "intelligent" PID controllers (M. Fliess, C. Join, "Model-free control and intelligent PID controllers: towards a possible trivialization of nonlinear control?," 15th IFAC Symp. System Identif, Saint-Malo, 2009. URL: http://hal.inria.fr/inria-00372325/en/). Four examples with their computer simulations permit to test our techniques.

  4. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  5. High-Frequency and Model-Free Volatility Estimators

    OpenAIRE

    Robert Ślepaczuk; Grzegorz Zakrzewski

    2009-01-01

    This paper focuses on volatility of financial markets, which is one of the most important issues in finance, especially with regard to modeling high-frequency data. Risk management, asset pricing and option valuation techniques are the areas where the concept of volatility estimators (consistent, unbiased and the most efficient) is of crucial concern. Our intention was to find the best estimator of true volatility taking into account the latest investigations in finance literature. Basing on ...

  6. Thermal characterization and model free kinetics of aged epoxies and foams using TGA and DSC methods.

    Energy Technology Data Exchange (ETDEWEB)

    Cordaro, Joseph Gabriel; Kruizenga, Alan Michael; Nissen, April

    2013-10-01

    Two classes of materials, poly(methylene diphenyl diisocyanate) or PMDI foam, and cross-linked epoxy resins, were characterized using thermal gravimetric analysis (TGA) and differential scanning calorimetry (DSC), to help understand the effects of aging and "bake-out". The materials were evaluated for mass loss and the onset of decomposition. In some experiments, volatile materials released during heating were analyzed via mass spectroscopy. In all, over twenty materials were evaluated to compare the mass loss and onset temperature for decomposition. Model free kinetic (MFK) measurements, acquired using variable heating rate TGA experiments, were used to calculate the apparent activation energy of thermal decomposition. From these compiled data the effects of aging, bake-out, and sample history on the thermal stability of materials were compared. No significant differences between aged and unaged materials were detected. Bake-out did slightly affect the onset temperature of decomposition but only at the highest bake-out temperatures. Finally, some recommendations for future handling are made.

  7. Isothermal Kinetics of the Pentlandite Exsolution from mss/Pyrrhotite Using Model-Free Method

    Institute of Scientific and Technical Information of China (English)

    WANG Haipeng

    2006-01-01

    The pentlandite exsolution from monosulfide solid solution (mss)/pyrrhotite is a complex multi-step process, including nucleation, new phase growth and atomic diffusion, and lamellae coarsening. Some of these steps occur in sequence, others simultaneously. These make its kinetic analysis difficult, as the mechanisms cannot be elucidated in detail. In mineral reactions of this type, the true functional form of the reaction model is almost never known, and the Arrhenius parameters determined by the classic Avrami method are skewed to compensate for errors in the model. The model-free kinetics allows a universal determination of activation energy. A kinetic study of pentlandite exsolution from mss/pyrrhotite was performed over the temperature range 200 to 300°C. For mss/pyrrhotite with bulk composition (Fe0.77Ni0.19)S, the activation energy was found to vary during the course of the solid reaction with the extent of reaction. The surrounding environment of reactant atoms affects the atoms' activity and more or less accounts for the changes in activation energy Ea.

  8. Model-Free Adaptive Fuzzy Sliding Mode Controller Optimized by Particle Swarm for Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Amin Jalali

    2013-05-01

    Full Text Available The main purpose of this paper is to design a suitable control scheme that confronts the uncertainties in a robot. The sliding mode controller (SMC) is one of the most important and powerful nonlinear robust controllers and has been applied to many nonlinear systems. However, this controller has some intrinsic drawbacks, namely the chattering phenomenon, the equivalent dynamic formulation, and sensitivity to noise. This paper focuses on applying artificial intelligence integrated with sliding mode control theory. The proposed adaptive fuzzy sliding mode controller optimized by the particle swarm algorithm (AFSMC-PSO) is a Mamdani error-based fuzzy logic controller (FLS) with 7 rules integrated with a sliding mode framework; it provides the adaptation needed to eliminate the high-frequency oscillation (chattering) and to adjust the linear sliding surface slope in the presence of many different disturbances, and the best coefficients for the sliding surface were found by offline tuning with Particle Swarm Optimization (PSO). Utilizing another fuzzy logic controller to replace the equivalent dynamic part is the main idea behind making the controller model-free, so that it compensates for the unknown system dynamic parameters and obtains the desired control performance without exact information about the mathematical formulation of the model.

  9. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one insures that a product works as intended, and contains no flaws.

  10. ANALYSIS-BASED SPARSE RECONSTRUCTION WITH SYNTHESIS-BASED SOLVERS

    OpenAIRE

    Cleju, Nicolae; Jafari, Maria,; Plumbley, Mark D.

    2012-01-01

    Analysis based reconstruction has recently been introduced as an alternative to the well-known synthesis sparsity model used in a variety of signal processing areas. In this paper we convert the analysis exact-sparse reconstruction problem to an equivalent synthesis recovery problem with a set of additional constraints. We are therefore able to use existing synthesis-based algorithms for analysis-based exact-sparse recovery. We call this the Analysis-By-Synthesis (ABS) approach. We evaluate o...

  11. Feature-based sentiment analysis with ontologies

    OpenAIRE

    Taner, Berk

    2011-01-01

    Sentiment analysis is a topic that many researchers work on. In recent years, new research directions under sentiment analysis appeared. Feature-based sentiment analysis is one such topic that deals not only with finding sentiment in a sentence but providing a more detailed analysis on a given domain. In the beginning researchers focused on commercial products and manually generated list of features for a product. Then they tried to generate a feature-based approach to attach sentiments to th...

  12. Pareto analysis based on records

    CERN Document Server

    Doostparast, M

    2012-01-01

    Estimation of the parameters of an exponential distribution based on record data has been treated by Samaniego and Whitaker (1986) and Doostparast (2009). Recently, Doostparast and Balakrishnan (2011) obtained optimal confidence intervals as well as uniformly most powerful tests for one- and two-sided hypotheses concerning location and scale parameters based on record data from a two-parameter exponential model. In this paper, we derive optimal statistical procedures including point and interval estimation as well as most powerful tests based on record data from a two-parameter Pareto model. For illustrative purposes, a data set on the annual wages of a sample of production-line workers in a large industrial firm is analyzed using the proposed procedures.

  13. Model-free control and intelligent PID controllers: towards a possible trivialization of nonlinear control?

    OpenAIRE

    Fliess, Michel; Join, Cédric

    2009-01-01

    This communication is a slightly modified and updated version of a paper by the same authors (Commande sans modèle et commande à modèle restreint, e-STA, vol. 5 (n° 4), pp. 1-23, 2008. Available at http://hal.inria.fr/inria-00288107/en/), which is written in French.; International audience; We are introducing a model-free control and a control with a restricted model for finite-dimensional complex systems. This control design may be viewed as a contribution to ``intelligent'' PID controllers,...
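
    For context, the model-free control of Fliess and Join replaces the unknown dynamics with an "ultra-local" model that is continuously re-estimated from the measured input and output; the standard first-order form and the resulting intelligent PI controller are sketched below, in the notation usually used for this method rather than anything quoted from this communication.

```latex
% Ultra-local model and intelligent PI ("i-PI") controller
\dot{y}(t) = F(t) + \alpha\, u(t),
\qquad
u(t) = -\frac{\hat{F}(t) - \dot{y}^{*}(t) + K_P\, e(t) + K_I \int e\, dt}{\alpha}
```

    Here y* is the reference trajectory, e = y - y*, alpha is a constant chosen by the practitioner, and F-hat is an online estimate of F obtained from recent input-output measurements, so no structural model of the plant is required.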

  14. A Model-Free No-arbitrage Price Bound for Variance Options

    Energy Technology Data Exchange (ETDEWEB)

    Bonnans, J. Frederic, E-mail: frederic.bonnans@inria.fr [Ecole Polytechnique, INRIA-Saclay (France); Tan Xiaolu, E-mail: xiaolu.tan@polytechnique.edu [Ecole Polytechnique, CMAP (France)

    2013-08-01

    We suggest a numerical approximation for an optimization problem, motivated by its applications in finance to find the model-free no-arbitrage bound of variance options given the marginal distributions of the underlying asset. A first approximation restricts the computation to a bounded domain. Then we propose a gradient projection algorithm together with the finite difference scheme to solve the optimization problem. We prove the general convergence, and derive some convergence rate estimates. Finally, we give some numerical examples to test the efficiency of the algorithm.

  15. Improved Frechet bounds and model-free pricing of multi-asset options

    CERN Document Server

    Tankov, Peter

    2010-01-01

    We compute the improved bounds on the copula of a bivariate random vector when partial information is available, such as the values of the copula on the subset of $[0,1]^2$, or the value of a functional of the copula, monotone with respect to the concordance order. These results are then used to compute model-free bounds on the prices of two-asset options which make use of extra information about the dependence structure, such as the price of another two-asset option.
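
    The classical bounds being sharpened in this record are the Fréchet-Hoeffding bounds; for a bivariate copula they read as follows (a standard result, stated only for context):

```latex
\max(u + v - 1,\, 0) \;\le\; C(u, v) \;\le\; \min(u, v),
\qquad (u, v) \in [0,1]^{2}
```

    Extra information, such as known copula values on a subset of [0,1]^2 or the price of another two-asset option, tightens both bounds, which in turn narrows the model-free price interval for the option of interest.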

  16. ROAn, a ROOT based Analysis Framework

    CERN Document Server

    Lauf, Thomas

    2013-01-01

    The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform data analysis on data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensors developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT's features like run-time type information and reflection. ROAn provides an analysis program which allows to perform configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps which produce analysis results. The dependency between results and thus the order of calculation is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and...

  17. TEXTURE ANALYSIS BASED IRIS RECOGNITION

    OpenAIRE

    GÜRKAN, Güray; AKAN, Aydın

    2012-01-01

    In this paper, we present a new method for personal identification, based on iris patterns. The method composed of iris image acquisition, image preprocessing, feature extraction and finally decision stages. Normalized iris images are vertically log-sampled and filtered by circular symmetric Gabor filters. The output of filters are windowed and mean absolute deviation of pixels in the window are calculated as the feature vectors. The proposed  method has the desired properties of an iris reco...

  18. A unified model-free controller for switching minimum phase, non-minimum phase and time-delay systems

    CERN Document Server

    Michel, Loïc

    2012-01-01

    This preliminary work presents a simple derivation of the standard model-free control in order to control switching minimum phase, non-minimum phase and time-delay systems. The robustness of the proposed method is studied in simulation.

  19. Excel-Based Business Analysis

    CERN Document Server

    Anari, Ali

    2012-01-01

    ai"The trend is your friend"is a practical principle often used by business managers, who seek to forecast future sales, expenditures, and profitability in order to make production and other operational decisions. The problem is how best to identify and discover business trends and utilize trend information for attaining objectives of firms.This book contains an Excel-based solution to this problem, applying principles of the authors' "profit system model" of the firm that enables forecasts of trends in sales, expenditures, profits and other business variables. The program,

  20. Discrete-time dynamic graphical games:model-free reinforcement learning solution

    Institute of Scientific and Technical Information of China (English)

    Mohammed I ABOUHEAF; Frank L LEWIS; Magdi S MAHMOUD; Dariusz G MIKULSKI

    2015-01-01

    This paper introduces a model-free reinforcement learning technique that is used to solve a class of dynamic games known as dynamic graphical games. The graphical game results from multi-agent dynamical systems, where pinning control is used to make all the agents synchronize to the state of a command generator or a leader agent. Novel coupled Bellman equations and Hamiltonian functions are developed for the dynamic graphical games. The Hamiltonian mechanics are used to derive the necessary conditions for optimality. The solution for the dynamic graphical game is given in terms of the solution to a set of coupled Hamilton-Jacobi-Bellman equations developed herein. Nash equilibrium solution for the graphical game is given in terms of the solution to the underlying coupled Hamilton-Jacobi-Bellman equations. An online model-free policy iteration algorithm is developed to learn the Nash solution for the dynamic graphical game. This algorithm does not require any knowledge of the agents’ dynamics. A proof of convergence for this multi-agent learning algorithm is given under mild assumption about the inter-connectivity properties of the graph. A gradient descent technique with critic network structures is used to implement the policy iteration algorithm to solve the graphical game online in real-time.

  1. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  3. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    The methods applicable in the reliability analysis of software based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. The check list type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as the software fault tree analysis. The safety analysis based on the Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that the recent software reliability models do not produce the estimates needed in PSA directly. As a result from the study some recommendations and conclusions are drawn. The need of formal methods in the analysis and development of software based systems, the applicability of qualitative reliability engineering methods in connection to PSA and the need to make more precise the requirements for software based systems and their analyses in the regulatory guides should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)

  4. What Now? Some Brief Reflections on Model-Free Data Analysis

    OpenAIRE

    Richard Berk

    2009-01-01

    David Freedman’s critique of causal modeling in the social and biomedical sciences was fundamental. In his view, the enterprise was misguided, and there was no technical fix. Far too often, there was a disconnect between what the statistical methods required and the substantive information that could be brought to bear. In this paper, I briefly consider some alternatives to causal modeling assuming that David Freedman’s perspective on modeling is correct. In addition to randomized experiments...

  5. Data-Driven Control for Interlinked AC/DC Microgrids via Model-Free Adaptive Control and Dual-Droop Control

    DEFF Research Database (Denmark)

    Zhang, Huaguang; Zhou, Jianguo; Sun, Qiuye;

    2016-01-01

    The detailed system architecture and proposed control strategies are presented in the paper: the primary outer-loop dual-droop control method along with secondary control; the inner-loop data-driven model-free adaptive voltage control. Using the proposed scheme, the interlinking converter, just like the hierarchically controlled DG units, will have the ability to regulate and restore the dc terminal voltage and ac frequency. Moreover, the design of the controller is based only on input/output (I/O) measurement data rather than on a model, and the system stability can be guaranteed by the Lyapunov method.

  6. Thermogravimetric and model-free kinetic studies on CO2 gasification of low-quality, high-sulphur Indian coals

    Indian Academy of Sciences (India)

    Tonkeswar Das; Ananya Saikia; Banashree Mahanta; Rahul Choudhury; Binoy K Saikia

    2016-10-01

    Coal gasification with CO2 has emerged as a cleaner and more efficient way for the production of energy, and it offers the advantages of CO2 mitigation policies through simultaneous CO2 sequestration. In the present investigation, a feasibility study on the gasification of three low-quality, high-sulphur coals from the north-eastern region (NER) of India in a CO2 atmosphere using thermogravimetric analysis (TGA-DTA) has been made in order to have a better understanding of the physical and chemical characteristics of the coal gasification process. Model-free kinetics was applied to determine the activation energies (E) and pre-exponential factors (A) of the CO2 gasification process of the coals. Multivariate nonlinear regression analyses were performed to find out the formal mechanisms, kinetic model, and the corresponding kinetic triplets. The results revealed that coal gasification with CO2 mainly occurs in the temperature range of 800-1400 °C, with a maximum at around 1100 °C. The reaction mechanisms responsible for CO2 gasification of the coals were observed to be of the 'nth order with autocatalysis (CnB)' and 'nth order (Fn)' type. The activation energy of the CO2 gasification was found to be in the range 129.07-146.81 kJ/mol.

  7. Analysis of Enhanced Associativity Based Routing Protocol

    Directory of Open Access Journals (Sweden)

    Said A. Shaar

    2006-01-01

    Full Text Available This study introduces an analysis of the performance of the Enhanced Associativity Based Routing (EABR) protocol based on two factors: operation complexity (OC) and communication complexity (CC). OC can be defined as the number of steps required in performing a protocol operation, while CC can be defined as the number of messages exchanged in performing a protocol operation[1]. The values represent the worst-case analysis. The EABR has been analyzed based on CC and OC and the results have been compared with another routing technique called ABR. The results have shown that EABR can perform better than ABR in many circumstances during route reconstruction.

  8. Analysis of a Chaotic Memristor Based Oscillator

    Directory of Open Access Journals (Sweden)

    F. Setoudeh

    2014-01-01

    Full Text Available A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and also a new chaos analysis methodology based on the energy distribution is presented using the Discrete Wavelet Transform (DWT). Then, using Advance Design System (ADS) software, implementation of the chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of the paper.

  9. Analysis of a Chaotic Memristor Based Oscillator

    OpenAIRE

    F. Setoudeh; Khaki Sedigh, A.; Dousti, M

    2014-01-01

    A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and also a new chaos analysis methodology based on the energy distribution is presented using the Discrete Wavelet Transform (DWT). Then, using Advance Design System (ADS) software, implementation of chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of t...

  10. Network Analysis of the Shanghai Stock Exchange Based on Partial Mutual Information

    Directory of Open Access Journals (Sweden)

    Tao You

    2015-06-01

    Full Text Available Analyzing social systems, particularly financial markets, using a complex network approach has become one of the most popular fields within econophysics. A similar trend is currently appearing within the econometrics and finance communities, as well. In this study, we present a state-of-the-art method for analyzing the structure and risk within stock markets, treating them as complex networks using model-free, nonlinear dependency measures based on information theory. This study is the first network analysis of the stock market in Shanghai using a nonlinear network methodology. Further, it is often assumed that markets outside the United States and Western Europe are inherently riskier. We find that the Chinese stock market is not structurally risky, contradicting this popular opinion. We use partial mutual information to create filtered networks representing the Shanghai stock exchange, comparing them to networks based on Pearson's correlation. Consequently, we discuss the structure and characteristics of both the presented methods and the Shanghai stock exchange. This paper provides an insight into the cutting edge methodology designed for analyzing complex financial networks, as well as analyzing the structure of the market in Shanghai and, as such, is of interest to both researchers and financial analysts.
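
    Partial mutual information, as used in this study, is essentially the mutual information between two return series conditioned on the remaining series; a standard way to write the conditional form for discrete variables is shown below (the notation is ours, included only as context):

```latex
I(X;Y \mid Z) \;=\; \sum_{x,y,z} p(x,y,z)\,
\log \frac{p(x,y \mid z)}{p(x \mid z)\, p(y \mid z)}
```

    This quantity is zero exactly when X and Y are conditionally independent given Z, and unlike Pearson correlation it captures nonlinear dependence, which is what motivates its use for filtering the network.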

  11. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    B Yegnanarayana; Suryakanth V Gangashetty

    2011-10-01

    Speech analysis is traditionally performed using short-time analysis to extract features in time and frequency domains. The window size for the analysis is fixed somewhat arbitrarily, mainly to account for the time varying vocal tract system during production. However, speech in its primary mode of excitation is produced due to impulse-like excitation in each glottal cycle. Anchoring the speech analysis around the glottal closure instants (epochs) yields significant benefits for speech analysis. Epoch-based analysis of speech helps not only to segment the speech signals based on speech production characteristics, but also helps in accurate analysis of speech. It enables extraction of important acoustic-phonetic features such as glottal vibrations, formants, instantaneous fundamental frequency, etc. Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, and methods to extract the epoch information are reviewed. Applications of epoch extraction for some speech applications are demonstrated.

  12. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge Emil Borch Laurs; Nielsen, Mads; Lo, Pechin Chien Pau;

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based on...... subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density. The...

  13. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences organized by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and we categorized them into four classes according to the problems addressed by them. The majority of the research papers (24 papers) focused on quality associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas focused on by existing authors and their gaps, untouched areas of cloud based development can be discovered for future research works.

  14. Polyphase Order Analysis Based on Convolutional Approach

    Directory of Open Access Journals (Sweden)

    M. Drutarovsky

    1999-06-01

    The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational speed. The paper presents an optimised algorithm for polyphase order analysis based on a non-power-of-two DFT algorithm efficiently implemented by the chirp FFT algorithm. The proposed algorithm decreases the complexity of the digital resampling algorithm, which is the most complex part of the complete spectral order algorithm.
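
    The step the abstract singles out, resampling the vibration signal synchronously with the shaft angle before taking a spectrum, can be illustrated with plain interpolation and an ordinary FFT. The sketch below is only that illustration; it does not reproduce the chirp-FFT optimisation, and the sampling rate, run-up profile and synthetic signal are assumptions.

```python
import numpy as np

def order_spectrum(vibration, t, shaft_angle, samples_per_rev=64):
    """Resample a vibration signal onto a uniform shaft-angle grid
    (order tracking) and return the magnitude spectrum versus order."""
    revs = shaft_angle[-1] / (2 * np.pi)
    n_out = int(revs * samples_per_rev)
    uniform_angle = np.linspace(0, shaft_angle[-1], n_out, endpoint=False)
    # time instants at which the shaft reaches the uniform angle samples
    t_uniform = np.interp(uniform_angle, shaft_angle, t)
    resampled = np.interp(t_uniform, t, vibration)
    spec = np.abs(np.fft.rfft(resampled * np.hanning(n_out))) / n_out
    orders = np.fft.rfftfreq(n_out, d=1.0 / samples_per_rev)
    return orders, spec

# Toy run-up: the speed ramps from 10 Hz to 40 Hz over 4 s; the component
# locked to the 2nd shaft order sweeps in hertz but stays fixed in order.
fs, T = 8192, 4.0
t = np.arange(0, T, 1 / fs)
speed_hz = 10 + 7.5 * t
shaft_angle = 2 * np.pi * np.cumsum(speed_hz) / fs
vibration = np.sin(2 * shaft_angle) + 0.2 * np.random.default_rng(1).normal(size=t.size)
orders, spec = order_spectrum(vibration, t, shaft_angle)
print(orders[np.argmax(spec[1:]) + 1])   # expected to be close to order 2
```

    Because the component is locked to the shaft, it stays at order 2 in the angle-resampled spectrum even though its frequency in hertz sweeps during the run-up, which is exactly the smearing problem that order analysis avoids.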

  15. Security Analysis of Discrete Logarithm Based Cryptosystems

    Institute of Scientific and Technical Information of China (English)

    WANG Yuzhu; LIAO Xiaofeng

    2006-01-01

    Discrete logarithm based cryptosystems have subtle problems that make the schemes vulnerable. This paper gives a comprehensive listing of security issues in such systems and analyzes three classes of attacks, based respectively on the mathematical structure of the group used in the schemes, the information disclosed about the subgroup, and implementation details. The analysis will, in turn, allow us to motivate protocol design and implementation decisions.

  16. Abstraction based Analysis and Arbiter Synthesis

    DEFF Research Database (Denmark)

    Ernits, Juhan-Peep; Yi, Wang

    2004-01-01

    The work focuses on the analysis of an example of synchronous systems containing FIFO buffers, registers and memory interconnected by several private and shared busses. The example used in this work is based on a Terma radar system memory interface case study from the IST AMETIST project....

  17. Ontology-Based Analysis of Microarray Data.

    Science.gov (United States)

    Giuseppe, Agapito; Milano, Marianna

    2016-01-01

    The importance of semantic-based methods and algorithms for the analysis and management of biological data is growing for two main reasons. From a biological side, the knowledge contained in ontologies is more and more accurate and complete; from a computational side, recent algorithms are using such knowledge in a valuable way. Here we focus on semantic-based management and analysis of protein interaction networks, referring to all the approaches to analysis of protein-protein interaction data that use knowledge encoded in biological ontologies. Semantic approaches for studying high-throughput data have been largely used in the past to mine genomic and expression data. Recently, the emergence of network approaches for investigating molecular machineries has stimulated, in a parallel way, the introduction of semantic-based techniques for the analysis and management of network data. The application of these computational approaches to the study of microarray data can broaden their application scenario and simultaneously help the understanding of disease development and progress.

  18. Kinetic modelling of RDF pyrolysis: Model-fitting and model-free approaches.

    Science.gov (United States)

    Çepelioğullar, Özge; Haykırı-Açma, Hanzade; Yaman, Serdar

    2016-02-01

    In this study, refuse derived fuel (RDF) was selected as solid fuel and it was pyrolyzed in a thermal analyzer from room temperature to 900°C at heating rates of 5, 10, 20, and 50°C/min in N2 atmosphere. The obtained thermal data were used to calculate the kinetic parameters using the Coats-Redfern, Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS) methods. As a result of the Coats-Redfern model, the decomposition process was assumed to consist of four independent reactions with different reaction orders. On the other hand, the model-free methods demonstrated that the activation energy trend had similarities for the reaction progresses of 0.1, 0.2-0.7 and 0.8-0.9. The average activation energies were found to be between 73-161 kJ/mol, and it is possible to say that the FWO and KAS models produced results closer to the average activation energies than the Friedman model. Experimental studies showed that RDF may be a sustainable and promising feedstock for alternative processes in terms of waste management strategies.

  19. Kinetic modelling of RDF pyrolysis: Model-fitting and model-free approaches.

    Science.gov (United States)

    Çepelioğullar, Özge; Haykırı-Açma, Hanzade; Yaman, Serdar

    2016-02-01

    In this study, refuse derived fuel (RDF) was selected as solid fuel and it was pyrolyzed in a thermal analyzer from room temperature to 900°C at heating rates of 5, 10, 20, and 50°C/min in N2 atmosphere. The obtained thermal data were used to calculate the kinetic parameters using the Coats-Redfern, Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS) methods. As a result of the Coats-Redfern model, the decomposition process was assumed to consist of four independent reactions with different reaction orders. On the other hand, the model-free methods demonstrated that the activation energy trend had similarities for the reaction progresses of 0.1, 0.2-0.7 and 0.8-0.9. The average activation energies were found to be between 73-161 kJ/mol, and it is possible to say that the FWO and KAS models produced results closer to the average activation energies than the Friedman model. Experimental studies showed that RDF may be a sustainable and promising feedstock for alternative processes in terms of waste management strategies.
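
    Of the model-free methods named above, KAS is the easiest to sketch: at a fixed conversion level one regresses ln(β/T²) against 1/T across the heating rates, and the slope gives −Ea/R. The snippet below is a minimal illustration of that regression; the temperatures in the example are made up for demonstration and are not data from the study.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol K)

def kas_activation_energy(T_alpha, heating_rates):
    """Kissinger-Akahira-Sunose estimate of Ea (kJ/mol) at one conversion level.

    T_alpha[i] is the temperature (K) at which the chosen conversion is reached
    under heating rate heating_rates[i]; ln(beta/T^2) is regressed against 1/T
    and the slope equals -Ea/R."""
    x = 1.0 / np.asarray(T_alpha, dtype=float)
    y = np.log(np.asarray(heating_rates, dtype=float) / np.asarray(T_alpha) ** 2)
    slope = np.polyfit(x, y, 1)[0]
    return -slope * R / 1000.0

# Hypothetical temperatures (K) at 50 % conversion for each heating rate.
betas = np.array([5.0, 10.0, 20.0, 50.0])            # heating rates, K/min
T_alpha = np.array([552.0, 563.0, 575.0, 590.0])     # illustrative values only
print(round(kas_activation_energy(T_alpha, betas), 1))
```

    For these made-up temperatures the fit returns roughly 154 kJ/mol, which happens to fall inside the 73-161 kJ/mol range quoted above; the FWO variant instead regresses ln β against 1/T, with a Doyle-approximation correction factor on the slope.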

  20. Examples of model-free implant restorations using Cerec inLab 4.0 software.

    Science.gov (United States)

    Reich, S; Schley, J; Kern, T; Fiedler, K; Wolfart, S

    2012-01-01

    This case report demonstrates two ways to fabricate model-free implant restorations with the Cerec inLab 4.0 software. Because the patient, a woman with a history of periodontal disease, did not wish to have a removable partial denture, implant therapy was planned for the restoration of her edentulous areas 14/15 and 24/25. In addition, the restoration was to provide functional relief of the natural maxillary anterior teeth. The two implants for the first quadrant were planned as single-tooth restorations. Each was designed as a full-contour implant suprastructure using the Cerec Biogeneric abutment design technique. After completing the design phase, each restoration proposal was split into two parts: a zirconia abutment and a lithium disilicate crown. For the restoration of the second quadrant, custom 20-degree-angled abutments were individualized and acquired with the Cerec camera. A block crown was then designed, milled in burn-out acrylic resin, and fabricated from a lithium disilicate glass-ceramic ingot according to the press ceramic technique. Additionally, methods of provisional restoration are discussed.

  1. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and the study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels have been known to correlate poorly. On the other hand, MS-based proteomics tends to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomic profiles. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.

  2. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte-Carlo simulations (MC). Several tools are considered: a ROOT files filter, a JavaScript Multivariable Cross-Filter, a JavaScript ROOT browser and JavaScript Scatter-Matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  3. TEST COVERAGE ANALYSIS BASED ON PROGRAM SLICING

    Institute of Scientific and Technical Information of China (English)

    Chen Zhenqiang; Xu Baowen; Guanjie

    2003-01-01

    Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept of coverage about variables, based on program slicing. By assigning weights to variables according to their importance, the users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained in the letter is bigger than that obtained by a traditional measure, because the coverage about a variable takes only the related code into account.

  4. Quantum entanglement analysis based on abstract interpretation

    OpenAIRE

    Perdrix, Simon

    2008-01-01

    Entanglement is a non local property of quantum states which has no classical counterpart and plays a decisive role in quantum information theory. Several protocols, like the teleportation, are based on quantum entangled states. Moreover, any quantum algorithm which does not create entanglement can be efficiently simulated on a classical computer. The exact role of the entanglement is nevertheless not well understood. Since an exact analysis of entanglement evolution induces an exponential sl...

  5. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    The letter describes an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting up at run-time and running dynamically. An analysis coded in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESⅢ offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either with standard modules or are inherited from modules in BOSS. The interface and its framework have been tested in performing physics analysis. (authors)

  6. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  7. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  8. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-01-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can correspond to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently. The proposed method is based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g. φ minimizing estimation errors). In order to avoid subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  9. Gait correlation analysis based human identification.

    Science.gov (United States)

    Chen, Jinyan

    2014-01-01

    Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint or iris based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.

  10. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint or iris based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.
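
    The pipeline described above (shift the silhouette volume along x, y and t, correlate with the original, take DFT magnitudes, then reduce dimensions) can be compressed into a few lines. The sketch below is a rough stand-in rather than the paper's implementation: the silhouette volume is synthetic, the shift range is arbitrary, and the final PCA step is only indicated by truncation because a real system would fit PCA on a gallery of training sequences.

```python
import numpy as np

def shift_correlation_curve(volume, axis, max_shift=8):
    """Correlate a silhouette volume (T, H, W) with copies of itself shifted
    along one axis; the correlation-versus-shift curve is the raw gait cue."""
    v = volume.astype(float)
    v = (v - v.mean()) / (v.std() + 1e-12)
    return np.array([np.mean(v * np.roll(v, s, axis=axis))
                     for s in range(1, max_shift + 1)])

def gait_signature(volume, n_keep=6):
    """Correlation curves along t, y, x -> DFT magnitudes -> normalisation.

    A real system would project the resulting vectors with PCA fitted on a
    gallery of walkers; truncation to n_keep values only stands in for that."""
    curves = [shift_correlation_curve(volume, axis=a) for a in (0, 1, 2)]
    feats = np.concatenate([np.abs(np.fft.rfft(c)) for c in curves])
    feats /= np.linalg.norm(feats) + 1e-12
    return feats[:n_keep]

# Synthetic "walking" sequence: a rectangular blob swaying horizontally.
T, H, W = 32, 48, 32
seq = np.zeros((T, H, W))
for t in range(T):
    x = 16 + int(6 * np.sin(2 * np.pi * t / 16))
    seq[t, 10:38, x - 3:x + 3] = 1.0
print(gait_signature(seq))
```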

  11. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    Science.gov (United States)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine if the heart rhythm during different sleep stages is characterized by different multifractal properties.
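
    The generalisation is easiest to see in code: the standard DFA fluctuation function is the q = 2 case of the multifractal version, which combines the per-window detrending variances with an exponent q. The sketch below implements that fluctuation function for a single q; a full MF-DFA analysis would repeat it over a range of positive and negative q values (with the usual logarithmic average at q = 0). The series length, scales and the white-noise test signal are assumptions made for the example.

```python
import numpy as np

def dfa_fluctuation(x, scales, q=2.0):
    """(MF-)DFA fluctuation function F_q(s) of a 1-D series x.

    For each scale s the integrated, mean-centred profile is split into
    non-overlapping windows, a linear trend is removed in each window, and
    the residual variances are combined with the exponent q."""
    profile = np.cumsum(x - np.mean(x))
    F = []
    for s in scales:
        n_seg = len(profile) // s
        var = []
        for seg in range(n_seg):
            w = profile[seg * s:(seg + 1) * s]
            t = np.arange(s)
            coef = np.polyfit(t, w, 1)
            var.append(np.mean((w - np.polyval(coef, t)) ** 2))
        var = np.asarray(var)
        F.append(np.mean(var ** (q / 2.0)) ** (1.0 / q))
    return np.asarray(F)

# White noise should give a scaling exponent h(2) close to 0.5.
rng = np.random.default_rng(2)
x = rng.normal(size=2 ** 14)
scales = np.unique(np.logspace(4, 10, 12, base=2).astype(int))
F = dfa_fluctuation(x, scales)
h2 = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(round(h2, 2))   # expected to be near 0.5
```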

  12. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.

  13. Electric Equipment Diagnosis based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Stavitsky Sergey A.

    2016-01-01

    As electrical equipment develops and becomes more complex, precise and intensive diagnosis is necessary. Nowadays there are two basic ways of diagnosis: analog signal processing and digital signal processing. The latter is preferable. The basic methods of digital signal processing (the Fourier transform and the fast Fourier transform) are complemented by one of the modern methods, based on the wavelet transform. This research is dedicated to analyzing the characteristic features and advantages of the wavelet transform. The article shows the ways of using wavelet analysis and the process of converting a test signal. In order to carry out this analysis, the computer software Mathcad was used and a 2D wavelet spectrum for a complex function was created.
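
    The same kind of test-signal experiment can be reproduced outside Mathcad. The sketch below, assuming the PyWavelets package is available, decomposes a synthetic signal containing a short high-frequency burst with a discrete wavelet transform and then computes a continuous wavelet transform for a time-scale picture; the sampling rate, wavelet choices and burst parameters are all illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets, assumed to be installed

# Test signal: a 50 Hz tone with a short high-frequency burst, standing in
# for a diagnostic signal with a localised fault signature.
fs = 1000
t = np.arange(0, 1.0, 1 / fs)
signal = np.sin(2 * np.pi * 50 * t)
signal[400:420] += 0.8 * np.sin(2 * np.pi * 300 * t[400:420])

# Multi-level discrete wavelet decomposition.
coeffs = pywt.wavedec(signal, wavelet="db4", level=4)
for i, c in enumerate(coeffs):
    band = "approximation" if i == 0 else f"detail level {len(coeffs) - i}"
    print(f"{band}: {len(c)} coefficients, energy {np.sum(c ** 2):.2f}")

# A continuous wavelet transform gives a 2-D time-scale picture instead.
scales = np.arange(1, 64)
cwt_coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1 / fs)
print(cwt_coeffs.shape)   # (len(scales), len(signal))
```

    The localised burst concentrates energy in the finer detail levels, which is the property that makes wavelet analysis attractive for detecting short-lived fault signatures that a plain Fourier spectrum would smear over the whole record.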

  14. Arabic Interface Analysis Based on Cultural Markers

    Directory of Open Access Journals (Sweden)

    Mohammadi Akheela Khanum

    2012-01-01

    This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from education, business, and media. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture and that the Hofstede score for Arab countries is partially supported by the website design components examined in this study. Moderate support was also found for long term orientation, for which Hofstede has no score.

  15. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  16. Arabic Interface Analysis Based on Cultural Markers

    CERN Document Server

    Khanum, Mohammadi Akheela; Chaurasia, Mousmi A

    2012-01-01

    This study examines the Arabic interface design elements that are largely influenced by the cultural values. Cultural markers are examined in websites from educational, business, and media. Cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture and that the Hofstede score for Arab countries is partially supported by the website design components examined in this study. Moderate support was also found for the long term orientation, for which Hofstede has no score.

  17. Constructing storyboards based on hierarchical clustering analysis

    Science.gov (United States)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There are growing needs for quick preview of video content, for the purpose of improving the accessibility of video archives as well as reducing network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent use of extracted feature vectors is the key to avoiding a repetition of computationally-intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
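
    The clustering-and-selection step can be sketched in a few lines: cluster the per-frame feature vectors hierarchically and keep, for each cluster, the frame closest to the cluster centroid. The example below uses SciPy's agglomerative clustering on random stand-in features; in the paper the features would come from wavelet coefficients of the frames, and the linkage method and cluster count here are assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def select_keyframes(features, n_keyframes):
    """Pick one representative frame per cluster from an (n_frames, d)
    feature matrix using agglomerative (Ward) clustering."""
    Z = linkage(features, method="ward")
    labels = fcluster(Z, t=n_keyframes, criterion="maxclust")
    keyframes = []
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        centroid = features[idx].mean(axis=0)
        # the frame closest to the cluster centroid represents the cluster
        keyframes.append(idx[np.argmin(np.linalg.norm(features[idx] - centroid, axis=1))])
    return sorted(keyframes)

# Stand-in features: three "shots" with distinct mean feature vectors.
rng = np.random.default_rng(3)
shots = [rng.normal(loc=m, scale=0.1, size=(40, 16)) for m in (0.0, 1.0, 2.0)]
features = np.vstack(shots)
print(select_keyframes(features, n_keyframes=3))   # roughly one index per shot
```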

  18. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and at highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, where the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well known block matching methods (full search, 2D-log, and methods based on the conventional cross-correlation (CC) function or the phase correlation (PC) function) for the application of crowd motion estimation are also presented.

  19. STELLAR LOCI II. A MODEL-FREE ESTIMATE OF THE BINARY FRACTION FOR FIELD FGK STARS

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Haibo; Liu, Xiaowei [Kavli Institute for Astronomy and Astrophysics, Peking University, Beijing 100871 (China); Xiang, Maosheng; Huang, Yang; Chen, Bingqiu [Department of Astronomy, Peking University, Beijing 100871 (China); Wu, Yue [Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China); Hou, Yonghui; Zhang, Yong, E-mail: yuanhb4861@pku.edu.cn, E-mail: x.liu@pku.edu.cn [Nanjing Institute of Astronomical Optics and Technology, National Astronomical Observatories, Chinese Academy of Sciences, Nanjing 210042 (China)

    2015-02-01

    We propose a stellar locus outlier (SLOT) method to determine the binary fraction of main-sequence stars statistically. The method is sensitive to neither the period nor mass ratio distributions of binaries and is able to provide model-free estimates of binary fraction for large numbers of stars of different populations in large survey volumes. We have applied the SLOT method to two samples of stars from the Sloan Digital Sky Survey (SDSS) Stripe 82, constructed by combining the recalibrated SDSS photometric data with the spectroscopic information from the SDSS and LAMOST surveys. For the SDSS spectroscopic sample, we find an average binary fraction for field FGK stars of 41% ± 2%. The fractions decrease toward late spectral types and are 44% ± 5%, 43% ± 3%, 35% ± 5%, and 28% ± 6% for stars with g – i colors in the range 0.3-0.6 mag, 0.6-0.9 mag, 0.9-1.2 mag, and 1.2-1.6 mag, respectively. A modest metallicity dependence is also found. The fraction decreases with increasing metallicity. For stars with [Fe/H] between –0.5 and 0.0 dex, –1.0 and –0.5 dex, –1.5 and –1.0 dex, and –2.0 and –1.5 dex, the inferred binary fractions are 37% ± 3%, 39% ± 3%, 50% ± 9%, and 53% ± 20%, respectively. We have further divided the sample into stars from the thin disk, the thick disk, the transition zone between them, and the halo. The results suggest that the Galactic thin and thick disks have comparable binary fractions, whereas the Galactic halo contains a significantly larger fraction of binaries. Applying the method to the LAMOST spectroscopic sample yields consistent results. Finally, other potential applications and future work with the method are discussed.

  20. Model-free reconstruction of excitatory neuronal connectivity from calcium imaging signals.

    Directory of Open Access Journals (Sweden)

    Olav Stetter

    A systematic assessment of global neural network connectivity through direct electrophysiological assays has remained technically infeasible, even in simpler systems like dissociated neuronal cultures. We introduce an improved algorithmic approach based on Transfer Entropy to reconstruct structural connectivity from network activity monitored through calcium imaging. We focus in this study on the inference of excitatory synaptic links. Based on information theory, our method requires no prior assumptions on the statistics of neuronal firing and neuronal connections. The performance of our algorithm is benchmarked on surrogate time series of calcium fluorescence generated by the simulated dynamics of a network with known ground-truth topology. We find that the functional network topology revealed by Transfer Entropy depends qualitatively on the time-dependent dynamic state of the network (bursting or non-bursting). Thus, by conditioning with respect to the global mean activity, we improve the performance of our method. This allows us to focus the analysis on specific dynamical regimes of the network in which the inferred functional connectivity is shaped by monosynaptic excitatory connections, rather than by collective synchrony. Our method can discriminate between actual causal influences between neurons and spurious non-causal correlations due to light scattering artifacts, which inherently affect the quality of fluorescence imaging. Compared to other reconstruction strategies such as cross-correlation or Granger Causality methods, our method based on improved Transfer Entropy is remarkably more accurate. In particular, it provides a good estimation of the excitatory network clustering coefficient, allowing for discrimination between weakly and strongly clustered topologies. Finally, we demonstrate the applicability of our method to analyses of real recordings of in vitro disinhibited cortical cultures where we suggest that excitatory connections
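
    Transfer entropy itself reduces to a sum over joint histograms once the fluorescence traces are discretised. The sketch below is a plain plug-in estimator with a history length of one; it omits the study's key refinements (conditioning on the global mean activity and handling of light-scattering artifacts), and the bin count and synthetic signals are assumptions.

```python
import numpy as np

def transfer_entropy(source, target, n_bins=3):
    """Plug-in transfer entropy TE(source -> target) in bits, history length 1.

    TE = sum p(y_t+1, y_t, x_t) * log2[ p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) ],
    estimated from joint histograms of the quantile-discretised signals."""
    x = np.digitize(source, np.quantile(source, np.linspace(0, 1, n_bins + 1)[1:-1]))
    y = np.digitize(target, np.quantile(target, np.linspace(0, 1, n_bins + 1)[1:-1]))
    y_next, y_now, x_now = y[1:], y[:-1], x[:-1]
    joint = np.zeros((n_bins, n_bins, n_bins))
    for a, b, c in zip(y_next, y_now, x_now):
        joint[a, b, c] += 1
    joint /= joint.sum()
    p_yx = joint.sum(axis=0)        # p(y_t, x_t)
    p_yy = joint.sum(axis=2)        # p(y_t+1, y_t)
    p_y = joint.sum(axis=(0, 2))    # p(y_t)
    te = 0.0
    for a in range(n_bins):
        for b in range(n_bins):
            for c in range(n_bins):
                if joint[a, b, c] > 0:
                    num = joint[a, b, c] * p_y[b]
                    den = p_yy[a, b] * p_yx[b, c]
                    te += joint[a, b, c] * np.log2(num / den)
    return te

# Toy pair: y is a noisy, delayed copy of x, so TE(x->y) should exceed TE(y->x).
rng = np.random.default_rng(4)
x = rng.normal(size=5000)
y = np.roll(x, 1) + 0.5 * rng.normal(size=5000)
print(round(transfer_entropy(x, y), 3), round(transfer_entropy(y, x), 3))
```

    On the toy pair, TE(x→y) should come out clearly larger than TE(y→x), reflecting the directed coupling; on real calcium traces, the conditioning step described above is what keeps bursting episodes from swamping this asymmetry.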

  1. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series to a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide us rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
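
    The building block of this approach is the visibility graph of a single series segment. The snippet below is a minimal O(n²) construction of the natural visibility graph; linking the graphs of successive segments into the temporal "network of networks" described above is not reproduced, and the random-walk test series is an assumption for the example.

```python
import numpy as np

def natural_visibility_graph(series):
    """Adjacency matrix of the natural visibility graph of a 1-D series.

    Samples (a, y_a) and (b, y_b) are connected if every sample c strictly
    between them lies below the straight line joining them:
        y_c < y_b + (y_a - y_b) * (b - c) / (b - a)"""
    y = np.asarray(series, dtype=float)
    n = len(y)
    adj = np.zeros((n, n), dtype=bool)
    for a in range(n - 1):
        for b in range(a + 1, n):
            cs = np.arange(a + 1, b)
            line = y[b] + (y[a] - y[b]) * (b - cs) / (b - a)
            if np.all(y[cs] < line):
                adj[a, b] = adj[b, a] = True
    return adj

# Toy series: the degree distribution of the visibility graph reflects the
# temporal structure of the segment it was built from.
rng = np.random.default_rng(5)
series = np.cumsum(rng.normal(size=200))           # random-walk segment
adj = natural_visibility_graph(series)
print(adj.sum(axis=1).max(), adj.sum(axis=1).mean())   # max and mean degree
```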

  2. Curvelet based offline analysis of SEM images.

    Directory of Open Access Journals (Sweden)

    Syed Hamad Shirazi

    Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and effort. This paper presents an image processing based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal, in order to avoid false edges. For texture analysis, the proposed method employs a state of the art Curvelet transform followed by segmentation through a combination of entropy filtering, thresholding and mathematical morphology (MM). The quantification is carried out by the application of a box-counting algorithm, for fractal dimension (FD) calculations, with the ultimate goal of measuring parameters like surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known Watershed segmentation algorithm.
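
    The quantification stage is the most self-contained part and can be sketched directly: given a binary segmentation mask, count the boxes that contain foreground at several box sizes and fit the slope of log N(s) versus log s. The example below uses a synthetic filled disc as a stand-in for a segmented particle; the box sizes and image size are assumptions, and the Curvelet/entropy-filter segmentation itself is not reproduced.

```python
import numpy as np

def box_counting_dimension(mask, box_sizes=(2, 4, 8, 16, 32)):
    """Estimate the fractal dimension of a binary mask by box counting:
    count occupied boxes N(s) for several box sizes s and fit
    log N(s) = -D * log s + c."""
    counts = []
    for s in box_sizes:
        h = (mask.shape[0] // s) * s
        w = (mask.shape[1] // s) * s
        blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
        occupied = blocks.any(axis=(1, 3)).sum()
        counts.append(max(occupied, 1))
    slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
    return -slope

# Synthetic "segmented particle": a filled disc on a 256 x 256 mask.
yy, xx = np.mgrid[0:256, 0:256]
disc = (xx - 128) ** 2 + (yy - 128) ** 2 < 80 ** 2
print(round(box_counting_dimension(disc), 2))   # roughly 2 for a filled disc
print(disc.sum())                               # pixel count as a crude area measure
```

    A filled, smooth shape comes out with a dimension near 2 (the exact value depends on the chosen box sizes), while a ragged or fragmented segmentation pushes the estimate lower, which is what makes the fractal dimension useful as a texture descriptor here.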

  3. Glycocapture-based proteomics for secretome analysis.

    Science.gov (United States)

    Lai, Zon W; Nice, Edouard C; Schilling, Oliver

    2013-02-01

    Protein glycosylation represents the most abundant extracellular posttranslational modification in multicellular organisms. These glycoproteins unequivocally comprise the major biomolecules involved in extracellular processes, such as growth factors, signaling proteins for cellular communication, enzymes, and proteases for on- and off-site processing. It is now known that altered protein glycosylation is a hallmark event in many different pathologies. Glycoproteins are found mostly in the so-called secretome, which comprises classically and nonclassically secreted proteins and protein fragments that are released from the cell surface through ectodomain shedding. Due to biological complexity and technical difficulty, comparatively few studies have undertaken an in-depth investigation of cellular secretomes using system-wide approaches. The cellular secretomes are considered to be a valuable source of therapeutic targets and novel biomarkers. It is not surprising that many existing biomarkers, including biomarkers for breast, ovarian, prostate, and colorectal cancers, are glycoproteins. Focused analysis of secreted glycoproteins could thus provide valuable information for early disease diagnosis and surveillance. Furthermore, since most secreted proteins are glycosylated and glycosylation predominantly targets secreted proteins, the glycan/sugar moiety itself can be used as a chemical "handle" for the targeted analysis of cellular secretomes, thereby reducing sample complexity and allowing detection of low abundance proteins in proteomic workflows. This review will focus on various glycoprotein enrichment strategies that facilitate proteomics-based technologies for the quantitative analysis of cell secretomes and cell surface proteomes.

  4. Watermark Resistance Analysis Based On Linear Transformation

    Directory of Open Access Journals (Sweden)

    N.Karthika Devi

    2012-06-01

    Generally, a digital watermark can be embedded in any copyrighted image that is not smaller than the watermark. Watermarking schemes can be classified into two categories: spatial domain approaches and transform domain approaches. Previous works have shown that a transform domain scheme is typically more robust to noise, common image processing, and compression when compared with a spatial domain scheme. Improvements in the performance of watermarking schemes can be obtained by exploiting the characteristics of the human visual system (HVS) in the watermarking process. We propose a linear transformation based watermarking algorithm. The watermark bits are embedded into the cover image to produce the watermarked image. The efficiency of the watermark is checked using pre-defined attacks. Attack resistance analysis is done using BER (bit error rate) calculation. Finally, the quality of the watermarked image can be obtained.

  5. Visual Similarity Based Document Layout Analysis

    Institute of Scientific and Technical Information of China (English)

    Di Wen; Xiao-Qing Ding

    2006-01-01

    In this paper, a visual similarity based document layout analysis (DLA) scheme is proposed, which, by using a clustering strategy, can adaptively deal with documents in different languages, with different layout structures and skew angles. Aiming at a robust and adaptive DLA approach, the authors first manage to find a set of representative filters and statistics to characterize typical texture patterns in document images, through a visual similarity testing process. Texture features are then extracted from these filters and passed into a dynamic clustering procedure, which is called visual similarity clustering. Finally, text contents are located from the clustered results. Benefiting from this scheme, the algorithm demonstrates strong robustness and adaptability in a wide variety of documents, which previous traditional DLA approaches do not possess.

  6. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images and that do not take advantage of its primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To

  7. Interactive analysis of geodata based intelligence

    Science.gov (United States)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data are gathered to understand the problem, and strategies for solving the problem are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map can be the bridge to visualize the data and to provide a model that all stakeholders understand. For the analysis of geodata based intelligence data, software was developed as a working environment that combines geodata with optimized ergonomics, thereby essentially facilitating interaction with the common operational picture (COP). The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result a COP is generated and an interaction suitable for the specific workspace is added. This allows the users to work interactively with the COP, i.e., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings out of the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management are organized.

  8. Operating cost analysis of anaesthesia: Activity based costing (ABC analysis

    Directory of Open Access Journals (Sweden)

    Majstorović Branislava M.

    2011-01-01

    Introduction. Costs of anaesthesiology represent defined measures to determine a precise profile of expenditure estimation for surgical treatment, which is important for the planning of healthcare activities, prices and budget. Objective. In order to determine the actual value of anaesthesiological services, we started with activity based costing (ABC) analysis. Methods. Retrospectively, for 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials and other costs: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure for "each cost object (service or unit)" of the Republican Health-care Insurance, using the summary data of the Departments of Anaesthesia documented in the database of the Clinical Centre of Serbia. Numerical data were estimated and analyzed with the computer programs Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Health-care Insurance. Results. Direct costs showed that 40% of costs were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Health-care Insurance. Conclusion. During surgery, the costs of anaesthesia increase the cost of a patient's surgical treatment by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. Fixed elements of the direct costs provide the possibility of rationalization of resources in anaesthesia.

  9. Temporal expression-based analysis of metabolism.

    Directory of Open Access Journals (Sweden)

    Sara B Collins

    Metabolic flux is frequently rerouted through cellular metabolism in response to dynamic changes in the intra- and extra-cellular environment. Capturing the mechanisms underlying these metabolic transitions in quantitative and predictive models is a prominent challenge in systems biology. Progress in this regard has been made by integrating high-throughput gene expression data into genome-scale stoichiometric models of metabolism. Here, we extend previous approaches to perform a Temporal Expression-based Analysis of Metabolism (TEAM). We apply TEAM to understanding the complex metabolic dynamics of the respiratorily versatile bacterium Shewanella oneidensis grown under aerobic, lactate-limited conditions. TEAM predicts temporal metabolic flux distributions using time-series gene expression data. Increased predictive power is achieved by supplementing these data with a large reference compendium of gene expression, which allows us to take into account the unique character of the distribution of expression of each individual gene. We further propose a straightforward method for studying the sensitivity of TEAM to changes in its fundamental free threshold parameter θ, and reveal that discrete zones of distinct metabolic behavior arise as this parameter is changed. By comparing the qualitative characteristics of these zones to additional experimental data, we are able to constrain the range of θ to a small, well-defined interval. In parallel, the sensitivity analysis reveals the inherently difficult nature of dynamic metabolic flux modeling: small errors early in the simulation propagate to relatively large changes later in the simulation. We expect that handling such "history-dependent" sensitivities will be a major challenge in the future development of dynamic metabolic-modeling techniques.

  10. Temporal expression-based analysis of metabolism.

    Science.gov (United States)

    Collins, Sara B; Reznik, Ed; Segrè, Daniel

    2012-01-01

    Metabolic flux is frequently rerouted through cellular metabolism in response to dynamic changes in the intra- and extra-cellular environment. Capturing the mechanisms underlying these metabolic transitions in quantitative and predictive models is a prominent challenge in systems biology. Progress in this regard has been made by integrating high-throughput gene expression data into genome-scale stoichiometric models of metabolism. Here, we extend previous approaches to perform a Temporal Expression-based Analysis of Metabolism (TEAM). We apply TEAM to understanding the complex metabolic dynamics of the respiratorily versatile bacterium Shewanella oneidensis grown under aerobic, lactate-limited conditions. TEAM predicts temporal metabolic flux distributions using time-series gene expression data. Increased predictive power is achieved by supplementing these data with a large reference compendium of gene expression, which allows us to take into account the unique character of the distribution of expression of each individual gene. We further propose a straightforward method for studying the sensitivity of TEAM to changes in its fundamental free threshold parameter θ, and reveal that discrete zones of distinct metabolic behavior arise as this parameter is changed. By comparing the qualitative characteristics of these zones to additional experimental data, we are able to constrain the range of θ to a small, well-defined interval. In parallel, the sensitivity analysis reveals the inherently difficult nature of dynamic metabolic flux modeling: small errors early in the simulation propagate to relatively large changes later in the simulation. We expect that handling such "history-dependent" sensitivities will be a major challenge in the future development of dynamic metabolic-modeling techniques.

  11. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has become a severe threat to interconnected computer systems for decades and has caused billions of dollars in damages each year. A large volume of new malware samples are discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regards to this new analysis framework, a series of analysis techniques for automatic malware analy

  12. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon;

    2014-01-01

    The integration of species distributions and evolutionary relationships is one of the most rapidly moving research fields today and has led to considerable advances in our understanding of the processes underlying biogeographical patterns. Here, we develop a set of metrics, the specific overrepre...... with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...... of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented

  13. A Translation Case Analysis Based on Skopos Theory

    Institute of Scientific and Technical Information of China (English)

    盖孟姣

    2015-01-01

    This paper is a translation case analysis based on Skopos Theory. It chooses President Xi's New Year congratulations of 2015 as the analysis text and gives the case analysis, focusing on translating the text based on Skopos Theory.

  14. Polyphase Order Analysis Based on Convolutional Approach

    OpenAIRE

    M. Drutarovsky

    1999-01-01

    The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational ...

  15. ANALYSIS OF CIRCUIT TOLERANCE BASED ON RANDOM SET THEORY

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Monte Carlo analysis has been an accepted method for circuit tolerance analysis, but its heavy computational complexity has always prevented its applications. Based on random set theory, this paper presents a simple and flexible tolerance analysis method to estimate circuit yield. It is an alternative to Monte Carlo analysis, but reduces the number of calculations dramatically.
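
    As a point of reference for the Monte Carlo baseline mentioned above, the sketch below estimates the yield of a toy resistive divider whose resistors carry uniform tolerances; the component values, tolerance, specification limits and trial count are all made-up assumptions, and the random-set method of the paper itself is not reproduced.

```python
import numpy as np

def monte_carlo_yield(n_trials=100_000, vin=5.0, tol=0.05, seed=0):
    """Estimate the yield of a resistive divider Vout = Vin * R2 / (R1 + R2)
    with 5 % resistor tolerances against a +/-3 % output specification."""
    rng = np.random.default_rng(seed)
    r1 = 1000.0 * (1 + rng.uniform(-tol, tol, n_trials))
    r2 = 2000.0 * (1 + rng.uniform(-tol, tol, n_trials))
    vout = vin * r2 / (r1 + r2)
    nominal = vin * 2000.0 / 3000.0
    in_spec = np.abs(vout - nominal) <= 0.03 * nominal
    return in_spec.mean()

print(monte_carlo_yield())   # fraction of circuits meeting the specification
```

    The random-set approach described in the abstract aims to estimate the same yield figure with dramatically fewer calculations than the hundred thousand circuit evaluations used in this baseline.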

  16. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which will support conducting requirements analysis for ERP projects.

  17. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analyses of caches which are hard...

  18. Web Based Distributed Coastal Image Analysis System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops Web based distributed image analysis system processing the Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  19. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    Canonical correlation analysis (CCA) is an established multi-variate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...

  20. Design Intelligent Model-free Hybrid Guidance Controller for Three Dimension Motor

    Directory of Open Access Journals (Sweden)

    Abdol Majid Mirshekaran

    2014-10-01

    The minimum rule base Proportional Integral Derivative (PID) fuzzy hybrid guidance controller for a three dimensions spherical motor is presented in this research. A three dimensions spherical motor is well equipped with conventional control techniques and, in particular, various PID controllers which demonstrate a good performance and successfully solve different guidance problems. Guidance control in a three dimensions spherical motor is performed by the PID controllers producing the control signals which are applied to the system's torque. The necessary reference inputs for a PID controller are usually supplied by the system's sensors based on different data. The popularity of the PID fuzzy hybrid guidance controller can be attributed to its robust performance in a wide range of operating conditions and partly to its functional simplicity. The PID methodology has three inputs; if any input is described with seven linguistic values and any rule has three conditions, we will need 343 rules. It is too much work to write 343 rules. In this research the PID-like fuzzy controller is constructed as a parallel structure of a PD-like fuzzy controller and a conventional PI controller to obtain the minimum rule base. A linear-type PID controller is used to modify PID fuzzy logic theory to design the hybrid guidance methodology. This research aims to reduce or eliminate the fuzzy and conventional PID controller problems based on minimum rule base fuzzy logic theory, modified by the PID method, to control the spherical motor system and to test the quality of process control in the simulation environment of the MATLAB/SIMULINK simulator.

  1. Pathway-Based Functional Analysis of Metagenomes

    Science.gov (United States)

    Bercovici, Sivan; Sharon, Itai; Pinter, Ron Y.; Shlomi, Tomer

    Metagenomic data enables the study of microbes and viruses through their DNA as retrieved directly from the environment in which they live. Functional analysis of metagenomes explores the abundance of gene families, pathways, and systems, rather than their taxonomy. Through such analysis researchers are able to identify those functional capabilities most important to organisms in the examined environment. Recently, a statistical framework for the functional analysis of metagenomes was described that focuses on gene families. Here we describe two pathway level computational models for functional analysis that take into account important, yet unaddressed issues such as pathway size, gene length and overlap in gene content among pathways. We test our models over carefully designed simulated data and propose novel approaches for performance evaluation. Our models significantly improve over current approach with respect to pathway ranking and the computations of relative abundance of pathways in environments.

  2. Analysis of Task-based Syllabus

    Institute of Scientific and Technical Information of China (English)

    马进胜

    2011-01-01

    Task-based language teaching is very popular in modern English teaching. It is based on the Task-based Syllabus, which focuses on the learners' communicative competence and stresses learning by doing. Starting from the theoretical assumptions and definitions of the task, the paper analyzes the components of the task and then points out the merits and demerits of the syllabus. By this means the paper may give some tips to teachers and students when they use task-based language teaching.

  3. Market Based Analysis of Power System Interconnections

    OpenAIRE

    Obuševs, A; Turcik, M; Oļeiņikova, I; Junghāns, G

    2011-01-01

    The analysis in this article is focused on the usage of the transmission grid under a liberalized market with an implicit transmission capacity allocation method, e.g. the Nordic market. Attention is paid to fundamental changes in transmission utilization and its economically effective operation. For interconnection and power flow analysis and loss calculation, a model of the Nordic grid was developed and a transmission loss calculation method was created. The given approach will improve the economic efficiency of system oper...

  4. Design Intelligent Model-free Hybrid Guidance Controller for Three Dimension Motor

    OpenAIRE

    Abdol Majid Mirshekaran; Farzin Piltan; Nasri Sulaiman; Alireza Siahbazi; Ali Barzegar; Mahmood Vosoogh

    2014-01-01

    The minimum rule base Proportional Integral Derivative (PID) Fuzzy hybrid guidance Controller for three dimensions spherical motor is presented in this research. A three dimensions spherical motor is well equipped with conventional control techniques and, in particular, various PID controllers which demonstrate a good performance and successfully solve different guidance problems. Guidance control in a three dimensions spherical motor is performed by the PID controllers producing the control ...

  5. Gender-Based Analysis On-Line Dialogue. Final Report.

    Science.gov (United States)

    2001

    An online dialogue on gender-based analysis (GBA) was held from February 15 to March 7, 2001. Invitations and a background paper titled "Why Gender-Based Analysis?" were sent to 350 women's organizations and individuals throughout Canada. Efforts were made to ensure that aboriginal and Metis women, visible minority women, and women with special…

  6. Science Based Governance? EU Food Regulation Submitted to Risk Analysis

    NARCIS (Netherlands)

    Szajkowska, A.; Meulen, van der B.M.J.

    2014-01-01

    Anna Szajkowska and Bernd van der Meulen analyse in their contribution, Science Based Governance? EU Food Regulation Submitted to Risk Analysis, the scope of application of risk analysis and the precautionary principle in EU food safety regulation. To what extent does this technocratic, science-base

  7. Transect based analysis versus area based analysis to quantify shoreline displacement: spatial resolution issues.

    Science.gov (United States)

    Anfuso, Giorgio; Bowman, Dan; Danese, Chiara; Pranzini, Enzo

    2016-10-01

    Field surveys, aerial photographs, and satellite images are the most commonly employed sources of data to analyze shoreline position, which are further compared by area based analysis (ABA) or transect based analysis (TBA) methods. The former is performed by computing the mean shoreline displacement for the identified coastal segments, i.e., dividing the beach area variation by the segment length; the latter is based on the measurement of the distance between each shoreline at set points along transects. The present study compares, by means of GIS tools, the ABA and TBA methods by computing shoreline displacements recorded on two stretches of the Tuscany coast (Italy): the beaches of Punta Ala, a linear coast without shore protection structures, and the one at Follonica, which is irregular due to the presence of groins and detached breakwaters. Surveys were carried out using a differential global positioning system (DGPS) in RTK mode. For each site, a 4800-m-long coastal segment was analyzed and divided into ninety-six 50-m-long sectors for which changes were computed using both the ABA and TBA methods. Sectors were progressively joined to have a length of 100, 200, 400, and 800 m to examine how this influenced results. ABA and TBA results are highly correlated for transect distance and sector length up to 100 m at both investigated locations. If longer transects are considered, the two methods still produce well-correlated data on the smooth shoreline (i.e., at Punta Ala), but the correlation becomes significantly lower on the irregular shoreline (i.e., at Follonica). PMID:27640163
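
    A toy sketch of the two displacement measures under stated assumptions: two synthetic shoreline surveys are given as cross-shore distance along a straight baseline, ABA divides the beach-area change by segment length, and TBA averages shoreline-to-shoreline distances along regularly spaced transects. The shorelines and the 50 m transect spacing are hypothetical.

```python
# Area based analysis (ABA) vs transect based analysis (TBA) on synthetic data.
import numpy as np

x = np.linspace(0.0, 4800.0, 961)                          # alongshore positions, 5 m apart
shore_t1 = 50.0 + 5.0 * np.sin(x / 300.0)                  # first survey
shore_t2 = shore_t1 + 8.0 + 3.0 * np.cos(x / 500.0)        # second survey

def aba_displacement(x, y1, y2):
    """Mean displacement = beach-area change divided by segment length."""
    area_change = np.trapz(y2 - y1, x)
    return area_change / (x[-1] - x[0])

def tba_displacement(x, y1, y2, spacing=50.0):
    """Mean of shoreline-to-shoreline distances measured along transects."""
    transects = np.arange(x[0], x[-1] + spacing, spacing)
    d1 = np.interp(transects, x, y1)
    d2 = np.interp(transects, x, y2)
    return float(np.mean(d2 - d1))

print(f"ABA mean displacement: {aba_displacement(x, shore_t1, shore_t2):.2f} m")
print(f"TBA mean displacement: {tba_displacement(x, shore_t1, shore_t2):.2f} m")
```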

  8. SSI Analysis for Base-Isolated Nuclear Power Plants

    International Nuclear Information System (INIS)

    NPPs are required to be much safer than other structures. Among external events, an earthquake is one of the most important parameters governing the safety of NPPs. Applying a base isolation system to NPPs can reduce the seismic risk. At present, a soil-structure interaction (SSI) analysis is essential in the seismic design of NPPs to account for ground-structure interaction. In the seismic analysis of a base-isolated NPP, it is difficult to consider the nonlinear properties of the seismic isolation bearings because SSI analysis programs such as SASSI perform linear analysis. Thus, in this study, SSI analyses are performed using an iterative approach that accounts for the material nonlinearity of the isolators. The results of the SSI analysis show that the horizontal response of the base-isolated NPP is significantly reduced.

  9. Performance Analysis Based on Timing Simulation

    DEFF Research Database (Denmark)

    Nielsen, Christian Dalsgaard; Kishinevsky, Michael

    1994-01-01

    Determining the cycle time and a critical cycle is a fundamental problem in the analysis of concurrent systems. We solve this problem using timing simulation of an underlying Signal Graph (an extension of Marked Graphs). For a Signal Graph with n vertices and m arcs our algorithm has the polynomial time complexity O(b²m), where b is the number of vertices with initially marked in-arcs (typically b≪n). The algorithm has a clear semantic and a low descriptive complexity. We illustrate the use of the algorithm by applying it to performance analysis of asynchronous circuits.

  10. Antenna trajectory error analysis in backprojection-based SAR images

    Science.gov (United States)

    Wang, Ling; Yazıcı, Birsen; Yanik, H. Cagri

    2014-06-01

    We present an analysis of the positioning errors in Backprojection (BP)-based Synthetic Aperture Radar (SAR) images due to antenna trajectory errors for a monostatic SAR traversing a straight linear trajectory. Our analysis is developed using microlocal analysis, which can provide an explicit quantitative relationship between the trajectory error and the positioning error in BP-based SAR images. The analysis is applicable to arbitrary trajectory errors in the antenna and can be extended to arbitrary imaging geometries. We present numerical simulations to demonstrate our analysis.

  11. Environmentally based Cost-Benefit Analysis

    International Nuclear Information System (INIS)

    The fundamentals of the basic elements of a new comprehensive economic assessment, MILA, developed in Sweden with inspiration from the Total Cost Assessment-model are presented. The core of the MILA approach is an expanded cost and benefit inventory. But MILA also includes a complementary addition of an internal waste stream analysis, a tool for evaluation of environmental conflicts in monetary terms, an extended time horizon and direct allocation of costs and revenues to products and processes. However, MILA does not ensure profitability for environmentally sound projects. Essentially, MILA is an approach of refining investment and profitability analysis of a project, investment or product. 109 refs., 38 figs

  12. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels;

    2010-01-01

    for implementation into a computer aided reasoning tool for HAZOP studies to perform root cause and consequence analysis. Such a tool will facilitate finding causes far away from the site of the deviation. A Functional HAZOP Assistant is proposed and investigated in a HAZOP study of an industrial scale Indirect...

  13. Model-free information-theoretic approach to infer leadership in pairs of zebrafish

    Science.gov (United States)

    Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio

    2016-04-01

    Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
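
    A minimal sketch of the transfer entropy construct named above: TE(X→Y) is estimated here as the plug-in conditional mutual information I(Y_t ; X_{t-1} | Y_{t-1}) from a 3-D histogram, with a lag-1 synthetic "leader/follower" pair. Binning, lag and the test series are illustrative choices and do not reproduce the calibrated pipeline of the cited study.

```python
# Histogram-based transfer entropy TE(X -> Y) with lag 1.
import numpy as np

def transfer_entropy(x, y, bins=8):
    """TE(X->Y) = I(Y_t ; X_{t-1} | Y_{t-1}), plug-in estimate in nats."""
    xp, yp, yf = x[:-1], y[:-1], y[1:]          # past of X, past of Y, future of Y
    edges = [np.histogram_bin_edges(v, bins) for v in (yf, yp, xp)]
    p, _ = np.histogramdd(np.c_[yf, yp, xp], bins=edges)
    p /= p.sum()
    p_yf_yp = p.sum(axis=2)                      # P(Y_t, Y_{t-1})
    p_yp_xp = p.sum(axis=0)                      # P(Y_{t-1}, X_{t-1})
    p_yp = p.sum(axis=(0, 2))                    # P(Y_{t-1})
    te = 0.0
    for i, j, k in np.argwhere(p > 0):
        te += p[i, j, k] * np.log(
            p[i, j, k] * p_yp[j] / (p_yf_yp[i, j] * p_yp_xp[j, k]))
    return te

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    leader = rng.standard_normal(5000)
    follower = np.roll(leader, 1) + 0.5 * rng.standard_normal(5000)  # copies leader with delay
    print("TE leader -> follower:", transfer_entropy(leader, follower))
    print("TE follower -> leader:", transfer_entropy(follower, leader))
```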

  14. Pathway-based Analysis Tools for Complex Diseases: A Review

    Directory of Open Access Journals (Sweden)

    Lv Jin

    2014-10-01

    Full Text Available Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods—the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.

  15. Encounter-based worms: Analysis and Defense

    CERN Document Server

    Tanachaiwiwat, Sapon

    2007-01-01

    Encounter-based network is a frequently-disconnected wireless ad-hoc network requiring immediate neighbors to store and forward aggregated data for information disseminations. Using traditional approaches such as gateways or firewalls for deterring worm propagation in encounter-based networks is inappropriate. We propose the worm interaction approach that relies upon automated beneficial worm generation aiming to alleviate problems of worm propagations in such networks. To understand the dynamic of worm interactions and its performance, we mathematically model worm interactions based on major worm interaction factors including worm interaction types, network characteristics, and node characteristics using ordinary differential equations and analyze their effects on our proposed metrics. We validate our proposed model using extensive synthetic and trace-driven simulations. We find that, all worm interaction factors significantly affect the pattern of worm propagations. For example, immunization linearly decrea...

  16. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes

    Science.gov (United States)

    Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G.

    2016-01-01

    Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI. PMID:27441367
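
    For intuition only, here is a deliberately tiny, bandit-style actor-critic loop: a Gaussian policy (actor) proposes a basal-rate adjustment, a scalar value estimate (critic) acts as baseline, and both are updated from the reward. The toy "glucose response", targets and learning rates are hypothetical and unrelated to the AC design or the FDA-accepted simulator used in the study.

```python
# Minimal actor-critic (REINFORCE with a value baseline) on a toy glucose task.
import numpy as np

rng = np.random.default_rng(0)
TARGET = 110.0                         # mg/dL, illustrative glycaemic target

def glucose_response(basal_rate):
    """Hypothetical steady-state glucose for a given basal rate (toy model)."""
    return 180.0 - 60.0 * basal_rate + rng.normal(0.0, 5.0)

theta, sigma = 0.5, 0.15               # actor: mean and std of the basal-rate policy
value = 0.0                            # critic: running value estimate (baseline)
alpha_actor, alpha_critic = 0.01, 0.2

for episode in range(2000):
    action = theta + sigma * rng.standard_normal()          # sample a basal rate
    reward = -abs(glucose_response(action) - TARGET) / 10.0  # penalize deviation
    advantage = reward - value                               # critic as baseline
    value += alpha_critic * advantage                        # critic update
    theta += alpha_actor * advantage * (action - theta) / sigma**2  # policy gradient step

print(f"learned basal rate ~ {theta:.2f} U/h, baseline value ~ {value:.2f}")
```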

  17. A model-free definition of coupling strength for assessing the influence between climatic processes

    Science.gov (United States)

    Runge, J.; Kurths, J.

    2012-04-01

    Assessing the strength of influence between climatic processes from observational data is an important problem on the way to construct conceptual models or make predictions. An example being the influence of ENSO on the Indian Monsoon compared to the influence of other climatic processes. It is an especially difficult task if the interactions are nonlinear where linear measures like the Pearson correlation coefficient fail. Apart from nonlinearity, auto-dependencies in the processes can lead to misleading high values of coupling strength. There exist statistical methods that address these issues, but most of them assume some model, e.g., a linear model in the case of the partial correlation. We propose a measure based on conditional mutual information that makes no assumptions on the underlying model and is able to exclude auto-dependencies and even influences of external processes. We investigate how the strength measured relates to model systems where a coupling strength is known and discuss its limitations. The measure is applied to time series of different climate indices and gridded data sets to gain insights into the coupling strength between climatic teleconnections. Applied to more than two time series it is also able to shed light on mechanisms of interactions between multiple processes.
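
    The following sketch shows a plug-in estimate of conditional mutual information I(X;Y|Z) from binned data and how conditioning on a common driver removes an apparent coupling. The binning and the synthetic series are illustrative; this is not the estimator or the climate data of the cited work.

```python
# Conditional mutual information I(X;Y|Z) via a 3-D histogram (plug-in estimate).
import numpy as np

def cmi(x, y, z, bins=8):
    """I(X;Y|Z) in nats from discretized samples."""
    edges = [np.histogram_bin_edges(v, bins) for v in (x, y, z)]
    p, _ = np.histogramdd(np.c_[x, y, z], bins=edges)
    p /= p.sum()
    p_xz = p.sum(axis=1)          # P(X,Z)
    p_yz = p.sum(axis=0)          # P(Y,Z)
    p_z = p.sum(axis=(0, 1))      # P(Z)
    val = 0.0
    for i, j, k in np.argwhere(p > 0):
        val += p[i, j, k] * np.log(
            p[i, j, k] * p_z[k] / (p_xz[i, k] * p_yz[j, k]))
    return val

rng = np.random.default_rng(2)
z = rng.standard_normal(20000)                    # common driver (an ENSO-like index, say)
x = z + 0.3 * rng.standard_normal(20000)
y = z + 0.3 * rng.standard_normal(20000)          # coupled to x only through z
# conditioning on independent noise approximates the unconditional MI
print("I(X;Y | independent noise):", cmi(x, y, rng.standard_normal(20000)))
print("I(X;Y | common driver Z)  :", cmi(x, y, z))
```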

  18. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes.

    Science.gov (United States)

    Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G

    2016-01-01

    Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI. PMID:27441367

  19. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes.

    Directory of Open Access Journals (Sweden)

    Elena Daskalaki

    Full Text Available Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI.

  20. Scatter to volume registration for model-free respiratory motion estimation from dynamic MRIs.

    Science.gov (United States)

    Miao, S; Wang, Z J; Pan, L; Butler, J; Moran, G; Liao, R

    2016-09-01

    Respiratory motion is one major complicating factor in many image acquisition applications and image-guided interventions. Existing respiratory motion estimation and compensation methods typically rely on breathing motion models learned from certain training data, and therefore may not be able to effectively handle intra-subject and/or inter-subject variations of respiratory motion. In this paper, we propose a respiratory motion compensation framework that directly recovers motion fields from sparsely spaced and efficiently acquired dynamic 2-D MRIs without using a learned respiratory motion model. We present a scatter-to-volume deformable registration algorithm to register dynamic 2-D MRIs with a static 3-D MRI to recover dense deformation fields. Practical considerations and approximations are provided to solve the scatter-to-volume registration problem efficiently. The performance of the proposed method was investigated on both synthetic and real MRI datasets, and the results showed significant improvements over state-of-the-art respiratory motion modeling methods. We also demonstrated a potential application of the proposed method on MRI-based motion corrected PET imaging using hybrid PET/MRI. PMID:27180910

  1. Accelerator based techniques for aerosol analysis

    International Nuclear Information System (INIS)

    At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy) an external beam facility is fully dedicated to PIXE-PIGE measurements of elemental composition of atmospheric aerosols. Examples regarding recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be evidenced how PIXE can provide unique information in aerosol studies or can play a complementary role to traditional chemical analysis. Finally a short presentation of 14C analysis of the atmospheric aerosol by Accelerator Mass Spectrometry (AMS) for the evaluation of the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity) will be given. (author)

  2. Symbolic Analysis of OTRAs-Based Circuits

    Directory of Open Access Journals (Sweden)

    C. Sánchez-López

    2011-04-01

    Full Text Available A new nullor-based model to describe the behavior of Operational Transresistance Amplifiers (OTRAs) is introduced. The new model is composed of four nullors and three grounded resistors. As a consequence, standard nodal analysis can be applied to compute fully-symbolic small-signal characteristics of OTRA-based analog circuits, and the nullor-based OTRA model can be used in CAD tools. In this manner, the fully-symbolic transfer functions of several application circuits, such as filters and oscillators, can easily be approximated.
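
    As a small illustration of the fully-symbolic nodal-analysis workflow mentioned above, the SymPy sketch below writes KCL equations for a simple two-stage RC network and solves for its transfer function. It demonstrates only the general symbolic technique on a hypothetical circuit, not the OTRA nullor model of the record.

```python
# Fully-symbolic small-signal transfer function via nodal analysis in SymPy.
import sympy as sp

s, R1, R2, C1, C2, Vin = sp.symbols('s R1 R2 C1 C2 Vin')
V1, V2 = sp.symbols('V1 V2')

# KCL at the two internal nodes of Vin -R1- V1 -R2- V2, with C1, C2 to ground
eq1 = sp.Eq((Vin - V1) / R1, (V1 - V2) / R2 + s * C1 * V1)
eq2 = sp.Eq((V1 - V2) / R2, s * C2 * V2)

sol = sp.solve([eq1, eq2], [V1, V2], dict=True)[0]
H = sp.simplify(sol[V2] / Vin)          # symbolic transfer function V2/Vin
print(sp.factor(H))
```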

  3. Structural Analysis of Plate Based Tensegrity Structures

    DEFF Research Database (Denmark)

    Hald, Frederik; Kirkegaard, Poul Henning; Damkilde, Lars

    2013-01-01

    Plate tensegrity structures combine tension cables with a cross laminated timber plate and can then form e.g. a roof structure. The topology of plate tensegrity structures is investigated through a parametric investigation, and a method for determination of the structures' pre-stresses is used. A parametric investigation is performed to determine a more optimized form of the plate based tensegrity structure. Conclusions on the use of plate based tensegrity in civil engineering and further research areas are discussed.

  4. Crime prevention: more evidence-based analysis.

    Science.gov (United States)

    Garrido Genovés, Vicente; Farrington, David P; Welsh, Brandon C

    2008-02-01

    This paper introduces a new section of Psicothema dedicated to the evidence-based approach to crime prevention. Along with an original sexual-offender-treatment programme implemented in Spain, this section presents four systematic reviews of important subjects in the criminological arena, such as sexual offender treatment, the well-known programme, the effectiveness of custodial versus non-custodial sanctions in reoffending and the fight against terrorism. We also highlight some of the focal points that scientists, practitioners and governments should take into account in order to support this evidence-based viewpoint of crime prevention.

  5. Thanatophoric dysplasia: case-based bioethical analysis

    OpenAIRE

    Edgar Abarca López; Alejandra Rodríguez Torres; Donovan Casas Patiño; Esteban Espíndola Benítez

    2014-01-01

    This paper presents a case report of thanatophoric dysplasia diagnosed in the prenatal period using ultrasound standards. The course of the case pregnancy, birth process, and postnatal period is described. This report invites bioethical analysis using its principles, appealing to human dignity, diversity and otherness, particularly in the mother-child dyad and their family. An early diagnosis allows parental support as they face the course of this condition and its potentially fatal outcome.

  6. Thanatophoric dysplasia: case-based bioethical analysis

    Directory of Open Access Journals (Sweden)

    Edgar Abarca López

    2014-04-01

    Full Text Available This paper presents a case report of thanatophoric dysplasia diagnosed in the prenatal period using ultrasound standards. The course of the case pregnancy, birth process, and postnatal period is described. This report invites bioethical analysis using its principles, appealing to human dignity, diversity and otherness, particularly in the mother-child dyad and their family. An early diagnosis allows parental support as they face the course of this condition and its potentially fatal outcome.

  7. Movement Pattern Analysis Based on Sequence Signatures

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Chavoshi

    2015-09-01

    Full Text Available Increased affordability and deployment of advanced tracking technologies have led researchers from various domains to analyze the resulting spatio-temporal movement data sets for the purpose of knowledge discovery. Two different approaches can be considered in the analysis of moving objects: quantitative analysis and qualitative analysis. This research focuses on the latter and uses the qualitative trajectory calculus (QTC), a type of calculus that represents qualitative data on moving point objects (MPOs), and establishes a framework to analyze the relative movement of multiple MPOs. A visualization technique called sequence signature (SESI) is used, which enables mapping of QTC patterns in a 2D indexed rasterized space in order to evaluate the similarity of relative movement patterns of multiple MPOs. The applicability of the proposed methodology is illustrated by means of two practical examples of interacting MPOs: cars on a highway and body parts of a samba dancer. The results show that the proposed method can be effectively used to analyze interactions of multiple MPOs in different domains.

  8. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.;

    2008-01-01

    This review with 239 references has as its aim to give the reader an introduction to the kinds of methods used for developing microchip based electrode systems as well as to cover the existing literature on electroanalytical systems where microchips play a crucial role for 'nondestructive...

  9. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    OpenAIRE

    Kiuru Aaro; Kormano Martti; Svedström Erkki; Liang Jianming; Järvi Timo

    2003-01-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion ana...

  10. Desiccant-Based Preconditioning Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J.

    2001-01-11

    A number of important conclusions can be drawn as a result of this broad, first-phase market evaluation. The more important conclusions include the following: (1) A very significant market opportunity will exist for specialized outdoor air-handling units (SOAHUs) as more construction and renovation projects are designed to incorporate the recommendations made by the ASHRAE 62-1989 standard. Based on this investigation, the total potential market is currently $725,000,000 annually (see Table 6, Sect. 3). Based on the market evaluations completed, it is estimated that approximately $398,000,000 (55%) of this total market could be served by DBC systems if they were made cost-effective through mass production. Approximately $306,000,000 (42%) of the total can be served by a non-regenerated, desiccant-based total recovery approach, based on the information provided by this investigation. Approximately $92,000,000 (13%) can be served by a regenerated desiccant-based cooling approach (see Table 7, Sect. 3). (2) A projection of the market selling price of various desiccant-based SOAHU systems was prepared using prices provided by Trane for central-station, air-handling modules currently manufactured. The wheel-component pricing was added to these components by SEMCO. This resulted in projected pricing for these systems that is significantly less than that currently offered by custom suppliers (see Table 4, Sect. 2). Estimated payback periods for all SOAHU approaches were quite short when compared with conventional over-cooling and reheat systems. Actual paybacks may vary significantly depending on site-specific considerations. (3) In comparing cost vs benefit of each SOAHU approach, it is critical that the total system design be evaluated. For example, the cost premium of a DBC system is very significant when compared to a conventional air handling system, yet the reduced chiller, boiler, cooling tower, and other expense often equals or exceeds this premium, resulting in a

  11. Google glass based immunochromatographic diagnostic test analysis

    Science.gov (United States)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.

  12. Web Application Comprehension Based on Dependence Analysis

    Institute of Scientific and Technical Information of China (English)

    WU Jun-hua; XU Bao-wen; JIANG Ji-xiang

    2004-01-01

    Much research indicates that a lot of money and time is spent on maintaining and modifying delivered programs, so policies that support program comprehension are very important. Program comprehension is a crucial and difficult task. Insufficient design, illogical code structure and short documents increase the difficulty of comprehension. Developing a Web application is usually a process with quick implementation and delivery. In addition, a Web application is generally coded by combining mark-up language statements with some embedded applets. Such a programming mode adversely affects comprehension of Web applications. This paper proposes a method for improving the understanding of Web applications by dependence analysis and slicing technology.

  13. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework to describe phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes is now applied in biomedical research.

  14. An SQL-based approach to physics analysis

    Science.gov (United States)

    Limper, Maaike, Dr

    2014-06-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced "ROOT-ntuple" files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.

  15. Analysis and Protection of SIP based Services

    OpenAIRE

    Ferdous, Raihana

    2014-01-01

    Multimedia communications over IP are booming as they offer higher flexibility and more features than traditional voice and video services. IP telephony known as Voice over IP (VoIP) is one of the commercially most important emerging trends in multimedia communications over IP. Due to the flexibility and descriptive power, the Session Initiation Protocol (SIP) is becoming the root of many sessions-based applications such as VoIP and media streaming that are used by a growing number of use...

  16. Value-Based Analysis of Mobile Tagging

    OpenAIRE

    Oguzhan Aygoren; Kaan Varnali

    2011-01-01

    Innovative use of the mobile medium in delivering customer value presents unprecedented opportunities for marketers. Various types of mobile applications have evolved to provide ubiquitous and instant customer service to capitalize on this opportunity. One application is mobile tagging, a mobile-based innovative tool for convergence marketing. The accumulated academic knowledge on mobile marketing lacks consumer-centric information about this phenomenon. This paper addresses this issue and co...

  17. Analysis of Hashrate-Based Double Spending

    OpenAIRE

    Rosenfeld, Meni

    2014-01-01

    Bitcoin is the world's first decentralized digital currency. Its main technical innovation is the use of a blockchain and hash-based proof of work to synchronize transactions and prevent double-spending the currency. While the qualitative nature of this system is well understood, there is widespread confusion about its quantitative aspects and how they relate to attack vectors and their countermeasures. In this paper we take a look at the stochastic processes underlying typical attacks and th...
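
    A hedged, simulation-only sketch of the block race underlying a hashrate-based double spend: an attacker with share q of the hashrate mines a secret chain while the merchant waits for n confirmations, and the attack succeeds if the attacker's chain overtakes the honest one. This Monte Carlo illustration stands in for, and does not reproduce, the closed-form analysis of the cited paper; the cutoff and trial count are arbitrary.

```python
# Monte Carlo estimate of double-spend success probability for attacker hashrate q.
import random

def double_spend_success_rate(q, n_confirmations, max_deficit=25, trials=20_000):
    """Fraction of trials in which the attacker's chain overtakes the honest one."""
    successes = 0
    for _ in range(trials):
        honest, attacker = 0, 0
        # race until the merchant's n confirmations arrive; attacker mines in parallel
        while honest < n_confirmations:
            if random.random() < q:
                attacker += 1
            else:
                honest += 1
        # afterwards the attacker keeps mining until it overtakes or falls hopelessly behind
        while attacker <= honest and honest - attacker < max_deficit:
            if random.random() < q:
                attacker += 1
            else:
                honest += 1
        successes += attacker > honest
    return successes / trials

for q in (0.1, 0.25, 0.4):
    print(f"q={q:.2f}, 6 confirmations -> success ~ {double_spend_success_rate(q, 6):.4f}")
```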

  18. Interest Based Financial Intermediation: Analysis and Solutions

    OpenAIRE

    Shaikh, Salman

    2012-01-01

    Interest is prohibited in all monotheist religions. Apart from religion, interest is also regarded as unjust price of money capital by pioneer secular philosophers as well as some renowned economists. However, it is argued by some economists that modern day, market driven interest rate in a competitive financial market is different from usury and that the interest based financial intermediation has served a useful purpose in allocation of resources as well as in allocation of risk, given the ...

  19. Model-based methods for linkage analysis.

    Science.gov (United States)

    Rice, John P; Saccone, Nancy L; Corbett, Jonathan

    2008-01-01

    The logarithm of an odds ratio (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach, but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.

  20. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  1. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  2. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
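
    A sketch of the wavelet-approximation step only: one traffic feature is decomposed with a discrete wavelet transform, the smooth approximation is reconstructed, and samples with unusually large residuals are flagged. PyWavelets is assumed to be available; the synthetic signal and the threshold are illustrative, and the system-identification part of the cited approach is not reproduced.

```python
# Wavelet approximation of a traffic feature and simple residual-based flagging.
import numpy as np
import pywt

rng = np.random.default_rng(0)
t = np.arange(1024)
traffic = 100 + 20 * np.sin(2 * np.pi * t / 256) + rng.normal(0, 3, t.size)
traffic[600:610] += 60          # injected anomaly (e.g. a flooding burst)

# multilevel DWT; zero the detail coefficients to keep only the approximation
coeffs = pywt.wavedec(traffic, 'db4', level=5)
approx = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], 'db4')
approx = approx[:traffic.size]

residual = traffic - approx
threshold = 4 * np.median(np.abs(residual))       # simple robust threshold
anomalies = np.where(np.abs(residual) > threshold)[0]
print("flagged sample indices:", anomalies)
```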

  3. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  4. Mental EEG Analysis Based on Infomax Algorithm

    Institute of Scientific and Technical Information of China (English)

    WU Xiao-pei; GUO Xiao-jing; ZANG Dao-xin; SHEN Qian

    2004-01-01

    The patterns of EEG change with the mental tasks performed by the subject. In the field of EEG signal analysis and application, studying the patterns of mental EEG and then using them to classify mental tasks has significant scientific meaning and great application value. However, because of the different artifacts present in EEG, pattern detection of EEG under normal mental states is a very difficult problem. In this paper, Independent Component Analysis is applied to EEG signals collected while different mental tasks were performed. The experimental results show that when one subject performs a single mental task in different trials, the independent components of the EEG are very similar. This means that the independent components can be used as mental EEG patterns to classify the different mental tasks.

  5. Wavelet Based Fractal Analysis of Airborne Pollen

    CERN Document Server

    Degaudenzi, M E

    1999-01-01

    The most abundant biological particles in the atmosphere are pollen grains and spores. Self protection of pollen allergy is possible through the information of future pollen contents in the air. In spite of the importance of airborne pollen concentration forecasting, it has not been possible to predict the pollen concentrations with great accuracy, and about 25% of the daily pollen forecasts have resulted in failures. Previous analysis of the dynamic characteristics of atmospheric pollen time series indicate that the system can be described by a low dimensional chaotic map. We apply the wavelet transform to study the multifractal characteristics of an airborne pollen time series. We find the persistence behaviour associated to low pollen concentration values and to the most rare events of highest pollen concentration values. The information and the correlation dimensions correspond to a chaotic system showing loss of information with time evolution.

  6. Computational based functional analysis of Bacillus phytases.

    Science.gov (United States)

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate and digests the indigestible phytate part present in seeds and grains, therefore providing digestible phosphorus, calcium and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. The Bacillus phytase is very suitable for use in animal feed because of its optimum pH and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens to explore physico-chemical properties using various bio-computational tools. All proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917

  7. Face Recognition Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Ali Javed

    2013-02-01

    Full Text Available The purpose of the proposed research work is to develop a computer system that can recognize a person by comparing the characteristics of the face to those of known individuals. The main focus is on frontal two-dimensional images that are taken in a controlled environment, i.e. the illumination and the background are constant. All other methods of person identification and verification, like iris scan or fingerprint scan, require high-quality and costly equipment, but for face recognition we only require a normal camera giving us a 2-D frontal image of the person that will be used for the recognition process. The Principal Component Analysis technique has been used in the proposed system of face recognition. The purpose is to compare the results of the technique under different conditions and to find the most efficient approach for developing a facial recognition system.
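
    A minimal eigenfaces-style sketch of the PCA approach described above: flattened face images are projected onto the leading principal components and a probe image is classified by nearest neighbour in that subspace. The random "images" stand in for a real gallery; image loading and preprocessing are omitted, and this is not the authors' system.

```python
# PCA-based face recognition sketch (eigenfaces with nearest-neighbour matching).
import numpy as np

rng = np.random.default_rng(0)
n_people, img_size, k = 5, 32 * 32, 10
gallery = rng.random((n_people, img_size))                 # one reference vector per person
train = np.repeat(gallery, 4, axis=0) + 0.05 * rng.random((n_people * 4, img_size))
labels = np.repeat(np.arange(n_people), 4)

mean_face = train.mean(axis=0)
centered = train - mean_face
# principal components from the SVD of the centered data matrix
_, _, Vt = np.linalg.svd(centered, full_matrices=False)
components = Vt[:k]                                        # k leading "eigenfaces"

train_proj = centered @ components.T

def recognize(probe):
    """Return the label of the nearest training face in PCA space."""
    proj = (probe - mean_face) @ components.T
    return labels[np.argmin(np.linalg.norm(train_proj - proj, axis=1))]

probe = gallery[3] + 0.05 * rng.random(img_size)           # noisy image of person 3
print("recognized as person", recognize(probe))
```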

  8. LEARNING DIFFICULTIES: AN ANALYSIS BASED ON VIGOTSKY

    Directory of Open Access Journals (Sweden)

    Adriane Cenci

    2010-06-01

    Full Text Available Throughout the text, we aim to reflect upon learning difficulties from the perspective of Socio-Historical Theory, relating what is observed in schools to what has been discussed about learning difficulties and to the theory proposed by Vygotsky in the early twentieth century. We understand that children enter school carrying experiences and knowledge from their cultural group and that school very often ignores such knowledge. It is in this disengagement that what we have come to call learning difficulties emerges. One cannot forget to see a child as a whole: a student is a social being constituted by culture, language and specific values to which one must be attentive.

  9. Building Extraction from LIDAR Based Semantic Analysis

    Institute of Scientific and Technical Information of China (English)

    YU Jie; YANG Haiquan; TAN Ming; ZHANG Guoning

    2006-01-01

    Extraction of buildings from LIDAR data has been an active research field in recent years. A scheme for building detection and reconstruction from LIDAR data is presented with an object-oriented method which is based on the buildings' semantic rules. Two key steps are discussed: how to group the discrete LIDAR points into single objects and how to establish the buildings' semantic rules. In the end, the buildings are reconstructed in 3D form and three common parametric building models (flat, gabled, hipped) are implemented.

  10. Digital Terrain Analysis Based on DEM

    Institute of Scientific and Technical Information of China (English)

    Bi Huaxing; Li Xiaoyin; Guo Mengxia; Liu Xin; Li Jun

    2006-01-01

    The digital elevation model (DEM), an important source of information, is usually used to express a topographic surface in three dimensions and to imitate essential natural geography. DEM has been applied to physical geography, hydrology, ecology, and biology. This study analyzed digital elevation data sources and their structure, the arithmetic of terrain attribute extraction from DEM and its applications, and DEM's error and uncertainty algorithm. The Hayachinesan mountain area (in northeastern Japan) was chosen as the research site, and the focus was on terrain analysis and the impacts of DEM resolution on topographic attributes, analyzed using TNTmips GIS software (MicroImage, Inc., USA) and "Digital Map 25,000" (published by the Geographical Survey Institute of Japan in 1998). The results show that: (1) DEM is a very effective tool for terrain analysis: many terrain attributes (such as slope, aspect, slope type, watershed, and standard flow path) can be derived, and these attributes can be displayed with both image and attribute databases, with the help of GIS; (2) DEM resolution has a great influence on terrain attributes. The following details are shown: (a) DEM resolution has a significant effect on slope estimation: the average slope becomes smaller and the standard deviation becomes larger when DEM resolution changes from fine to coarse, and the different impacts of DEM resolution on different slope ranges can be classified into three gradient classes: 0-10° (underestimated slope), 10-35° (overestimated slope), and >35° (little impact on slope estimation); (b) DEM resolution has little effect on aspect estimation, but flat areas become larger when DEM resolution changes from fine to coarse; and (c) the quantity of hydrologic topography information declines as DEM resolution decreases.
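
    The sketch below computes two of the DEM-derived attributes discussed above, slope and aspect, by finite differences on a gridded elevation model, and shows how coarsening the grid tends to lower the mean slope. The synthetic surface, the 10 m cell size and the aspect convention are illustrative assumptions, not the study's data or software.

```python
# Slope and aspect from a gridded DEM via finite differences.
import numpy as np

cell = 10.0                                   # grid resolution in metres
y, x = np.mgrid[0:200:cell, 0:200:cell]
dem = 50 + 0.002 * (x - 100) ** 2 + 0.05 * y  # hypothetical elevation surface

dz_dy, dz_dx = np.gradient(dem, cell)         # partial derivatives of elevation
slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
aspect_deg = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0   # downslope direction (one convention)

print(f"mean slope {slope_deg.mean():.1f} deg, max slope {slope_deg.max():.1f} deg")

# coarsening the DEM (as in the resolution experiment) tends to lower the mean slope
coarse = dem[::4, ::4]
dz_dy_c, dz_dx_c = np.gradient(coarse, cell * 4)
print("mean slope at 4x coarser resolution:",
      f"{np.degrees(np.arctan(np.hypot(dz_dx_c, dz_dy_c))).mean():.1f} deg")
```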

  11. Trajectory Based Behavior Analysis for User Verification

    Science.gov (United States)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computer need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid the possible copy or simulation from other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distribution in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold learnt tuning for catching the pairwise relationship. Based on the pairwise relationship, we plug-in any effective classification or clustering methods for the detection of unauthorized access. The method can also be applied for the task of recognition, predicting the trajectory type without pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest whom owns the trajectory if the input identity is not provided.
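
    A toy version of the verification idea under stated assumptions: an account owner's trajectory steps are modelled with a Gaussian transition (step) distribution, and a new trajectory is scored by its average log-likelihood, with low scores suggesting a different user. The step model, synthetic trajectories and implicit threshold are hypothetical and much simpler than the Markov chain and dissimilarity measure of the cited work.

```python
# Gaussian step-model scoring of trajectories for a simple owner/intruder check.
import numpy as np

rng = np.random.default_rng(0)

def make_trajectory(step_mean, step_std, n=300):
    """Random-walk trajectory whose step statistics characterise a 'user'."""
    return np.cumsum(rng.normal(step_mean, step_std, size=(n, 2)), axis=0)

def fit_step_model(traj):
    """Fit mean and per-axis std of the step (transition) distribution."""
    steps = np.diff(traj, axis=0)
    return steps.mean(axis=0), steps.std(axis=0) + 1e-9

def avg_log_likelihood(traj, mean, std):
    steps = np.diff(traj, axis=0)
    ll = -0.5 * ((steps - mean) / std) ** 2 - np.log(std * np.sqrt(2 * np.pi))
    return ll.sum(axis=1).mean()

owner_train = make_trajectory([1.0, 0.2], 0.5)
mean, std = fit_step_model(owner_train)

owner_new = make_trajectory([1.0, 0.2], 0.5)
intruder = make_trajectory([0.1, 1.0], 1.5)
print("owner score   :", round(avg_log_likelihood(owner_new, mean, std), 2))
print("intruder score:", round(avg_log_likelihood(intruder, mean, std), 2))
```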

  12. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies the edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by MRF to obtain an image that contains different intensity regions. The gradient values are calculated and then the watershed technique is used. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about possible region boundaries for the next step (MRF), which gives an image that contains all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the (MRF) segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.

  13. The Route Analysis Based On Flight Plan

    Science.gov (United States)

    Feriyanto, Nur; Saleh, Chairul; Fauzi, Achmad; Rachman Dzakiyullah, Nur; Riza Iwaputra, Kahfi

    2016-02-01

    Economic development affects the use of air transportation, since business activity has increased in every aspect. Many people these days prefer using airplanes because they can save time and money. This situation also affects flight routes, and many airlines offer new routes to deal with competition. Managing flight routes is one of the problems that must be faced in order to find efficient and effective routes. This paper investigates the best routes based on flight performance by determining the amount of block fuel for the Jakarta-Denpasar flight route. Moreover, this work compares two kinds of aircraft and tracks by calculating flight distance, flight time and block fuel. The result shows that Jakarta-Denpasar on Track II has effective and efficient block fuel when flown by the Airbus 320-200 aircraft. This study can contribute to practice in making effective decisions, especially helping the executive management of the company to select the appropriate aircraft and track in the flight plan based on block fuel consumption for business operations.

  14. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    Science.gov (United States)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  15. Bayesian analysis for EMP damaged function based on Weibull distribution

    International Nuclear Information System (INIS)

    The Weibull distribution is one of the most commonly used statistical distributions in EMP vulnerability analysis. In this paper, the EMP damage function of solid state relays, based on the Weibull distribution, was obtained by Bayesian computation using a Gibbs sampling algorithm. (authors)
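
    For intuition, the sketch below fits a Weibull damage function P(fail | dose) = 1 - exp(-(dose/eta)^beta) to binary test outcomes with a simple Bayesian grid approximation over (beta, eta), standing in for the Gibbs-sampling computation of the record. The relay test data and the flat priors are hypothetical.

```python
# Grid-approximation Bayesian fit of a Weibull damage function.
import numpy as np

def damage_prob(dose, beta, eta):
    """Weibull damage (failure) probability at a given EMP stress level."""
    return 1.0 - np.exp(-(dose / eta) ** beta)

# hypothetical relay test data: stress level and 1 = damaged / 0 = survived
dose = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
fail = np.array([0,   0,   0,   1,   0,   1,   1,   1  ])

betas = np.linspace(0.5, 8.0, 200)
etas = np.linspace(1.0, 10.0, 200)
B, E = np.meshgrid(betas, etas, indexing="ij")

# log-likelihood of the Bernoulli observations on the grid (flat prior)
P = damage_prob(dose[None, None, :], B[..., None], E[..., None])
loglik = (fail * np.log(P + 1e-12) + (1 - fail) * np.log(1 - P + 1e-12)).sum(axis=-1)
post = np.exp(loglik - loglik.max())
post /= post.sum()

print("posterior mean beta:", float((post * B).sum()))
print("posterior mean eta :", float((post * E).sum()))
```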

  16. Sentiment Analysis of Document Based on Annotation

    CERN Document Server

    Shukla, Archana

    2011-01-01

    I present a tool which tells the quality of document or its usefulness based on annotations. Annotation may include comments, notes, observation, highlights, underline, explanation, question or help etc. comments are used for evaluative purpose while others are used for summarization or for expansion also. Further these comments may be on another annotation. Such annotations are referred as meta-annotation. All annotation may not get equal weightage. My tool considered highlights, underline as well as comments to infer the collective sentiment of annotators. Collective sentiments of annotators are classified as positive, negative, objectivity. My tool computes collective sentiment of annotations in two manners. It counts all the annotation present on the documents as well as it also computes sentiment scores of all annotation which includes comments to obtain the collective sentiments about the document or to judge the quality of document. I demonstrate the use of tool on research paper.

  17. Analysis of Vehicle-Based Security Operations

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Jason M [ORNL; Paul, Nate R [ORNL

    2015-01-01

    Vehicle-to-vehicle (V2V) communications promises to increase roadway safety by providing each vehicle with 360 degree situational awareness of other vehicles in proximity, and by complementing onboard sensors such as radar or camera in detecting imminent crash scenarios. In the United States, approximately three hundred million automobiles could participate in a fully deployed V2V system if Dedicated Short-Range Communication (DSRC) device use becomes mandatory. The system's reliance on continuous communication, however, provides a potential means for unscrupulous persons to transmit false data in an attempt to cause crashes, create traffic congestion, or simply render the system useless. V2V communications must be highly scalable while retaining robust security and privacy preserving features to meet the intra-vehicle and vehicle-to-infrastructure communication requirements for a growing vehicle population. Oak Ridge National Laboratory is investigating a Vehicle-Based Security System (VBSS) to provide security and privacy for a fully deployed V2V and V2I system. In the VBSS an On-board Unit (OBU) generates short-term certificates and signs Basic Safety Messages (BSM) to preserve privacy and enhance security. This work outlines a potential VBSS structure and its operational concepts; it examines how a vehicle-based system might feasibly provide security and privacy, highlights remaining challenges, and explores potential mitigations to address those challenges. Certificate management alternatives that attempt to meet V2V security and privacy requirements have been examined previously by the research community, including privacy-preserving group certificates, shared certificates, and functional encryption. Due to real-world operational constraints, adopting one of these approaches for VBSS V2V communication is difficult. Timely misbehavior detection and revocation are still open problems for any V2V system. We explore the alternative approaches that may be

  18. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  19. Structural Analysis Using Computer Based Methods

    Science.gov (United States)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  20. Model-free CPPI

    OpenAIRE

    Alexander Schied

    2013-01-01

    We consider Constant Proportion Portfolio Insurance (CPPI) and its dynamic extension, which may be called Dynamic Proportion Portfolio Insurance (DPPI). It is shown that these investment strategies work within the setting of Föllmer's pathwise Itô calculus, which makes no probabilistic assumptions whatsoever. This shows, on the one hand, that CPPI and DPPI are completely independent of any choice of a particular model for the dynamics of asset prices. They even make sense beyond the class...

  1. Multilevel Solvers with Aggregations for Voxel Based Analysis of Geomaterials

    OpenAIRE

    Blaheta, R. (Radim); V. Sokol

    2012-01-01

    Our motivation for voxel based analysis comes from the investigation of geomaterials (geocomposites) arising from rock grouting or sealing. We use finite element analysis based on voxel data from tomography. The arising finite element systems are large scale, which motivates the use of multilevel iterative solvers or preconditioners. Among others we concentrate on multilevel Schwarz preconditioners with aggregations. The aggregations are efficient even in the case of problems with hete...

  2. Experimental Bifurcation Analysis Using Control-Based Continuation

    DEFF Research Database (Denmark)

    Bureau, Emil; Starke, Jens

    The focus of this thesis is developing and implementing techniques for performing experimental bifurcation analysis on nonlinear mechanical systems. The research centers around the newly developed control-based continuation method, which allows one to systematically track branches of stable and unstable...

  3. Architecture Level Dependency Analysis of SOA Based System through π-ADL

    OpenAIRE

    Pawan Kumar; Ratneshwer

    2016-01-01

    A formal Architecture Description Language (ADL) provides an effective way to perform dependency analysis at an early stage of development. π-ADL is an ADL that represents the static and dynamic features of software services. In this paper, we describe an approach to dependency analysis of a SOA (Service Oriented Architecture) based system, at the architecture level, through π-ADL. A set of algorithms is also proposed for identification of dependency relationships in a SOA based system. The proposed algori...

  4. Product Profitability Analysis Based on EVA and ABC

    OpenAIRE

    Chen Lin; Shuangyuan Wang; Zhilin Qiao

    2013-01-01

    With the purpose of maximizing shareholders' value, profitability analysis established on the basis of traditional accounting earnings cannot meet the demands of providing accurate decision-making information for enterprises. Therefore, this paper implements Activity Based Costing (ABC) and Economic Value Added (EVA) into the traditional profitability analysis system, sets up an improved EVA-ABC based profitability analysis system as well as its relative indexes, and applies it to the s...

  5. Empirical validation and comparison of models for customer base analysis

    OpenAIRE

    Persentili Batislam, Emine; Denizel, Meltem; Filiztekin, Alpay

    2007-01-01

    The benefits of retaining customers lead companies to search for means to profile their customers individually and track their retention and defection behaviors. To this end, the main issues addressed in customer base analysis are identification of customer active/inactive status and prediction of future purchase levels. We compare the predictive performance of Pareto/NBD and BG/NBD models from the customer base analysis literature — in terms of repeat purchase levels and active status — usi...

  6. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  7. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    The paper evaluates a method for detecting software anomalies based on recurrence plot analysis of trace logs generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g., recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g., exceptions).
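
    As a concrete illustration of the measures mentioned above, the following sketch computes a recurrence matrix for sliding windows of a one-dimensional trace signal and reports the recurrence rate (RR) and determinism (DET) per window. The threshold, window length, and minimal diagonal-line length are arbitrary choices rather than the paper's settings, and the main diagonal is kept for simplicity.

```python
# Illustrative windowed recurrence quantification analysis (RQA) of a
# 1-D trace signal: recurrence rate (RR) and determinism (DET).
# Threshold, window length and minimal line length are arbitrary choices.
import numpy as np

def rqa_measures(x, eps=0.1, lmin=2):
    """Return (RR, DET) for signal window x with recurrence threshold eps."""
    n = len(x)
    # Recurrence matrix: points closer than eps are 'recurrent'
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    rr = R.sum() / (n * n)
    # Determinism: fraction of recurrent points lying on diagonal lines >= lmin
    # (the main diagonal is included here for simplicity)
    diag_points = 0
    for k in range(-(n - 1), n):
        d = np.diagonal(R, offset=k)
        run = 0
        for v in list(d) + [0]:          # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    det = diag_points / R.sum() if R.sum() else 0.0
    return rr, det

# Example: sliding windows over a synthetic trace with an injected anomaly
rng = np.random.default_rng(1)
trace = np.sin(np.linspace(0, 40, 2000)) + 0.05 * rng.standard_normal(2000)
trace[1200:1260] += 0.5 * rng.standard_normal(60)   # "silent" anomaly

win = 200
for start in range(0, len(trace) - win + 1, win):
    rr, det = rqa_measures(trace[start:start + win])
    print(f"window {start:4d}: RR={rr:.3f}  DET={det:.3f}")
```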

  8. Web-Based Trainer for Electrical Circuit Analysis

    Science.gov (United States)

    Weyten, L.; Rombouts, P.; De Maeyer, J.

    2009-01-01

    A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…

  9. PYTHON-based Physics Analysis Environment for LHCb

    CERN Document Server

    Belyaev, I; Mato, P; Barrand, G; Tsaregorodtsev, A; de Oliveira, E

    2004-01-01

    BENDER is the PYTHON based physics analysis application for LHCb. It combines the best features of the underlying GAUDI software architecture with the flexibility of the PYTHON scripting language and provides end-users with a friendly physics analysis oriented environment.

  10. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  11. Benefits of Computer Based Content Analysis to Foresight

    OpenAIRE

    Kováříková, Ludmila; Grosová, Stanislava

    2014-01-01

    Purpose of the article: The present manuscript summarizes benefits of the use of computer-based content analysis in a generation phase of foresight initiatives. Possible advantages, disadvantages and limitations of the content analysis for the foresight projects are discussed as well. Methodology/methods: In order to specify the benefits and identify the limitations of the content analysis within the foresight, results of the generation phase of a particular foresight project perf...

  12. Surveillance data bases, analysis, and standardization program

    Energy Technology Data Exchange (ETDEWEB)

    Kam, F.B.K.

    1990-09-26

    The traveler presented a paper at the Seventh ASTM-EURATOM Symposium on Reactor Dosimetry and co-chaired an oral session on Computer Codes and Methods. Papers of considerable interest to the NRC Surveillance Dosimetry Program involved statistically based adjustment procedures and uncertainties. The information exchange meetings with Czechoslovakia and Hungary were very enlightening. Lack of large computers has hindered their surveillance programs. They depended very highly on information from their measurement programs, which were somewhat limited because of the lack of sophisticated electronics. The Nuclear Research Institute at Rez had to rely on expensive mockups of power reactor configurations to test their fluence exposures. Computers, computer codes, and updated nuclear data would advance their technology rapidly, and they were not hesitant to admit this fact. Both eastern-bloc countries said that IBM is providing an IBM 3090 for educational purposes but research and development studies would have very limited access. They were very apologetic that their currencies were not convertible, and any exchange means that they could provide services or pay for US scientists in their respective countries, but funding for their scientists in the United States, or expenses that involved payment in dollars, must come from us.

  13. Indoor air quality analysis based on Hadoop

    International Nuclear Information System (INIS)

    The air of the office environment is our research object. Data on temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system, and all the data are stored in the HBase database of the Hadoop platform. With the help of the HBase features of column-oriented storage and versioning (which automatically adds the time column), time-series data sets are built based on the primary key Row-key and the timestamp. The parallel computing programming model MapReduce is used to process the millions of data points collected by the sensors. By analysing the changing trend of parameter values at different times of the same day and at the same time on various dates, the impact of human factors and other factors on the room microenvironment is assessed according to the movements of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper

  14. Indoor air quality analysis based on Hadoop

    Science.gov (United States)

    Tuo, Wang; Yunhua, Sun; Song, Tian; Liang, Yu; Weihong, Cui

    2014-03-01

    The air of the office environment is our research object. Data on temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system, and all the data are stored in the HBase database of the Hadoop platform. With the help of the HBase features of column-oriented storage and versioning (which automatically adds the time column), time-series data sets are built based on the primary key Row-key and the timestamp. The parallel computing programming model MapReduce is used to process the millions of data points collected by the sensors. By analysing the changing trend of parameter values at different times of the same day and at the same time on various dates, the impact of human factors and other factors on the room microenvironment is assessed according to the movements of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper.
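
    Setting the Hadoop and HBase machinery aside, the MapReduce aggregation described above can be mimicked in plain Python: each reading is mapped to an (hour, parameter) key and the values are reduced by averaging. The sample readings are invented, and this only illustrates the pattern; it is not Hadoop or HBase code.

```python
# Plain-Python illustration of the MapReduce pattern described above:
# map each sensor reading to an (hour, parameter) key and reduce by
# averaging, to see how values change at different times of day.
from collections import defaultdict

# (timestamp "HH:MM:SS", parameter, value) - synthetic sample readings
readings = [
    ("08:00:03", "co2", 420.0), ("08:00:11", "co2", 433.0),
    ("08:00:03", "temp", 21.5), ("13:30:02", "co2", 780.0),
    ("13:30:10", "co2", 805.0), ("13:30:10", "temp", 24.1),
]

def map_phase(record):
    """Emit ((hour, parameter), value) pairs for one reading."""
    ts, param, value = record
    hour = ts.split(":")[0]
    yield (hour, param), value

def reduce_phase(key, values):
    """Average all values observed for one key."""
    return key, sum(values) / len(values)

# Shuffle/sort step: group mapped values by key
groups = defaultdict(list)
for record in readings:
    for key, value in map_phase(record):
        groups[key].append(value)

for key in sorted(groups):
    print(reduce_phase(key, groups[key]))
```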

  15. Some Linguistic-based and temporal analysis on Wikipedia

    International Nuclear Information System (INIS)

    Wikipedia, as a web-based, collaborative, multilingual encyclopaedia project, is a very suitable field in which to carry out research on social dynamics and to investigate the complex concepts of conflict, collaboration, competition, dispute, etc. in a large community (∼26 million) of Wikipedia users. The other face of Wikipedia, as a productive society, is its output, consisting of ∼17 million articles written, unsupervised, by unprofessional editors in more than 270 different languages. In this talk we report some analysis performed on Wikipedia with two different approaches: temporal analysis to characterize disputes and controversies among users, and linguistic-based analysis to characterize linguistic features of English texts in Wikipedia. (author)

  16. Preprocessing and Analysis of LC-MS-Based Proteomic Data.

    Science.gov (United States)

    Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W

    2016-01-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS based proteomics. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from LC-MS based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed.

  17. Fatigue analysis of steam generator cassette parts based on CAE

    International Nuclear Information System (INIS)

    Fatigue analysis has been performed for steam generator nozzle header and tube based on CAE. Three dimensional model was produced using the commercial CAD program, IDEAS and the geometry and boundary condition information have been transformed into input format of ABAQUS for thermal analysis, stress analysis, and fatigue analysis. Cassette nozzle, which has a complex geometry, has been analysed by using the three dimensional model. But steam generator tube has been analysed according to ASME procedure since it can be modelled as a two dimensional finite element model. S-N curve for the titanium alloy of the steam generator tube material was obtained from the material tests. From the analysis, it has been confirmed that these parts of the steam generator cassette satisfy the lifetime of the steam generator cassette. Three dimensional modelling strategy from the thermal analysis to fatigue analysis should be implemented into the design of reactor major components to enhance the efficiency of design procedure

  18. Agent-based analysis of organizations : formalization and simulation

    OpenAIRE

    Dignum, M.V.; Tick, C.

    2008-01-01

    Organizational effectiveness depends on many factors, including individual excellence, efficient structures, effective planning and capability to understand and match context requirements. We propose a way to model organizational performance based on a combination of formal models and agent-based simulation that supports the analysis of the congruence of different organizational structures to changing environments

  19. Differential Regulatory Analysis Based on Coexpression Network in Cancer Research.

    Science.gov (United States)

    Li, Junyi; Li, Yi-Xue; Li, Yuan-Yuan

    2016-01-01

    With the rapid development of high-throughput techniques and the accumulation of big transcriptomic data, plenty of computational methods and algorithms, such as differential analysis and network analysis, have been proposed to explore genome-wide gene expression characteristics. These efforts aim to transform underlying genomic information into valuable knowledge in biological and medical research fields. Recently, numerous integrative research methods have been dedicated to interpreting the development and progression of neoplastic diseases, and differential regulatory analysis (DRA) based on gene coexpression networks (GCN) increasingly serves as a robust complement to regular differential expression analysis in revealing regulatory functions of cancer-related genes such as evading growth suppressors and resisting cell death. Differential regulatory analysis based on GCN is promising and plays an essential role in discovering the system properties of carcinogenesis features. Here we briefly review the paradigm of differential regulatory analysis based on GCN. We also focus on the applications of differential regulatory analysis based on GCN in cancer research and point out that DRA is necessary and extraordinarily useful for revealing underlying molecular mechanisms in large-scale carcinogenesis studies. PMID:27597964
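
    A toy version of the differential coexpression idea is sketched below: correlation-based gene coexpression networks are built for two conditions and gene pairs are ranked by the change in correlation. The expression matrices are random placeholders rather than real tumour and normal profiles.

```python
# Toy illustration of differential regulatory/coexpression analysis: build
# gene coexpression networks (correlation matrices) for two conditions and
# rank gene pairs by the change in coexpression. Expression data are random
# placeholders, not real tumour/normal profiles.
import numpy as np

rng = np.random.default_rng(7)
genes = [f"g{i}" for i in range(20)]
normal = rng.standard_normal((50, 20))             # 50 samples x 20 genes
tumour = rng.standard_normal((50, 20))
tumour[:, 1] = tumour[:, 0] + 0.1 * rng.standard_normal(50)   # gained coexpression

corr_normal = np.corrcoef(normal, rowvar=False)
corr_tumour = np.corrcoef(tumour, rowvar=False)
delta = corr_tumour - corr_normal                  # differential coexpression

# Report the most rewired gene pairs
iu = np.triu_indices(len(genes), k=1)
order = np.argsort(-np.abs(delta[iu]))[:5]
for idx in order:
    i, j = iu[0][idx], iu[1][idx]
    print(f"{genes[i]}-{genes[j]}: delta corr = {delta[i, j]:+.2f}")
```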

  20. Differential Regulatory Analysis Based on Coexpression Network in Cancer Research

    Directory of Open Access Journals (Sweden)

    Junyi Li

    2016-01-01

    With the rapid development of high-throughput techniques and the accumulation of big transcriptomic data, plenty of computational methods and algorithms, such as differential analysis and network analysis, have been proposed to explore genome-wide gene expression characteristics. These efforts aim to transform underlying genomic information into valuable knowledge in biological and medical research fields. Recently, numerous integrative research methods have been dedicated to interpreting the development and progression of neoplastic diseases, and differential regulatory analysis (DRA) based on gene coexpression networks (GCN) increasingly serves as a robust complement to regular differential expression analysis in revealing regulatory functions of cancer-related genes such as evading growth suppressors and resisting cell death. Differential regulatory analysis based on GCN is promising and plays an essential role in discovering the system properties of carcinogenesis features. Here we briefly review the paradigm of differential regulatory analysis based on GCN. We also focus on the applications of differential regulatory analysis based on GCN in cancer research and point out that DRA is necessary and extraordinarily useful for revealing underlying molecular mechanisms in large-scale carcinogenesis studies.

  1. Kernel-Based Nonlinear Discriminant Analysis for Face Recognition

    Institute of Scientific and Technical Information of China (English)

    LIU QingShan (刘青山); HUANG Rui (黄锐); LU HanQing (卢汉清); MA SongDe (马颂德)

    2003-01-01

    Linear subspace analysis methods have been successfully applied to extract features for face recognition. But they are inadequate to represent the complex and nonlinear variations of real face images, such as illumination, facial expression and pose variations, because of their linear properties. In this paper, a nonlinear subspace analysis method, Kernel-based Nonlinear Discriminant Analysis (KNDA), is presented for face recognition, which combines the nonlinear kernel trick with the linear subspace analysis method, Fisher Linear Discriminant Analysis (FLDA). First, the kernel trick is used to project the input data into an implicit feature space, then FLDA is performed in this feature space. Thus nonlinear discriminant features of the input data are yielded. In addition, in order to reduce the computational complexity, a geometry-based feature vectors selection scheme is adopted. Another similar nonlinear subspace analysis is Kernel-based Principal Component Analysis (KPCA), which combines the kernel trick with linear Principal Component Analysis (PCA). Experiments are performed with the polynomial kernel, and KNDA is compared with KPCA and FLDA. Extensive experimental results show that KNDA can give a higher recognition rate than KPCA and FLDA.
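
    The KNDA recipe (a kernel mapping followed by Fisher discriminant analysis) can be approximated with off-the-shelf scikit-learn components, as sketched below: a polynomial-kernel KernelPCA projection followed by LDA, compared against plain FLDA. The digits dataset stands in for a face database, and the hyperparameters are placeholders, not those of the paper.

```python
# Rough approximation of kernel-based nonlinear discriminant analysis:
# project data with a polynomial-kernel KPCA, then run Fisher LDA in the
# induced feature space. Dataset and hyperparameters are placeholders.
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)            # stand-in for a face dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

knda_like = make_pipeline(
    KernelPCA(n_components=60, kernel="poly", degree=2),  # nonlinear mapping
    LinearDiscriminantAnalysis(),                         # FLDA in feature space
)
knda_like.fit(X_train, y_train)
print("kernel + LDA accuracy:", knda_like.score(X_test, y_test))

# Linear baseline (plain FLDA on the raw pixels) for comparison
flda = LinearDiscriminantAnalysis().fit(X_train, y_train)
print("plain FLDA accuracy:  ", flda.score(X_test, y_test))
```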

  2. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on these and a brief review of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  3. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  4. Adaptive Fourier Decomposition Based Time-Frequency Analysis

    Institute of Scientific and Technical Information of China (English)

    Li-Ming Zhang

    2014-01-01

    The attempt to represent a signal simultaneously in time and frequency domains is full of challenges. The recently proposed adaptive Fourier decomposition (AFD) offers a practical approach to solve this problem. This paper presents the principles of the AFD based time-frequency analysis in three aspects: instantaneous frequency analysis, frequency spectrum analysis, and the spectrogram analysis. An experiment is conducted and compared with the Fourier transform in convergence rate and short-time Fourier transform in time-frequency distribution. The proposed approach performs better than both the Fourier transform and short-time Fourier transform.

  5. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Because traffic flow data are non-stationary, detecting abnormal data is difficult. This paper proposes an abnormal traffic flow data detection method based on wavelet analysis and the least squares method. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then, the least squares method is applied to the reconstructed signal to find abnormal points. Simulation results show that this wavelet-based detection of abnormal traffic flow data effectively reduces both the misjudgment rate and the false negative rate.
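
    One possible reading of that procedure is sketched below with PyWavelets: the flow signal is decomposed, the low-frequency approximation is reconstructed, a least-squares polynomial trend is fitted, and points whose residuals exceed a threshold are flagged as abnormal. The wavelet family, decomposition level, polynomial order, and threshold are guesses, not the authors' settings.

```python
# Hedged sketch of wavelet + least-squares anomaly detection for traffic
# flow data. Wavelet family, decomposition level, polynomial order and
# threshold are illustrative guesses only.
import numpy as np
import pywt

rng = np.random.default_rng(2)
t = np.arange(288)                                  # e.g. 5-min counts over a day
flow = 300 + 200 * np.sin(2 * np.pi * t / 288) + 20 * rng.standard_normal(288)
flow[150:153] += 250                                # injected abnormal points

# 1) Wavelet decomposition: keep the low-frequency approximation only
coeffs = pywt.wavedec(flow, "db4", level=3)
approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend_wav = pywt.waverec(approx_only, "db4")[: len(flow)]

# 2) Least-squares polynomial fit to the reconstructed low-frequency signal
coef = np.polyfit(t, trend_wav, deg=5)
trend_ls = np.polyval(coef, t)

# 3) Flag points whose residual from the smooth trend is too large
resid = flow - trend_ls
thresh = 3 * resid.std()
anomalies = np.flatnonzero(np.abs(resid) > thresh)
print("abnormal indices:", anomalies)
```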

  6. Noise analysis for sensitivity-based structural damage detection

    Institute of Scientific and Technical Information of China (English)

    YIN Tao; ZHU Hong-ping; YU Ling

    2007-01-01

    As vibration-based structural damage detection methods are easily affected by environmental noise, a new statistic-based noise analysis method is proposed together with the Monte Carlo technique to investigate the influence of experimental noise in modal data on sensitivity-based damage detection methods. Different from the commonly used random perturbation technique, the proposed technique is deduced directly from the Moore-Penrose generalized inverse of the sensitivity matrix, which not only makes the analysis process more efficient but also allows the influence of noise on both frequencies and mode shapes to be analyzed for three commonly used sensitivity-based damage detection methods in a similar way. A one-story portal frame is adopted to evaluate the efficiency of the proposed noise analysis technique.

  7. Analysis of security protocols based on challenge-response

    Institute of Scientific and Technical Information of China (English)

    LUO JunZhou; YANG Ming

    2007-01-01

    Security protocol is specified as the procedure of challenge-response, which uses applied cryptography to confirm the existence of other principals and fulfill some data negotiation such as session keys. Most of the existing analysis methods,which either adopt theorem proving techniques such as state exploration or logic reasoning techniques such as authentication logic, face the conflicts between analysis power and operability. To solve the problem, a new efficient method is proposed that provides SSM semantics-based definition of secrecy and authentication goals and applies authentication logic as fundamental analysis techniques,in which secrecy analysis is split into two parts: Explicit-Information-Leakage and Implicit-Information-Leakage, and correspondence analysis is concluded as the analysis of the existence relationship of Strands and the agreement of Strand parameters. This new method owns both the power of the Strand Space Model and concision of authentication logic.

  8. Bayesian-network-based safety risk analysis in construction projects

    International Nuclear Information System (INIS)

    This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. Fuzzy Bayesian Networks (FBN) is used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations on the current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which has a capacity of implementing deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis progress is extended to the entire life cycle of risk-prone events, including the pre-accident, during-construction continuous and post-accident control. A typical hazard concerning the tunnel leakage in the construction of Wuhan Yangtze Metro Tunnel in China is presented as a case study, in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of advantages and disadvantages between FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • Safety risk analysis progress is extended to entire life cycle of risk-prone events. • A typical

  9. A Framework for Web-Based Mechanical Design and Analysis

    Institute of Scientific and Technical Information of China (English)

    Chiaming Yen; Wujeng Li

    2002-01-01

    In this paper, a Web-based Mechanical Design and Analysis Framework (WMDAF) is proposed. This WMDAF allows designers to develop web-based computer aided programs in a systematic way during the collaborative mechanical system design and analysis process. The system is based on an emerging web-based Content Management System (CMS) called the eXtended Object Oriented Portal System (XOOPS). Due to the Open Source status of the XOOPS CMS, programs developed with this framework can be further customized to ...

  10. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach works in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  11. Tikhonov regularization-based operational transfer path analysis

    Science.gov (United States)

    Cheng, Wei; Lu, Yingying; Zhang, Zhousuo

    2016-06-01

    To overcome ill-posed problems in operational transfer path analysis (OTPA) and improve the stability of solutions, this paper proposes a novel OTPA based on Tikhonov regularization, which considers both the fitting degree and the stability of solutions. Firstly, the fundamental theory of Tikhonov regularization-based OTPA is presented, and comparative studies are provided to validate its effectiveness on ill-posed problems. Secondly, transfer path analysis and source contribution evaluations for numerical case studies on spherical radiating acoustical sources are comparatively studied. Finally, transfer path analysis and source contribution evaluations for experimental case studies on a test bed with thin shell structures are provided. This study provides more accurate transfer path analysis for mechanical systems, which can benefit vibration reduction through structural path optimization. Furthermore, with accurate evaluation of source contributions, vibration monitoring and control by actively controlling vibration sources can be effectively carried out.
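
    At its core, OTPA estimates a transfer matrix from operational reference and response spectra by solving a least-squares problem that is often ill-conditioned, and Tikhonov regularization stabilizes that solve. The generic sketch below contrasts the plain and regularized solutions on synthetic, nearly collinear data; the regularization parameter is arbitrary and nothing here reproduces the paper's test bed.

```python
# Generic illustration of a Tikhonov-regularized least-squares solve of the
# kind used in operational transfer path analysis (OTPA): estimate the
# transfer matrix H from reference spectra X and response spectra Y,
#   H = argmin ||X H - Y||^2 + lam * ||H||^2
# Data and the regularization parameter are synthetic/arbitrary.
import numpy as np

rng = np.random.default_rng(3)
n_obs, n_paths = 40, 6
X = rng.standard_normal((n_obs, n_paths))
X[:, 5] = X[:, 4] + 1e-3 * rng.standard_normal(n_obs)   # nearly collinear -> ill-posed
H_true = rng.standard_normal((n_paths, 1))
Y = X @ H_true + 0.01 * rng.standard_normal((n_obs, 1))

def tikhonov(X, Y, lam):
    """Solve (X^T X + lam*I) H = X^T Y."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ Y)

H_plain = tikhonov(X, Y, lam=0.0)      # ordinary least squares (unstable)
H_reg = tikhonov(X, Y, lam=1e-2)       # regularized estimate (stable)

print("plain LS error:      ", np.linalg.norm(H_plain - H_true))
print("regularized LS error:", np.linalg.norm(H_reg - H_true))

# Source contributions at the response point: each path's share of Y
contributions = X * H_reg.T            # column j is path j's contribution
print("per-path RMS contribution:", np.sqrt((contributions ** 2).mean(axis=0)))
```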

  12. Matrix-based introduction to multivariate data analysis

    CERN Document Server

    Adachi, Kohei

    2016-01-01

    This book enables readers who may not be familiar with matrices to understand a variety of multivariate analysis procedures in matrix forms. Another feature of the book is that it emphasizes what model underlies a procedure and what objective function is optimized for fitting the model to data. The author believes that the matrix-based learning of such models and objective functions is the fastest way to comprehend multivariate data analysis. The text is arranged so that readers can intuitively capture the purposes for which multivariate analysis procedures are utilized: plain explanations of the purposes with numerical examples precede mathematical descriptions in almost every chapter. This volume is appropriate for undergraduate students who already have studied introductory statistics. Graduate students and researchers who are not familiar with matrix-intensive formulations of multivariate data analysis will also find the book useful, as it is based on modern matrix formulations with a special emphasis on ...

  13. Data Warehouse Requirements Analysis Framework: Business-Object Based Approach

    Directory of Open Access Journals (Sweden)

    Anirban Sarkar

    2012-01-01

    Detailed requirements analysis plays a key role in the design of a successful Data Warehouse (DW) system. The requirements analysis specifications are used as the prime input for the construction of the conceptual-level multidimensional data model. This paper proposes a Business Object based requirements analysis framework for DW systems which is supported by an abstraction mechanism and reuse capability. It also facilitates the stepwise mapping of requirements descriptions into high-level design components of a graph-semantic based, conceptual-level, object-oriented multidimensional data model. The proposed framework starts with the identification of the analytical requirements using a business process driven approach and finally refines the requirements in further detail to map them into the conceptual-level DW design model using either a Demand-driven or a Mixed-driven approach for DW requirements analysis.

  14. NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In industrial process settings, principal component analysis (PCA) is a common method for data reconciliation. However, PCA is sometimes unsuitable for nonlinear feature analysis and is limited in its application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA and can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that the original data are first mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in that feature space. Then nonlinear feature analysis is implemented and the data are reconstructed using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process and that the reconciled data represent the true information of the nonlinear process.
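
    A rough sketch of the reconstruction step is given below using scikit-learn's KernelPCA with the pre-image approximation enabled: noisy measurements from a synthetic nonlinear process are projected onto the leading kernel principal components and then reconstructed. The kernel, its parameters, and the data are illustrative stand-ins for the ternary distillation column.

```python
# Hedged sketch of KPCA-based data reconciliation: project noisy nonlinear
# process measurements onto the leading kernel principal components and
# reconstruct (denoise) them via the pre-image approximation.
# The process data are synthetic; kernel and parameters are illustrative.
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(4)
u = rng.uniform(0, 1, 500)
# A nonlinear "process": three measured variables driven by one latent variable
clean = np.column_stack([u, u ** 2, np.sin(3 * u)])
noisy = clean + 0.05 * rng.standard_normal(clean.shape)

kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True)     # enables approximate pre-images
scores = kpca.fit_transform(noisy)
reconciled = kpca.inverse_transform(scores)

print("RMS error before reconciliation:", np.sqrt(((noisy - clean) ** 2).mean()))
print("RMS error after reconciliation: ", np.sqrt(((reconciled - clean) ** 2).mean()))
```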

  15. Intelligent Hybrid Cluster Based Classification Algorithm for Social Network Analysis

    Directory of Open Access Journals (Sweden)

    S. Muthurajkumar

    2014-05-01

    In this paper, we propose a hybrid clustering-based classification algorithm, based on a mean approach, to effectively classify and mine ordered sequences (paths) from weblog data in order to perform social network analysis. In the system proposed in this work for social pattern analysis, the sequences of human activities are typically analyzed by switching behaviors, which are likely to produce overlapping clusters. A robust modified boosting algorithm is proposed for the hybrid clustering-based classification used to cluster the data. This work is useful for providing a connection between the aggregated features from the network data and the traditional indices used in social network analysis. Experimental results show that the proposed algorithm improves the decision results from data clustering when combined with the proposed classification algorithm, and it is shown to provide better classification accuracy when tested with a weblog dataset. In addition, the algorithm improves predictive performance, especially for multiclass datasets, which increases the accuracy.

  16. UML based risk analysis - Application to a medical robot

    OpenAIRE

    Guiochet, Jérémie; Baron, Claude

    2004-01-01

    Medical robots perform complex tasks and share their working area with humans. Therefore, they belong to safety critical systems. In today's development processes, safety is often managed by way of dependability techniques. We propose a new global approach, based on the risk concept, in order to guide designers through the safety analysis of such complex systems. Safety depends on the risk management activity, whose core is risk analysis. This consists of three steps: system definition, haz...

  17. European Climate - Energy Security Nexus. A model based scenario analysis

    International Nuclear Information System (INIS)

    In this research, we have provided an overview of the climate-security nexus in the European sector through a model based scenario analysis with the POLES model. The analysis underlines that under stringent climate policies, Europe takes advantage of a double dividend in its capacity to develop a new cleaner energy model and in its lower vulnerability to potential shocks on the international energy markets. (authors)

  18. Study of engine noise based on independent component analysis

    Institute of Scientific and Technical Information of China (English)

    HAO Zhi-yong; JIN Yan; YANG Chen

    2007-01-01

    Independent component analysis was applied to analyze the acoustic signals from a diesel engine. First the basic principle of independent component analysis (ICA) is reviewed. The diesel engine acoustic signal was decomposed into several independent components (ICs); Fourier transform and continuous wavelet transform (CWT) were applied to analyze the independent components. Different noise sources of the diesel engine were separated, based on the characteristics of the different components in the time-frequency domain.
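
    The separation idea can be illustrated with scikit-learn's FastICA, as below: two synthetic source signals are mixed as if recorded by two microphones and then unmixed into independent components. Real engine recordings and the wavelet analysis of the components are not reproduced here.

```python
# Illustration of ICA-based separation of mixed acoustic signals, in the
# spirit of the engine-noise study: synthetic source signals are mixed and
# then recovered with FastICA. Real engine data and the wavelet analysis
# of the components are not reproduced here.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 8000)

# Two synthetic "engine" sources: periodic combustion-like bursts and
# broadband mechanical noise
s1 = np.sign(np.sin(2 * np.pi * 25 * t)) * np.exp(-((t * 25) % 1) * 5)
s2 = 0.5 * rng.standard_normal(len(t))
S = np.column_stack([s1, s2])

# Mix the sources as if recorded by two microphones
A = np.array([[1.0, 0.6],
              [0.4, 1.0]])
X = S @ A.T

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)          # estimated independent components

# Correlate estimates with true sources (sign/order of ICs is arbitrary)
corr = np.corrcoef(np.column_stack([S, S_est]).T)[:2, 2:]
print("correlation between true sources and recovered ICs:\n", np.round(corr, 2))
```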

  19. iBarcode.org: web-based molecular biodiversity analysis

    OpenAIRE

    Hajibabaei Mehrdad; Singer Gregory AC

    2009-01-01

    Background DNA sequences have become a primary source of information in biodiversity analysis. For example, short standardized species-specific genomic regions, DNA barcodes, are being used as a global standard for species identification and biodiversity studies. Most DNA barcodes are being generated by laboratories that have an expertise in DNA sequencing but not in bioinformatics data analysis. Therefore, we have developed a web-based suite of tools to help the DNA barcode research...

  20. Using the DOE Knowledge Base for Special Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanos), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  1. Segmentation of Stick Text Based on Sub Connected Area Analysis

    Institute of Scientific and Technical Information of China (English)

    高静波; 李新友; 等

    1998-01-01

    A new stick text segmentation method based on sub connected area analysis is introduced in this paper. The foundation of this method is the sub connected area representation of the text image, which can represent all connected areas in an image efficiently. The method consists mainly of four steps: sub connected area classification, finding an initial boundary following point, finding the optimal segmentation point by boundary tracing, and text segmentation. This method is similar to the boundary analysis method but is more efficient.

  2. RULE-BASED SENTIMENT ANALYSIS OF UKRAINIAN REVIEWS

    Directory of Open Access Journals (Sweden)

    Mariana Romanyshyn

    2013-07-01

    The last decade witnessed a lot of research in the field of sentiment analysis. Understanding the attitude and the emotions that people express in written text proved to be really important and helpful in sociology, political science, psychology, market research, and, of course, artificial intelligence. This paper demonstrates a rule-based approach to clause-level sentiment analysis of reviews in Ukrainian. The general architecture of the implemented sentiment analysis system is presented, the current stage of research is described and further work is explained. The main emphasis is made on the design of rules for computing sentiments.

  3. AN HMM BASED ANALYSIS FRAMEWORK FOR SEMANTIC VIDEO EVENTS

    Institute of Scientific and Technical Information of China (English)

    You Junyong; Liu Guizhong; Zhang Yaxin

    2007-01-01

    Semantic video analysis plays an important role in the field of machine intelligence and pattern recognition. In this paper, based on the Hidden Markov Model (HMM), a semantic recognition framework for compressed videos is proposed to analyze video events according to six low-level features. After a detailed analysis of video events, the pattern of global motion and five features of the foreground, the principal parts of videos, are employed as the observations of the Hidden Markov Model to classify events in videos. The applications of the proposed framework in some video event detections demonstrate the promising success of the proposed framework for semantic video analysis.
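
    To ground the HMM part, the sketch below uses the hmmlearn package: one Gaussian HMM is trained per event class on sequences of low-level feature vectors, and a new sequence is assigned to the class whose model yields the highest log-likelihood. The feature sequences and class names are synthetic stand-ins for the six low-level features in the paper.

```python
# Minimal sketch of HMM-based event classification: train one Gaussian HMM
# per event class on sequences of low-level feature vectors, then label a
# new sequence by the model with the highest log-likelihood.
# Features and class names are synthetic stand-ins.
import numpy as np
from hmmlearn import hmm

rng = np.random.default_rng(6)

def make_sequences(offset, n_seq=20, seq_len=50, n_feat=6):
    """Synthetic sequences of 6 low-level features for one event class."""
    return [offset + 0.1 * np.cumsum(rng.standard_normal((seq_len, n_feat)), axis=0)
            for _ in range(n_seq)]

classes = {"goal_event": make_sequences(0.0), "crowd_scene": make_sequences(2.0)}

models = {}
for label, seqs in classes.items():
    X = np.vstack(seqs)
    lengths = [len(s) for s in seqs]
    m = hmm.GaussianHMM(n_components=3, covariance_type="diag", n_iter=50,
                        random_state=0)
    m.fit(X, lengths)                 # Baum-Welch training per class
    models[label] = m

# Classify an unseen sequence by maximum log-likelihood over the class models
test_seq = make_sequences(2.0, n_seq=1)[0]
scores = {label: m.score(test_seq) for label, m in models.items()}
print(scores, "->", max(scores, key=scores.get))
```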

  4. Stability analysis of underground engineering based on multidisciplinary design optimization

    Institute of Scientific and Technical Information of China (English)

    MA Rong; ZHOU Ke-ping; GAO Feng

    2008-01-01

    Aiming at the characteristics of underground engineering, this paper analyzes the feasibility of Multidisciplinary Design Optimization (MDO) in underground engineering, and puts forward a modularization-based MDO method and the idea of using MDO to resolve problems in stability analysis, proving the validity and feasibility of using MDO in underground engineering. The characteristics of uncertainty, complexity and nonlinearity become a bottleneck for carrying out underground engineering stability analysis by MDO. Therefore, the application of MDO in underground engineering stability analysis is still at an exploratory stage, which needs further research.

  5. Stability analysis of underground engineering based on multidisciplinary design optimization

    Institute of Scientific and Technical Information of China (English)

    MA Rong; ZHOU Ke-ping; GAO Feng

    2008-01-01

    Aiming at the characteristics of underground engineering, this paper analyzes the feasibility of Multidisciplinary Design Optimization (MDO) in underground engineering, and puts forward a modularization-based MDO method and the idea of using MDO to resolve problems in stability analysis, proving the validity and feasibility of using MDO in underground engineering. The characteristics of uncertainty, complexity and nonlinearity become a bottleneck for carrying out underground engineering stability analysis by MDO. Therefore, the application of MDO in underground engineering stability analysis is still at an exploratory stage, which needs further research.

  6. Open access for ALICE analysis based on virtualization technology

    CERN Document Server

    Buncic, P; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modi...

  7. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  8. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard;

    2011-01-01

    This paper presents a probabilistic approach for structural robustness assessment for a timber structure built a few years ago. The robustness analysis is based on a structural reliability based framework for robustness and a simplified mechanical system modelling of a timber truss system. A comp...... is expressed and evaluated by a robustness index. Next, the robustness is assessed using system reliability indices where the probabilistic failure model is modelled by a series system of parallel systems....

  9. Customer-Classified Algorithm Based on Fuzzy Clustering Analysis

    Institute of Scientific and Technical Information of China (English)

    郭蕴华; 祖巧红; 陈定方

    2004-01-01

    A customer-classified evaluation system is described with a customization-supporting tree of evaluation indexes, in which users can determine any evaluation index independently. Based on this system, a customer-classified algorithm based on fuzzy clustering analysis is proposed to implement customer-classified management. A numerical example is presented, which provides correct results, indicating that the algorithm can be used in the decision support system of CRM.

  10. Semiparametric theory based MIMO model and performance analysis

    Institute of Scientific and Technical Information of China (English)

    XU Fang-min; XU Xiao-dong; ZHANG Ping

    2007-01-01

    In this article, a new approach for modeling multi-input multi-output (MIMO) systems with unknown nonlinear interference is introduced. A semiparametric theory based MIMO model is established, and kernel estimation is applied to combat the nonlinear interference. Furthermore, we derive the MIMO capacity for these systems and explore the asymptotic properties of the new channel matrix via theoretical analysis. The simulation results show that the semiparametric theory based modeling and kernel estimation are valid for combating this kind of interference.

  11. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  12. The analysis of Al-based alloys by calorimetry: quantitative analysis of reactions and reaction kinetics

    OpenAIRE

    Starink, M.J.

    2004-01-01

    Differential scanning calorimetry (DSC) and isothermal calorimetry have been applied extensively to the analysis of light metals, especially Al based alloys. Isothermal calorimetry and differential scanning calorimetry are used for analysis of solid state reactions, such as precipitation, homogenisation, devitrification and recrystallisation; and solid–liquid reactions, such as incipient melting and solidification, are studied by differential scanning calorimetry. In producing repeatable calo...

  13. Modeling and Grid impedance Variation Analysis of Parallel Connected Grid Connected Inverter based on Impedance Based Harmonic Analysis

    DEFF Research Database (Denmark)

    Kwon, JunBum; Wang, Xiongfei; Bak, Claus Leth;

    2014-01-01

    This paper addresses the harmonic compensation error problem that exists with parallel connected inverters under the same grid interface conditions by means of impedance-based analysis and modeling. Unlike the single grid connected inverter, it is found that multiple parallel connected inverters and the grid impedance can influence each other if they each have a harmonic compensation function. The analysis method proposed in this paper is based on the relationship between the overall output impedance and input impedance of the parallel connected inverters, where the controller gain design method, which can

  14. Situational Analysis: A Framework for Evidence-Based Practice

    Science.gov (United States)

    Annan, Jean

    2005-01-01

    Situational analysis is a framework for professional practice and research in educational psychology. The process is guided by a set of practice principles requiring that psychologists' work is evidence-based, ecological, collaborative and constructive. The framework is designed to provide direction for psychologists who wish to tailor their…

  15. Spinoza II: Conceptual Case-Based Natural Language Analysis.

    Science.gov (United States)

    Schank, Roger C.; And Others

    This paper presents the theoretical changes that have developed in Conceptual Dependency Theory and their ramifications in computer analysis of natural language. The major items of concern are: the elimination of reliance on "grammar rules" for parsing with the emphasis given to conceptual rule based parsing; the development of a conceptual case…

  16. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  17. Teaching of Editorial Writing Uses Claims-Based Analysis.

    Science.gov (United States)

    Porter, William C.

    1989-01-01

    Urges the use of claims-based analysis in editorial writing instruction. Explains the use of five hierarchical claim types (factual, definitional, causal, value, and policy) to teach students to analyze and formulate arguments, thus teaching editorial writing by focusing more on the process than on the product. (SR)

  18. Spironolactone use and renal toxicity: population based longitudinal analysis.

    OpenAIRE

    Wei, L; Struthers, A D; Fahey, T; Watson, A D; MacDonald, T. M.

    2010-01-01

    Objective: To determine the safety of spironolactone prescribing in the setting of the UK National Health Service. Design: Population based longitudinal analysis using a record linkage database. Setting: Tayside, Scotland. Population: All patients who received one or more dispensed prescriptions for spironolactone between 1994 and 2007. Main outcome measures: Rates of prescribing for spironolactone, hospital admissions for hyperkalaemia, and hyperkalaemia and renal function without...

  19. Advancing School-Based Interventions through Economic Analysis

    Science.gov (United States)

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  20. Graph- versus Vector-Based Analysis of a Consensus Protocol

    NARCIS (Netherlands)

    Delzanno, Giorgio; Rensink, Arend; Traverso, Riccardo; Bošnački, Dragan; Edelkamp, Stefan; Lluch Lafuente, Alberto; Wijs, Anton

    2014-01-01

    The Paxos distributed consensus algorithm is a challenging case-study for standard, vector-based model checking techniques. Due to asynchronous communication, exhaustive analysis may generate very large state spaces already for small model instances. In this paper, we show the advantages of graph tr

  1. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k?=?15 studies of 25 years of research with a total sample size of…

  2. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    Science.gov (United States)

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  3. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Franqueira, Virginia N.L.; Tun, Thein Tan; Wieringa, Roel J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about atta

  4. GIS based analysis of future district heating potential in Denmark

    DEFF Research Database (Denmark)

    Nielsen, Steffen; Möller, Bernd

    2013-01-01

    in Denmark have been mapped in a heat atlas which includes all buildings and their heat demands. This article focuses on developing a method for assessing the costs associated with supplying these buildings with DH. The analysis is based on the existing DH areas in Denmark. By finding the heat production...

  5. A Corpus-based Analysis of English Noun Suffixes

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper provides a brief analysis of English noun suffixes. First, the English noun suffixes are classified etymologically; then, the frequency of each noun suffix is obtained in the sub-corpora FR88 and WSJ88; and last, a conclusion is drawn based on the statistics, namely that the word origins reveal their influence on English vocabulary.
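
    As an illustration of the kind of frequency count described above, the sketch below tallies noun-suffix occurrences in a token list with Python; the suffix inventory, the etymological grouping and the corpus loading are stand-ins, since the record does not give the paper's actual lists or access to FR88/WSJ88.

    ```python
    from collections import Counter

    # Illustrative noun suffixes grouped by origin (hypothetical, not the paper's list).
    SUFFIXES = {
        "Latin/French": ["tion", "ment", "ance", "ence", "ity"],
        "Greek": ["ism", "ist", "logy"],
        "Germanic": ["ness", "ship", "hood", "dom"],
    }

    def suffix_frequencies(tokens):
        """Count how many tokens end in each suffix, keyed by suffix origin."""
        counts = Counter()
        for token in tokens:
            word = token.lower()
            for origin, suffixes in SUFFIXES.items():
                for suf in suffixes:
                    if word.endswith(suf) and len(word) > len(suf) + 2:
                        counts[(origin, suf)] += 1
        return counts

    # Usage with a stand-in token list (a real corpus such as FR88/WSJ88 would be loaded here).
    tokens = "the government made an announcement about the hardness of capitalism".split()
    for (origin, suf), n in suffix_frequencies(tokens).most_common():
        print(f"{origin:15s} -{suf:6s} {n}")
    ```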

  6. Project-Based Language Learning: An Activity Theory Analysis

    Science.gov (United States)

    Gibbes, Marina; Carson, Lorna

    2014-01-01

    This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…

  7. Management of Microbiologically Influenced Corrosion in Risk Based Inspection analysis

    DEFF Research Database (Denmark)

    Skovhus, Torben Lund; Hillier, Elizabeth; Andersen, Erlend S.

    2016-01-01

    Operating offshore oil and gas production facilities is often associated with high risk. In order to manage the risk, operators commonly use aids to support decision making in the establishment of a maintenance and inspection strategy. Risk Based Inspection (RBI) analysis is widely used in the of...

  8. Frailty phenotypes in the elderly based on cluster analysis

    DEFF Research Database (Denmark)

    Dato, Serena; Montesanto, Alberto; Lagani, Vincenzo;

    2012-01-01

    genetic background on the frailty status is still questioned. We investigated the applicability of a cluster analysis approach based on specific geriatric parameters, previously set up and validated in a southern Italian population, to two large longitudinal Danish samples. In both cohorts, we identified...

  9. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Counsil. The planned...

  10. LES based POD analysis of Jet in Cross Flow

    DEFF Research Database (Denmark)

    Cavar, Dalibor; Meyer, Knud Erik; Jakirlic, S.;

    2010-01-01

    The paper presents results of a POD investigation of the LES based numerical simulation of the jet-in-crossflow (JICF) flowfield. LES results are firstly compared to the pointwise LDA measurements. 2D POD analysis is then used as a comparison basis for PIV measurements and LES, and finally 3D POD...

  11. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of the temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
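
    The record describes extremes, phase durations, impulses and symmetry indices derived from the Fx(t), Fy(t), Fz(t) curves. The following Python sketch (the original used a MATLAB script) computes a small subset of such variables from one stance phase per leg; the synthetic force shapes, sampling rate and variable selection are illustrative assumptions, not the paper's 78-variable set.

    ```python
    import numpy as np

    def analyze_stance(fy, fz, fs):
        """Extract a few of the GRF variables described above from one stance phase.

        fy : anteroposterior force samples (N), negative = braking, positive = propulsion
        fz : vertical force samples (N)
        fs : sampling frequency (Hz)
        """
        t = np.arange(len(fy)) / fs
        braking = fy < 0
        return {
            "fz_peak": float(fz.max()),
            "fy_braking_peak": float(fy.min()),
            "fy_propulsive_peak": float(fy.max()),
            "braking_duration": float(braking.sum() / fs),
            "propulsive_duration": float((~braking).sum() / fs),
            "braking_impulse": float(np.trapz(np.where(braking, fy, 0.0), t)),
            "propulsive_impulse": float(np.trapz(np.where(~braking, fy, 0.0), t)),
        }

    def symmetry_index(left, right):
        """Classic symmetry index: 0 means perfectly symmetric."""
        return 100.0 * (left - right) / (0.5 * (left + right))

    # Usage with synthetic force-plate data for one stance phase per leg.
    fs = 1000
    t = np.linspace(0, 0.6, int(0.6 * fs))
    fz = 800 * np.sin(np.pi * t / 0.6)        # vertical force (toy shape)
    fy = -120 * np.sin(2 * np.pi * t / 0.6)   # braking first, then propulsion
    left = analyze_stance(fy, fz, fs)
    right = analyze_stance(0.95 * fy, 1.02 * fz, fs)
    print(left)
    print("Fz peak symmetry:", symmetry_index(left["fz_peak"], right["fz_peak"]))
    ```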

  12. Teaching-Learning Activity Modeling Based on Data Analysis

    Directory of Open Access Journals (Sweden)

    Kyungrog Kim

    2015-03-01

    Full Text Available Numerous studies are currently being carried out on personalized services based on data analysis to find and provide valuable information amid information overload. Furthermore, the number of studies on data analysis of teaching-learning activities for personalized services in the field of teaching-learning is increasing, too. This paper proposes a learning style recency-frequency-durability (LS-RFD) model for quantified analysis of the level of activities of learners, to provide the elements of teaching-learning activities according to the learning style of the learner among various parameters for personalized service. This measures preferences as to teaching-learning activity according to the recency, frequency and durability of such activities. Based on the results, user characteristics can be classified into groups for teaching-learning activity by categorizing the level of preference and activity of the learner.
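
    A minimal sketch of an RFD-style score is given below, assuming a hypothetical activity log and equal weights for recency, frequency and durability; the actual LS-RFD formulation in the paper is not reproduced here.

    ```python
    from datetime import datetime

    # Hypothetical activity log: (learner, activity_type, timestamp, duration_minutes)
    log = [
        ("alice", "video", datetime(2015, 3, 1), 25),
        ("alice", "quiz",  datetime(2015, 3, 5), 10),
        ("alice", "video", datetime(2015, 3, 6), 40),
        ("bob",   "forum", datetime(2015, 2, 20), 15),
        ("bob",   "quiz",  datetime(2015, 3, 4), 30),
    ]

    def rfd_scores(log, now, weights=(1.0, 1.0, 1.0)):
        """Score each (learner, activity) by recency, frequency and durability.

        Recency is inverted (more recent -> higher score); the weights are assumed
        to be chosen by the analyst, not taken from the paper.
        """
        wr, wf, wd = weights
        stats = {}
        for learner, activity, ts, minutes in log:
            key = (learner, activity)
            last, freq, dur = stats.get(key, (ts, 0, 0.0))
            stats[key] = (max(last, ts), freq + 1, dur + minutes)
        return {
            key: wr / (1 + (now - last).days) + wf * freq + wd * dur / 60.0
            for key, (last, freq, dur) in stats.items()
        }

    for key, score in sorted(rfd_scores(log, datetime(2015, 3, 10)).items()):
        print(key, round(score, 2))
    ```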

  13. Unified HMM-based layout analysis framework and algorithm

    Institute of Scientific and Technical Information of China (English)

    陈明; 丁晓青; 吴佑寿

    2003-01-01

    To handle the layout analysis problem for complex or irregular document images, a unified HMM-based layout analysis framework is presented in this paper. Based on the multi-resolution wavelet analysis results of the document image, HMM methods are used in both the inner-scale image model and the trans-scale context model to classify pixel region properties, such as text, picture or background. In each scale, an HMM direct segmentation method is used to get a better inner-scale classification result. Then another HMM method is used to fuse the inner-scale results of each scale and obtain a better final segmentation result. The optimized algorithm uses a stop rule in the coarse-to-fine multi-scale segmentation process, so the speed is improved remarkably. Experiments prove the efficiency of the proposed algorithm.

  14. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    Science.gov (United States)

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  15. ODVBA: optimally-discriminative voxel-based analysis.

    Science.gov (United States)

    Zhang, Tianhao; Davatzikos, Christos

    2011-08-01

    Gaussian smoothing of images prior to applying voxel-based statistics is an important step in voxel-based analysis and statistical parametric mapping (VBA-SPM) and is used to account for registration errors, to Gaussianize the data and to integrate imaging signals from a region around each voxel. However, it has also become a limitation of VBA-SPM based methods, since it is often chosen empirically and lacks spatial adaptivity to the shape and spatial extent of the region of interest, such as a region of atrophy or functional activity. In this paper, we propose a new framework, named optimally-discriminative voxel-based analysis (ODVBA), for determining the optimal spatially adaptive smoothing of images, followed by applying voxel-based group analysis. In ODVBA, nonnegative discriminative projection is applied regionally to get the direction that best discriminates between two groups, e.g., patients and controls; this direction is equivalent to local filtering by an optimal kernel whose coefficients define the optimally discriminative direction. By considering all the neighborhoods that contain a given voxel, we then compose this information to produce the statistic for each voxel. Finally, permutation tests are used to obtain a statistical parametric map of group differences. ODVBA has been evaluated using simulated data in which the ground truth is known and with data from an Alzheimer's disease (AD) study. The experimental results have shown that the proposed ODVBA can precisely describe the shape and location of structural abnormality.

  16. Protein expression based multimarker analysis of breast cancer samples

    Directory of Open Access Journals (Sweden)

    Rajasekaran Ayyappan K

    2011-06-01

    Full Text Available Abstract Background Tissue microarray (TMA) data are commonly used to validate the prognostic accuracy of tumor markers. For example, breast cancer TMA data have led to the identification of several promising prognostic markers of survival time. Several studies have shown that TMA data can also be used to cluster patients into clinically distinct groups. Here we use breast cancer TMA data to cluster patients into distinct prognostic groups. Methods We apply weighted correlation network analysis (WGCNA) to TMA data consisting of 26 putative tumor biomarkers measured on 82 breast cancer patients. Based on this analysis we identify three groups of patients with low (5.4%), moderate (22%) and high (50%) mortality rates, respectively. We then develop a simple threshold rule using a subset of three markers (p53, Na-KATPase-β1, and TGF-β receptor II) that can approximately define these mortality groups. We compare the results of this correlation network analysis with results from a standard Cox regression analysis. Results We find that the rule-based grouping variable (referred to as WGCNA*) is an independent predictor of survival time. While WGCNA* is based on protein measurements (TMA data), it was validated in two independent Affymetrix microarray gene expression data sets (which measure mRNA abundance). We find that the WGCNA patient groups differed by 35% from mortality groups defined by a more conventional stepwise Cox regression analysis approach. Conclusions We show that correlation network methods, which are primarily used to analyze the relationships between gene products, are also useful for analyzing the relationships between patients and for defining distinct patient groups based on TMA data. We identify a rule based on three tumor markers for predicting breast cancer survival outcomes.
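
    As an illustration of a three-marker threshold rule of the kind mentioned above, the sketch below maps the number of markers exceeding a cutoff to a risk group; the cutoffs and the mapping are hypothetical, not the WGCNA*-derived rule from the study.

    ```python
    import numpy as np

    def mortality_group(p53, na_k_atpase_b1, tgfbr2, cutoffs=(0.5, 0.5, 0.5)):
        """Toy threshold rule on three markers (cutoffs are hypothetical, not the paper's).

        Counts how many markers exceed their cutoff and maps the count to a risk group.
        """
        n_high = sum(v > c for v, c in zip((p53, na_k_atpase_b1, tgfbr2), cutoffs))
        return ("low", "moderate", "high", "high")[n_high]

    # Usage on a small array of patients (columns: p53, Na-KATPase-beta1, TGF-beta RII).
    patients = np.array([[0.2, 0.3, 0.1],
                         [0.7, 0.4, 0.2],
                         [0.8, 0.9, 0.6]])
    print([mortality_group(*row) for row in patients])
    ```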

  17. Content-based analysis and indexing of sports video

    Science.gov (United States)

    Luo, Ming; Bai, Xuesheng; Xu, Guang-you

    2001-12-01

    An explosion of on-line image and video data in digital form is already well underway. With the exponential rise in interactive information exploration and dissemination through the World-Wide Web, the major inhibitors of rapid access to on-line video data are the management of capture and storage, and content-based intelligent search and indexing techniques. This paper proposes an approach for content-based analysis and event-based indexing of sports video. It includes a novel method to organize shots - classifying shots as close shots and far shots, an original idea of blur extent-based event detection, and an innovative local mutation-based algorithm for caption detection and retrieval. Results on extensive real TV programs demonstrate the applicability of our approach.

  18. Electronic Forms-Based Computing for Evidentiary Analysis

    Directory of Open Access Journals (Sweden)

    Andy Luse

    2009-09-01

    Full Text Available The paperwork associated with evidentiary collection and analysis is a highly repetitive and time-consuming process which often involves duplication of work and can frequently result in documentary errors. Electronic entry of evidence-related information can facilitate greater accuracy and less time spent on data entry. This manuscript describes a general framework for the implementation of an electronic tablet-based system for evidentiary processing. This framework is then utilized in the design and implementation of an electronic tablet-based evidentiary input prototype system developed for use by forensic laboratories which serves as a verification of the proposed framework. The manuscript concludes with a discussion of implications and recommendations for the implementation and use of tablet-based computing for evidence analysis.

  19. Research on supplier evaluation and selection based on fuzzy hierarchy analysis and grey relational analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Considering the disadvantages of selecting supplier evaluation indices based on old purchase relationships, and in view of the transformation of the relationship between manufacturer and supplier in a dynamic, cooperative, competitive and quick-response environment, research on supplier selection evaluation was presented based on enterprise capability, cooperation degree and service level from the perspective of cooperative partnership and coordination, and the evaluation index system was established. A more objective and accurate supplier selection and evaluation method based on the fuzzy analytic hierarchy process and grey relational analysis was developed, and empirical research on an electric equipment manufacturer was then carried out to analyze the supplier selection and evaluation.

  20. Physics-Based Simulator for NEO Exploration Analysis & Simulation

    Science.gov (United States)

    Balaram, J.; Cameron, J.; Jain, A.; Kline, H.; Lim, C.; Mazhar, H.; Myint, S.; Nayar, H.; Patton, R.; Pomerantz, M.; Quadrelli, M.; Shakkotai, P.; Tso, K.

    2011-01-01

    As part of the Space Exploration Analysis and Simulation (SEAS) task, the National Aeronautics and Space Administration (NASA) is using physics-based simulations at NASA's Jet Propulsion Laboratory (JPL) to explore potential surface and near-surface mission operations at Near Earth Objects (NEOs). The simulator is under development at JPL and can be used to provide detailed analysis of various surface and near-surface NEO robotic and human exploration concepts. In this paper we describe the SEAS simulator and provide examples of recent mission systems and operations concepts investigated using the simulation. We also present related analysis work and tools developed for both the SEAS task as well as general modeling, analysis and simulation capabilities for asteroid/small-body objects.

  1. THE ENERGY ISSUES: A CORPUS-BASED ANALYSIS

    Directory of Open Access Journals (Sweden)

    Maria-Floriana Popescu

    2015-06-01

    Full Text Available Energy-related issues have become of paramount importance in recent years due to the exhaustion of fossil fuel resources, their price variations and political dependence on nations acting as energy providers. In addition, the change in climate conditions requires reducing emissions of greenhouse gases. Therefore, this paper proposes and assesses the novel idea of using constructions as a unit of analysis for corpus-based discourse analysis in the energy field. This article uses a standard method in linguistics to quantitatively investigate academic writings from both a lexical and a stylistic perspective. The present research paper aims to add to the knowledge about energy issues by conducting a data-driven analysis of economic academic discourse, across three periods of time and using secondary data analysis. The findings suggest the evolution of the energy sector throughout time; they give interesting and different insights into the content of the discourses and enable better comparison of corpora.

  2. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore, complements other approaches that use large corpus (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  3. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    OpenAIRE

    Jingzheng Ren; Alessandro Manzardo; Anna Mazzi; Andrea Fedele; Antonio Scipioni

    2013-01-01

    Biodiesel, as a promising alternative energy resource, has become a hot spot in chemical engineering, but there is also debate about the sustainability of biodiesel. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options, are studied by emergy analysis; the soybean-based scenario is recognized as t...

  4. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, the approach has encountered the high computational complexity problem because the participants of protocols are arbitrary, their message structures are complex and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the Cryptographic Protocol Algebra (CPA) model proposed recently, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis processes are much reduced by introducing a new algebraic technique called the Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not reported previously was detected by using this tool.

  5. Analysis of system trustworthiness based on information flow noninterference theory

    Institute of Scientific and Technical Information of China (English)

    Xiangying Kong; Yanhui Chen; Yi Zhuang

    2015-01-01

    The trustworthiness analysis and evaluation are the bases of the trust chain transfer. In this paper the formal method of trustworthiness analysis of a system based on the noninterference (NI) theory of the information flow is studied. Firstly, existing methods cannot analyze the impact of the system states on the trustworthiness of software during the process of trust chain transfer. To solve this problem, the impact of the system state on trustworthiness of software is investigated, the run-time mutual interference behavior of software entities is described and an interference model of the access control automaton of a system is established. Secondly, based on the intransitive noninterference (INI) theory, a formal analytic method of trustworthiness for trust chain transfer is proposed, providing a theoretical basis for the analysis of dynamic trustworthiness of software during the trust chain transfer process. Thirdly, a prototype system with dynamic trustworthiness on a platform with dual core architecture is constructed and a verification algorithm of the system trustworthiness is provided. Finally, the monitor hypothesis is extended to the dynamic monitor hypothesis, and a theorem of static judgment rule of system trustworthiness is provided, which is useful to prove dynamic trustworthiness of a system at the beginning of system construction. Compared with previous work in this field, this research proposes not only a formal analytic method for the determination of system trustworthiness, but also a modeling method and an analysis algorithm that are feasible for practical implementation.

  6. ROOT based Offline and Online Analysis (ROAn): An analysis framework for X-ray detector data

    International Nuclear Information System (INIS)

    The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform data analysis on data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensors developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT's features like run-time type information and reflection. ROAn provides an analysis program which allows to perform configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps which produce analysis results. The dependency between results and thus the order of calculation is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and only the necessary recalculations are triggered. This saves time and simultaneously keeps the results consistent. The viewer program offers a configurable Graphical User Interface (GUI) and process chain, which allows the user to adapt the program to different tasks such as offline viewing of file data, online monitoring of running detector systems, or performing online data analysis (histogramming, calibration, etc.). Because of its modular design, ROAn can be extended easily, e.g. be adapted to new detector types and analysis processes

  7. ROOT based Offline and Online Analysis (ROAn): An analysis framework for X-ray detector data

    Science.gov (United States)

    Lauf, Thomas; Andritschke, Robert

    2014-10-01

    The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform data analysis on data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensors developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT's features like run-time type information and reflection. ROAn provides an analysis program which allows to perform configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps which produce analysis results. The dependency between results and thus the order of calculation is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and only the necessary recalculations are triggered. This saves time and simultaneously keeps the results consistent. The viewer program offers a configurable Graphical User Interface (GUI) and process chain, which allows the user to adapt the program to different tasks such as offline viewing of file data, online monitoring of running detector systems, or performing online data analysis (histogramming, calibration, etc.). Because of its modular design, ROAn can be extended easily, e.g. be adapted to new detector types and analysis processes.

  8. Consistency analysis of accelerated degradation mechanism based on gray theory

    Institute of Scientific and Technical Information of China (English)

    Yunxia Chen; Hongxia Chen; Zhou Yang; Rui Kang; Yi Yang

    2014-01-01

    A fundamental premise of an accelerated testing is that the failure mechanism under elevated and normal stress levels should remain the same. Thus, verification of the consistency of failure mechanisms is essential during an accelerated testing. A new consistency analysis method based on the gray theory is proposed for complex products. First of all, existing consistency analysis methods are reviewed with a focus on the comparison of the differences among them. Then, the proposed consistency analysis method is introduced. Two effective gray prediction models, the gray dynamic model and the new information and equal dimensional (NIED) model, are adapted in the proposed method. The process to determine the dimension of the NIED model is also discussed, and a decision rule is expanded. Based on that, the procedure of applying the new consistency analysis method is developed. Finally, a case study of the consistency analysis of a reliability enhancement testing is conducted to demonstrate and validate the proposed method.
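
    A minimal GM(1,1) gray prediction model, one of the standard gray dynamic models referred to above, can be sketched as follows; the example data and the idea of comparing the fitted development coefficient across stress levels are illustrative assumptions, not the paper's NIED procedure or decision rule.

    ```python
    import numpy as np

    def gm11(x0, n_forecast=3):
        """Minimal GM(1,1) gray prediction model.

        x0 : 1-D array of observed (degradation) values, all positive.
        Returns the development coefficient a and fitted/forecast values.
        """
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                         # accumulated generating operation
        z1 = 0.5 * (x1[1:] + x1[:-1])              # mean generating sequence
        B = np.column_stack([-z1, np.ones_like(z1)])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
        k = np.arange(len(x0) + n_forecast)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        return a, np.concatenate([[x0[0]], np.diff(x1_hat)])

    # Usage: fit one stress level's degradation data; comparing the coefficient 'a'
    # between stress levels is one (assumed) way to probe mechanism consistency.
    a, fitted = gm11([2.87, 3.28, 3.34, 3.69, 3.80])
    print(round(float(a), 4), np.round(fitted, 2))
    ```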

  9. Risk-based planning analysis for a single levee

    Science.gov (United States)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
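
    A toy version of the optimization described above (minimizing expected annual damage from overtopping and through-seepage plus annualized construction cost over levee height and crown width) might look like the sketch below; the hydrology, fragility curve, damage potential and cost figures are all invented for illustration.

    ```python
    import numpy as np
    from scipy import stats

    # All numbers below are illustrative assumptions, not values from the study.
    DAMAGE = 50e6           # flood damage potential ($)
    COST_PER_M2 = 2000.0    # annualized construction cost per m^2 of cross section ($/m^2/yr)

    def annual_expected_cost(height, crown_width, levels=stats.gumbel_r(loc=2.0, scale=1.0)):
        """Expected annual damage (overtopping + through-seepage) plus annualized cost."""
        p_overtop = levels.sf(height)
        # Toy fragility: through-seepage probability at high water falls with crown width.
        p_seepage = (1 - p_overtop) * np.exp(-0.4 * crown_width) * levels.sf(0.6 * height)
        expected_damage = DAMAGE * (p_overtop + p_seepage)
        construction = COST_PER_M2 * height * (crown_width + 2.0 * height)  # rough trapezoid area
        return expected_damage + construction

    # Grid search over candidate designs.
    heights = np.arange(2.0, 6.01, 0.25)
    widths = np.arange(1.0, 10.01, 0.5)
    best = min((annual_expected_cost(h, w), h, w) for h in heights for w in widths)
    print(f"optimal height {best[1]:.2f} m, crown width {best[2]:.2f} m, cost ${best[0]:,.0f}/yr")
    ```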

  10. Barcode Server: A Visualization-Based Genome Analysis System

    Science.gov (United States)

    Mao, Fenglou; Olman, Victor; Wang, Yan; Xu, Ying

    2013-01-01

    We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome, (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode. PMID:23457606
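
    A minimal sketch of the k-mer frequency idea behind the barcode image is shown below: each genome window becomes a row of normalized k-mer frequencies, and windows whose profile deviates from the genome-wide average are flagged. The window size, k, and outlier rule are assumptions for illustration, not the server's actual parameters.

    ```python
    import numpy as np
    from itertools import product

    def kmer_barcode(seq, k=3, window=1000):
        """Frequency matrix (windows x k-mers) that can be rendered as a barcode image."""
        kmers = ["".join(p) for p in product("ACGT", repeat=k)]
        index = {km: i for i, km in enumerate(kmers)}
        rows = []
        for start in range(0, len(seq) - window + 1, window):
            frag = seq[start:start + window]
            counts = np.zeros(len(kmers))
            for i in range(len(frag) - k + 1):
                j = index.get(frag[i:i + k])
                if j is not None:          # skip k-mers containing N, etc.
                    counts[j] += 1
            rows.append(counts / max(counts.sum(), 1))
        return np.array(rows)

    def atypical_windows(barcode, z=2.5):
        """Indices of windows whose k-mer profile deviates from the genome-wide mean."""
        dist = np.linalg.norm(barcode - barcode.mean(axis=0), axis=1)
        return np.where(dist > dist.mean() + z * dist.std())[0]

    # Usage on a random toy sequence with an inserted compositionally distinct island.
    rng = np.random.default_rng(0)
    genome = "".join(rng.choice(list("ACGT"), size=20000))
    genome = genome[:10000] + "GC" * 1000 + genome[10000:]
    print("suspicious windows:", atypical_windows(kmer_barcode(genome)))
    ```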

  11. Barcode server: a visualization-based genome analysis system.

    Directory of Open Access Journals (Sweden)

    Fenglou Mao

    Full Text Available We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome, (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode.

  12. Barcode server: a visualization-based genome analysis system.

    Science.gov (United States)

    Mao, Fenglou; Olman, Victor; Wang, Yan; Xu, Ying

    2013-01-01

    We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome, (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode. PMID:23457606

  13. Towards Performance Measurement And Metrics Based Analysis of PLA Applications

    CERN Document Server

    Ahmed, Zeeshan

    2010-01-01

    This article is about a measurement-analysis-based approach to help software practitioners manage the additional levels of complexity and variability in software product line applications. The architecture of the proposed approach, ZAC, is designed and implemented to perform preprocessed source code analysis, calculate traditional and product line metrics, and visualize results in two- and three-dimensional diagrams. Experiments using real-time data sets are performed, which conclude that ZAC can be very helpful for software practitioners in understanding the overall structure and complexity of product line applications. Moreover, the obtained results prove a strong positive correlation between the calculated traditional and product line measures.

  14. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post-test calculations of FARO Test L-11, the so-called Base Case Test. The FARO Facility, located at JRC Ispra, is used to simulate the consequences of Severe Accidents in Nuclear Power Plants under a variety of conditions. The COMETA Code has a 6-equation two-phase flow field and a 3-phase corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the fuel-coolant interaction pre-mixing phase.

  15. APL-based flexibility analysis of manufacturing grid

    Institute of Scientific and Technical Information of China (English)

    LIU Li-lan; SUN Xue-hua; CAI Hong-xia; CHAI Jian-fei

    2009-01-01

    With the characteristics of diversity, randomness, concurrency and decomposability, tasks in the manufacturing field are very complicated, so a manufacturing grid (MG) should have considerable flexibility to deal with this problem. With the definition of nodes and arcs, the MG structure is converted into a small-world network. Given a construction cost constraint, the problem of shortest task waiting time is transformed into a constrained optimization problem, and a corresponding flexibility analysis model based on average path length (APL) is proposed, with the premises of arc length and node distance defined. The results of an application example show that the analysis model is effective.
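
    The average path length at the heart of the model above can be computed with a plain breadth-first search, as in the sketch below; the toy manufacturing-grid topology and unit arc lengths are assumptions, since the record does not give the paper's network or distance definitions.

    ```python
    from collections import deque

    def average_path_length(adj):
        """Mean shortest-path length over all reachable ordered node pairs (unit arc lengths).

        adj : dict mapping node -> iterable of neighbour nodes
        """
        total, pairs = 0, 0
        for source in adj:
            dist = {source: 0}
            queue = deque([source])
            while queue:
                u = queue.popleft()
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        queue.append(v)
            total += sum(d for n, d in dist.items() if n != source)
            pairs += len(dist) - 1
        return total / pairs if pairs else float("inf")

    # Usage: a toy manufacturing-grid topology; adding a "shortcut" arc lowers the APL,
    # i.e. tasks can reach resources with less waiting.
    mg = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
    print(average_path_length(mg))
    mg[0].append(5); mg[5].append(0)   # shortcut between distant nodes
    print(average_path_length(mg))
    ```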

  16. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    Science.gov (United States)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which contains basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determine the index weights according to the grades and evaluate the integrated ability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, including how these two modules are implemented. Finally, we give the results of the system.
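
    A minimal fuzzy c-means routine, one common form of fuzzy cluster analysis, is sketched below for grouping enterprises by weighted index scores; the score matrix, number of clusters and fuzzifier are hypothetical and do not come from the thesis.

    ```python
    import numpy as np

    def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, seed=0):
        """Minimal fuzzy c-means: returns (membership matrix U, cluster centers)."""
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            # Standard membership update: u_ik = 1 / sum_j (d_ik / d_ij)^(2/(m-1))
            U = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
        return U, centers

    # Usage: rows are enterprises, columns are weighted index scores
    # (management level, technical strength, transport capacity, ...), all hypothetical.
    scores = np.array([[0.90, 0.80, 0.70],
                       [0.85, 0.75, 0.80],
                       [0.40, 0.50, 0.45],
                       [0.20, 0.30, 0.25]])
    U, centers = fuzzy_c_means(scores, c=2)
    print(np.round(U, 2))       # membership degrees
    print(U.argmax(axis=1))     # hard assignment of each enterprise
    ```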

  17. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Science.gov (United States)

    Liang, Jianming; Järvi, Timo; Kiuru, Aaro; Kormano, Martti; Svedström, Erkki

    2003-12-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  18. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT and nuclear medicine (NM studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  19. Open access for ALICE analysis based on virtualization technology

    Science.gov (United States)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  20. Job optimization in ATLAS TAG-based distributed analysis

    International Nuclear Information System (INIS)

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ('skimming', 'slimming' and 'thinning') as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  1. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of transit, buses and bicycles could be significant in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is imposing a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, and the existing delineation methods in four-step models have some problems in reflecting the travel characteristics of urban transit. This paper aims to come up with a transit traffic analysis zone delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of the Transit Traffic Analysis Zone (TTAZ) were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example to delineate TTAZs, followed by a spatial analysis of office buildings within a TTAZ and transit station departure passengers. The analysis result shows that TTAZs based on Thiessen polygons can reflect transit travel characteristics and merit in-depth research.
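
    Thiessen (Voronoi) polygons around transit stations can be generated directly from station coordinates, as in the sketch below; the coordinates and the nearest-station assignment of buildings are illustrative assumptions, not the Beijing data used in the paper.

    ```python
    import numpy as np
    from scipy.spatial import Voronoi, cKDTree

    # Hypothetical transit station coordinates (e.g. projected metres), not Beijing data.
    stations = np.array([[0, 0], [5, 1], [2, 6], [8, 7], [4, 3]], dtype=float)

    # The Voronoi (Thiessen) diagram itself, e.g. for drawing TTAZ boundaries.
    vor = Voronoi(stations)
    print("ridge pairs (neighbouring TTAZs):", vor.ridge_points.tolist())

    # Assigning zonal objects (buildings, office blocks, ...) to the TTAZ of the
    # nearest station is equivalent to Thiessen-polygon membership.
    buildings = np.array([[1.0, 0.5], [4.5, 2.5], [7.0, 6.5], [2.5, 5.0]])
    ttaz_of_building = cKDTree(stations).query(buildings)[1]
    print("building -> TTAZ:", ttaz_of_building.tolist())
    ```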

  2. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design is discussed and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types...

  3. Coverage analysis for sensor networks based on Clifford algebra

    Institute of Scientific and Technical Information of China (English)

    XIE WeiXin; CAO WenMing; MENG Shan

    2008-01-01

    The coverage performance is the foundation of information acquisition in distributed sensor networks. The previously proposed coverage work was mostly based on unit disk coverage model or ball coverage model in 2D or 3D space, respectively. However, most methods cannot give a homogeneous coverage model for targets with hybrid types. This paper presents a coverage analysis approach for sensor networks based on Clifford algebra and establishes a homogeneous coverage model for sensor networks with hybrid types of targets. The effectiveness of the approach is demonstrated with examples.

  4. Image Analysis of Fabric Pilling Based on Light Projection

    Institute of Scientific and Technical Information of China (English)

    陈霞; 黄秀宝

    2003-01-01

    The objective assessment of fabric pilling based on light projection and image analysis has been exploited recently. The device for capturing the cross-sectional images of the pilled fabrics with light projection is elaborated. The detection of the profile line and integration of the sequential cross-sectional pilled image are discussed. The threshold based on Gaussian model is recommended for pill segmentation. The results show that the installed system is capable of eliminating the interference with pill information from the fabric color and pattern.

  5. Image registration based on matrix perturbation analysis using spectral graph

    Institute of Scientific and Technical Information of China (English)

    Chengcai Leng; Zheng Tian; Jing Li; Mingtao Ding

    2009-01-01

    We present a novel perspective on characterizing the spectral correspondence between nodes of a weighted graph, with application to image registration. It is based on matrix perturbation analysis of the spectral graph. The contribution may be divided into three parts. Firstly, the perturbation matrix is obtained by perturbing the matrix of the graph model. Secondly, an orthogonal matrix is obtained based on an optimal parameter, which can better capture correspondence features. Thirdly, the optimal matching matrix is proposed by adjusting the signs of the orthogonal matrix for image registration. Experiments on both synthetic images and real-world images demonstrate the effectiveness and accuracy of the proposed method.

  6. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. But they have certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, which is a mission-critical system for the Large Hadron Collider (LHC). We used a plug-in based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin doesn't perturb the work of the overall application; plugin execution control which allows plugin misbehaviour to be detected and reacted to; a robust communication mechanism between plugins; diagnostics facilitation in case of plugin failure; testing of the plugins be...

  7. Physics-based stability analysis of MOS transistors

    Science.gov (United States)

    Ferrara, A.; Steeneken, P. G.; Boksteen, B. K.; Heringa, A.; Scholten, A. J.; Schmitz, J.; Hueting, R. J. E.

    2015-11-01

    In this work, a physics-based model is derived based on a linearization procedure for investigating the electrical, thermal and electro-thermal instability of power metal-oxide-semiconductor (MOS) transistors. The proposed model can be easily interfaced with a circuit or device simulator to perform a failure analysis, making it particularly useful for power transistors. Furthermore, it allows mapping the failure points onto a three-dimensional (3D) space defined by the gate-width normalized drain current, drain voltage and junction temperature. This leads to the definition of the Safe Operating Volume (SOV), a powerful framework for making failure predictions and determining the main root of instability (electrical, thermal or electro-thermal) in different bias and operating conditions. A comparison between the modeled and the measured SOV of silicon-on-insulator (SOI) LDMOS transistors is reported to support the validity of the proposed stability analysis.

  8. Image-Analysis Based on Seed Phenomics in Sesame

    Directory of Open Access Journals (Sweden)

    Prasad R.

    2014-10-01

    Full Text Available The seed coat (testa) structure of twenty-three cultivated (Sesamum indicum L.) and six wild sesame (S. occidentale Regel & Heer., S. mulayanum Nair, S. prostratum Retz., S. radiatum Schumach. & Thonn., S. angustifolium (Oliv.) Engl. and S. schinzianum Asch.) germplasm was analyzed from digital and Scanning Electron Microscopy (SEM) images with dedicated software, using the descriptors for computer-based seed image analysis, to understand the diversity of seed morphometric traits, which later on can be extended to screen and evaluate improved genotypes of sesame. Seeds of wild sesame species could conveniently be distinguished from cultivated varieties based on shape and architectural analysis. Results indicated discrete 'cut off' values to identify the definite shape and contour of seed for a desirable sesame genotype, along with the conventional practice of selecting lighter colored testa.

  9. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements, that can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multi-hypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.

  10. Estimating Driving Performance Based on EEG Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Jung Tzyy-Ping

    2005-01-01

    Full Text Available The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by the driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's perception, recognition, and vehicle control abilities while sleepy. Preventing such accidents caused by drowsiness is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
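
    A stripped-down version of the pipeline described above (log subband power, dimensionality reduction, linear regression onto lane deviation) is sketched below with synthetic single-channel data; the band definitions, sampling rate and PCA/regression settings are assumptions, not the study's configuration.

    ```python
    import numpy as np
    from scipy.signal import welch
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline

    FS = 250  # sampling rate (Hz), assumed

    def log_band_powers(epoch, bands=((1, 4), (4, 8), (8, 13), (13, 30))):
        """Log power in delta/theta/alpha/beta bands for one single-channel EEG epoch."""
        freqs, psd = welch(epoch, fs=FS, nperseg=FS * 2)
        return [np.log(psd[(freqs >= lo) & (freqs < hi)].mean()) for lo, hi in bands]

    # Synthetic stand-in data: EEG epochs and the matching lane-deviation values.
    rng = np.random.default_rng(1)
    epochs = rng.standard_normal((120, FS * 30))   # 120 epochs of 30 s EEG
    deviation = rng.random(120)                    # driving error per epoch
    X = np.array([log_band_powers(e) for e in epochs])

    # Reduce band-power features with PCA, then regress onto driving deviation.
    model = make_pipeline(PCA(n_components=3), LinearRegression())
    model.fit(X[:80], deviation[:80])
    print("predicted deviation for held-out epochs:", np.round(model.predict(X[80:85]), 3))
    ```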

  11. Empirical Analysis of Signature-Based Sign Language Recognition

    Directory of Open Access Journals (Sweden)

    Sumaira Kausar

    2014-10-01

    Full Text Available The significance of automated SLR (Sign Language Recognition) has been proven not only in the deaf community but in various other spheres of life. Automated SLR is mainly based on machine learning methods. PSL (Pakistani Sign Language) is an emerging area that could benefit a large community in this region of the world. This paper presents recognition of PSL using machine learning methods. We propose an efficient and invariant method for the classification of PSL signs. This paper also presents a thorough empirical analysis of signature-based classification methods. Six different signatures are analyzed for two distinct groups of signs with a total of forty-five signs. Signs of PSL are close in terms of inter-sign similarity distance; therefore, it is a challenge to make the classification. Methodical empirical analysis proves that the proposed method deals with these challenges adequately and effectively.

  12. CORBA-Based Analysis of Multi Agent Behavior

    Institute of Scientific and Technical Information of China (English)

    Swapan Bhattacharya; Anirban Banerjee; Shibdas Bandyopadhyay

    2005-01-01

    An agent is a piece of computer software capable of taking independent action on behalf of its user or owner. It is an entity with goals, actions and domain knowledge, situated in an environment. Multiagent systems comprise multiple autonomous, interacting software agents. These systems can successfully emulate the entities active in a distributed environment. The analysis of multiagent behavior is studied in this paper based on a specific board-game problem similar to the famous game of Go. In this paper a framework is developed to define the states of the multiagent entities and measure the convergence metrics for this problem. An analysis of the changes of states leading to the goal state is also made. We support our study of multiagent behavior by simulations based on a CORBA framework in order to substantiate our findings.

  13. Protein analysis based on molecular beacon probes and biofunctionalized nanoparticles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    With the completion of the human genome-sequencing project, there has been a resulting change in the focus of studies from genomics to proteomics. By utilizing the inherent advantages of molecular beacon probes and biofunctionalized nanoparticles, a series of novel principles, methods and techniques have been exploited for bioanalytical and biomedical studies. This review mainly discusses the applications of molecular beacon probes and biofunctionalized nanoparticles-based technologies for real-time, in-situ, highly sensitive and highly selective protein analysis, including the nonspecific or specific protein detection and separation, protein/DNA interaction studies, cell surface protein recognition, and antigen-antibody binding process-based bacteria assays. The introduction of molecular beacon probes and biofunctionalized nanoparticles into the protein analysis area would necessarily advance the proteomics research.

  14. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    Science.gov (United States)

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and nanoparticles that are loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters. PMID:27305769

  15. Classification analysis of microarray data based on ontological engineering

    Institute of Scientific and Technical Information of China (English)

    LI Guo-qi; SHENG Huan-ye

    2007-01-01

    Background knowledge is important for data mining, especially in complicated situations. Ontological engineering is the successor of knowledge engineering. The sharable knowledge bases built on ontology can be used to provide background knowledge to direct the process of data mining. This paper gives a general introduction to the method and presents a practical analysis example using SVM (support vector machine) as the classifier. Gene Ontology and the accompanying annotations compose a big knowledge base, on which much research has been carried out. A microarray dataset is the output of a DNA chip. With the help of Gene Ontology we present a more elaborate analysis of microarray data than former researchers. The method can also be used in other fields with similar scenarios.
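
    A minimal sketch of this kind of workflow, assuming scikit-learn and synthetic data: an SVM classifier applied to a microarray-like expression matrix, with Gene Ontology background knowledge represented only by placeholder GO-term gene sets that are aggregated into extra features.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
n_samples, n_genes = 60, 500
expr = rng.normal(size=(n_samples, n_genes))         # toy expression matrix
labels = rng.integers(0, 2, size=n_samples)          # toy class labels

# Hypothetical GO-derived features: mean expression over assumed GO-term gene sets
go_terms = {"GO:0006915": rng.choice(n_genes, 20, replace=False),
            "GO:0007049": rng.choice(n_genes, 30, replace=False)}
go_feats = np.column_stack([expr[:, idx].mean(axis=1) for idx in go_terms.values()])

X = np.hstack([expr, go_feats])
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```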

  16. Student Engagement: A Principle-Based Concept Analysis.

    Science.gov (United States)

    Bernard, Jean S

    2015-08-04

    A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. It lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline.

  17. Error Analysis of English Writing Based on Interlanguage Theory

    Institute of Scientific and Technical Information of China (English)

    李玲

    2014-01-01

    The language learning process has always been haunted by learners' errors, which are an unavoidable phenomenon. In the 1950s and 1960s, Contrastive Analysis (CA), based on behaviorism and structuralism, was generally employed in analyzing learners' errors. CA soon lost its popularity. Error Analysis (EA), a branch of applied linguistics, has made great contributions to the study of second language learning and throws some light on the process of second language learning. Careful study of the errors reveals the common problems shared by language learners. Writing is important in the language learning process. In the Chinese context, English writing is always a difficult issue for Chinese teachers and students, so errors in students' written work are unavoidable. In this thesis, the author studies error analysis of English writing with interlanguage theory as its theoretical guidance.

  19. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Shih

    2010-01-01

    Full Text Available This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small sample size and ill-posed problems suffered from QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.
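
    The RDA idea, blending each class covariance with the pooled covariance (plus a little identity shrinkage), can be illustrated with a small stand-alone classifier. This is a generic sketch, not the authors' implementation: the Gabor features, boosting wrapper and PSO parameter search are omitted, and the regularization values are arbitrary.

```python
import numpy as np

class SimpleRDA:
    """Minimal regularized discriminant analysis: class covariances are
    shrunk toward the pooled covariance (lam) and toward the identity (gam)."""
    def __init__(self, lam=0.5, gam=0.1):
        self.lam, self.gam = lam, gam

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        pooled = np.cov(X.T)
        self.means_, self.covs_ = {}, {}
        for c in self.classes_:
            Xc = X[y == c]
            cov = (1 - self.lam) * np.cov(Xc.T) + self.lam * pooled
            cov = (1 - self.gam) * cov + self.gam * np.trace(cov) / X.shape[1] * np.eye(X.shape[1])
            self.means_[c], self.covs_[c] = Xc.mean(axis=0), cov
        return self

    def predict(self, X):
        scores = []
        for c in self.classes_:
            mu, cov = self.means_[c], self.covs_[c]
            inv, (_, logdet) = np.linalg.inv(cov), np.linalg.slogdet(cov)
            d = X - mu
            # Gaussian discriminant score per sample (up to constants)
            scores.append(-0.5 * (np.einsum("ij,jk,ik->i", d, inv, d) + logdet))
        return self.classes_[np.argmax(np.array(scores), axis=0)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 4)), rng.normal(1.5, 1, (50, 4))])
    y = np.array([0] * 50 + [1] * 50)
    print("train accuracy:", (SimpleRDA().fit(X, y).predict(X) == y).mean())
```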

  20. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.

  1. Error Analysis of Robotic Assembly System Based on Screw Theory

    Institute of Scientific and Technical Information of China (English)

    韩卫军; 费燕琼; 赵锡芳

    2003-01-01

    Assembly errors have great influence on assembly quality in robotic assembly systems. Error analysis is directed to the propagation and accumulation of various errors and their effect on assembly success. Using screw coordinates, assembly errors are represented as an "error twist", an extremely compact expression. According to the law of screw composition, relative position and orientation errors of mating parts are computed and the necessary condition for assembly success is concluded. A new simple method for measuring assembly errors is also proposed based on the transformation law of a screw. Because of the compact representation of error, the model presented for error analysis can be applied to various part-mating types and is especially useful for error analysis of complex assemblies.
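
    The error-twist representation can be illustrated with a short sketch: a small position/orientation error is written as a 6-vector twist and expressed in the mating frame through the adjoint of the relative transform. This follows the general screw formulation and is not the paper's specific measurement method; the numerical values are arbitrary.

```python
import numpy as np

def skew(v):
    return np.array([[0, -v[2], v[1]],
                     [v[2], 0, -v[0]],
                     [-v[1], v[0], 0]])

def adjoint(T):
    """Adjoint of a homogeneous transform, used to express a twist in another frame."""
    R, p = T[:3, :3], T[:3, 3]
    Ad = np.zeros((6, 6))
    Ad[:3, :3] = R
    Ad[3:, :3] = skew(p) @ R
    Ad[3:, 3:] = R
    return Ad

# small error twist of the part frame: [rotation error (rad); translation error (mm)]
part_error = np.array([0.002, -0.001, 0.0005, 0.05, -0.02, 0.01])

# pose of the part frame in the mating (hole) frame: 90 deg about z, offset in x
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
T = np.eye(4); T[:3, :3] = Rz; T[:3, 3] = [30.0, 0.0, 0.0]

# error twist seen in the mating frame; independent error sources compose
# (to first order) as a simple sum of such transported twists
error_in_hole = adjoint(T) @ part_error
print("relative error twist in hole frame:", error_in_hole.round(4))
```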

  2. Student Engagement: A Principle-Based Concept Analysis.

    Science.gov (United States)

    Bernard, Jean S

    2015-01-01

    A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. It lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline. PMID:26234950

  3. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Science.gov (United States)

    Lee, Chien-Cheng; Huang, Shin-Sheng; Shih, Cheng-Yuan

    2010-12-01

    This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small sample size and ill-posed problems suffered from QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.

  4. Analysis Of Japans Economy Based On 2014 From Macroeconomics Prospects

    Directory of Open Access Journals (Sweden)

    Dr Mohammad Rafiqul Islam

    2015-02-01

    Full Text Available Abstract Japan is the world's third largest economy, but its current economic situation is not stable and is not improving as expected. Since 2013 it had been the world's second largest economy, but Japan lost its place to China in 2014 due to slow growth of important economic indicators. By using the basic Keynesian model, we provide a detailed analysis of the short- and long-run impacts of these changes on Japan's real GDP, rate of unemployment and inflation rate. We demonstrate in detail the use of the 45-degree diagram (the AD-IA model) and other economic analysis of the macroeconomic principles that underlie the model and concepts. Finally, we recommend a change in fiscal policy to the government based on the analysis, considering what might be achieved with a fiscal policy response and the extent to which any impact on the stock of public debt might be a consideration.

  5. INNOVATION ANALYSIS BASED ON SCORES AT THE FIRM LEVEL

    OpenAIRE

    Cătălin George ALEXE; Cătălina Monica ALEXE; Gheorghe MILITARU

    2014-01-01

    Innovation analysis based on scores (Innovation Scorecard) is a simple way to get a quick diagnosis on the potential of innovation of a firm in its intention to achieve the innovation capability. It aims to identify and remedy the deficient aspects related to innovation management being used as a measuring tool for the innovation initiatives over time within the innovation audit. The paper aims to present the advantages and disadvantages of using the method, and the three approaches devel...

  6. COMMERCIAL VIABILITY ANALYSIS OF LIGNIN BASED CARBON FIBRE

    OpenAIRE

    Michael Chien-Wei Chen

    2014-01-01

    Lignin is a rich renewable source of aromatic compounds. As a potential petroleum feedstock replacement, lignin can reduce environmental impacts such as carbon emission. Due to its complex chemical structure, lignin is currently underutilized. Exploiting lignin as a precursor for carbon fibre adds high economic value to lignin and encourages further development in lignin extraction technology. This report includes a preliminary cost analysis and identifies the key aspects of lignin-based carbon fi...

  7. Wavelet Variance Analysis of EEG Based on Window Function

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yuan-zhuang; YOU Rong-yi

    2014-01-01

    A new wavelet variance analysis method based on a window function is proposed to investigate the dynamical features of the electroencephalogram (EEG). The experimental results show that the wavelet energy of epileptic EEGs is more discrete than that of normal EEGs, and the variation of wavelet variance differs between epileptic and normal EEGs as the time-window width increases. Furthermore, it is found that the wavelet subband entropy (WSE) of epileptic EEGs is lower than that of normal EEGs.
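
    A minimal sketch of windowed wavelet-variance and subband-entropy features, assuming PyWavelets; the wavelet, decomposition level, window length and entropy definition are illustrative choices rather than the paper's.

```python
import numpy as np
import pywt

def subband_features(signal, wavelet="db4", level=5):
    """Wavelet subband energies and subband entropy (WSE) of one window."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    wse = -np.sum(p * np.log(p + 1e-12))           # Shannon entropy over subbands
    return energies, wse

def windowed_wavelet_variance(signal, win_len, step):
    """Variance of subband energies computed in sliding windows."""
    feats = []
    for start in range(0, len(signal) - win_len + 1, step):
        energies, _ = subband_features(signal[start:start + win_len])
        feats.append(energies)
    return np.var(np.array(feats), axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eeg = rng.normal(size=4096)                    # placeholder EEG trace
    energies, wse = subband_features(eeg)
    print("subband energies:", energies.round(2), "WSE:", round(wse, 3))
    print("windowed variance:", windowed_wavelet_variance(eeg, 1024, 256).round(2))
```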

  8. Study on Segmented Reflector Lamp Design Based on Error Analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper discusses the basic principle and design method for light distribution of car lamps, introduces an important development, the highly efficient and flexible car lamp with reflecting light distribution, i.e. the segmented reflector (multi-patch) car lamp, and puts forward a design method for the segmented reflector based on error analysis. Unlike the classical car lamp with refractive light distribution, the method of reflecting light distribution gives car lamp design more flexibility. In the case of guaranteeing the li...

  9. INDIVIDUAL COMMUNICATION TRANSMITTER IDENTIFICATION BASED ON MULTIFRACTAL ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Ren Chunhui; Wei Ping; Lou Zhiyou; Xiao Xianci

    2005-01-01

    In this letter, the communication transmitter transient signals are analyzed based on the time-variant hierarchy exponents of multifractal analysis. The species of optimized sample set is selected as the template of transmitter identification, so that the individual communication transmitter identification can be realized. The turn-on signals of four transmitters are used in the simulation. The experimental results show that the multifractal character of transmitter transient signals is an effective character of individual transmitter identification.

  10. Performance Analysis of STFT Based Timing Approach to OFDM Systems

    Institute of Scientific and Technical Information of China (English)

    KUANG Yu-jun; TENG Yong; YIN Chang-chuan; HAO Jian-jun; YUE Guang-xin

    2003-01-01

    This paper mainly focuses on performance analysis of the previously proposed STFT-based 2-D timing approach to OFDM systems and presents simulation results of its performance in AWGN and multipath fading environments and its robustness against the duration of the Channel Impulse Response (CIR) and frequency offset. Simulation results suggest that a revised version of the Short-Time Fourier Transform (STFT) can be used to greatly reduce computational complexity, especially at higher SNR.

  11. Comparative analysis of some brushless motors based on catalog data

    Directory of Open Access Journals (Sweden)

    Anton Kalapish

    2005-10-01

    Full Text Available Brushless motors (polyphase AC induction, synchronous and brushless DC motors) have no alternatives in modern electric drives. They are highly efficient and cover a very wide range of speeds. The objective of this paper is to present some relations between the basic parameters and magnitudes of electrical machines. This allows a comparative analysis and a choice of motor for each particular case to be made based not only on catalogue data or sale price.

  12. BLAT-Based Comparative Analysis for Transposable Elements: BLATCAT

    OpenAIRE

    Sangbum Lee; Sumin Oh; Keunsoo Kang; Kyudong Han

    2014-01-01

    The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program ca...

  13. Performance-analysis-based gas turbine diagnostics: a review.

    OpenAIRE

    Li, Y.G.

    2002-01-01

    Gas turbine diagnostics has a history almost as long as gas turbine development itself. Early engine fault diagnosis was carried out based on manufacturer information supplied in a technical manual combined with maintenance experience. In the late 1960’s when Urban introduced Gas Path Analysis, gas turbine diagnostics made a big breakthrough. Since then different methods have been developed and used in both aero and industrial applications. Until now a substantial number of papers have been p...

  14. Innovative Data Mining Based Approaches for Life Course Analysis

    OpenAIRE

    Ritschard, Gilbert; Gabadinho, Alexis; Mueller, Nicolas Séverin; Studer, Matthias

    2007-01-01

    This communication presents a just starting research project aiming at exploring the possibilities of resorting to data-mining-based methods in personal life course analysis. The project has also a socio-demographic goal, namely to gain new insights on how socio-demographic, familial, educational and professional events are entwined, on what are the characteristics of typical Swiss life trajectories and on changes in these characteristics over time. Methods for analyzing personal event histor...

  15. A Multilevel Analysis of Problem-Based Learning Design Characteristics

    OpenAIRE

    Scott, Kimberly S.

    2014-01-01

    The increasing use of experience-centered approaches like problem-based learning (PBL) by learning and development practitioners and management educators has raised interest in how to design, implement and evaluate PBL in that field. Of particular interest is how to evaluate the relative impact of design characteristics that exist at the individual and team levels of analysis. This study proposes and tests a multilevel model of PBL design characteristics. Participant perceptions of PBL design...

  16. An Ultrasound Image Despeckling Approach Based on Principle Component Analysis

    OpenAIRE

    Jawad F. Al-Asad; Ali M. Reza; Udomchai Techavipoo

    2014-01-01

    An approach based on principal component analysis (PCA) to filter out multiplicative noise from ultrasound images is presented in this paper. An image with speckle noise is segmented into small dyadic lengths, depending on the original size of the image, and the global covariance matrix is found. A projection matrix is then formed by selecting the dominant eigenvectors of the global covariance matrix. This projection matrix is used to filter speckle noise by projecting each segment into the si...
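
    The patch-projection idea can be sketched as follows, assuming a log transform to turn multiplicative speckle into additive noise, non-overlapping patches, and an arbitrary number of retained eigenvectors; the paper's exact segmentation into dyadic lengths is simplified here.

```python
import numpy as np

def pca_despeckle(img, patch=8, n_keep=4):
    """Project non-overlapping patches onto the leading eigenvectors of the
    global patch covariance and reconstruct (a crude PCA speckle filter)."""
    h, w = (img.shape[0] // patch) * patch, (img.shape[1] // patch) * patch
    log_img = np.log(img[:h, :w] + 1e-6)           # multiplicative -> additive noise
    segs = log_img.reshape(h // patch, patch, w // patch, patch)
    segs = segs.transpose(0, 2, 1, 3).reshape(-1, patch * patch)

    mean = segs.mean(axis=0)
    cov = np.cov((segs - mean).T)                  # global covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    P = eigvecs[:, -n_keep:]                       # projection matrix (top eigenvectors)
    filtered = (segs - mean) @ P @ P.T + mean

    out = filtered.reshape(h // patch, w // patch, patch, patch)
    out = out.transpose(0, 2, 1, 3).reshape(h, w)
    return np.exp(out)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.ones((64, 64)); clean[16:48, 16:48] = 4.0
    speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
    print("noisy std   :", speckled[16:48, 16:48].std().round(3))
    print("filtered std:", pca_despeckle(speckled)[16:48, 16:48].std().round(3))
```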

  17. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    OpenAIRE

    Lee Chien-Cheng; Huang Shin-Sheng; Shih Cheng-Yuan

    2010-01-01

    This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RD...

  18. Analysis of Corporate Governance Performance Based on Grey System Theory

    OpenAIRE

    Hong Li

    2009-01-01

    This article sets up the grey relational analysis (GRA) model and studies the internal relevance between the factors of independent director participation in corporate governance and corporate performance, based on listed companies in the electricity production and commercial retail sectors in 2004. The research shows that, among the 4 impact factors of independent director participation in corporate governance, the most important factor impacting the achievement of the compa...
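
    A minimal grey relational analysis sketch, assuming min-max normalization, a distinguishing coefficient of 0.5, and toy data; it computes grey relational grades of several factor series against a reference (performance) series.

```python
import numpy as np

def grey_relational_grades(reference, factors, rho=0.5):
    """Grey relational grade of each factor series against the reference.
    Series are min-max normalized; rho is the distinguishing coefficient."""
    def norm(x):
        return (x - x.min()) / (x.max() - x.min())

    ref = norm(reference)
    deltas = np.array([np.abs(ref - norm(f)) for f in factors])
    d_min, d_max = deltas.min(), deltas.max()      # global extrema over all factors
    xi = (d_min + rho * d_max) / (deltas + rho * d_max)   # relational coefficients
    return xi.mean(axis=1)                         # grade = mean coefficient per factor

if __name__ == "__main__":
    # toy example: corporate performance vs. three governance factors over 8 firms
    performance = np.array([3.1, 2.4, 4.0, 3.6, 2.9, 3.3, 4.2, 2.7])
    factors = np.array([[0.3, 0.2, 0.5, 0.4, 0.3, 0.3, 0.6, 0.2],   # share of independent directors
                        [12, 10, 15, 14, 11, 12, 16, 10],           # board meetings per year
                        [5, 9, 3, 4, 8, 6, 2, 9]])                  # an inversely related factor
    print("grey relational grades:", grey_relational_grades(performance, factors).round(3))
```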

  19. Moon-Based INSAR Geolocation and Baseline Analysis

    Science.gov (United States)

    Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Ding, Yixing; Ruan, Zhixing; Lv, Mingyang; Dou, Changyong; Chen, Zhaoning

    2016-07-01

    An Earth observation platform is a host, and the characteristics of the platform to some extent determine the ability for Earth observation. Currently most platforms under development are satellites; in contrast, carrying out systematic observations with a Moon-based Earth observation platform is still a new concept. The Moon is Earth's only natural satellite and the only one humans have reached; it will give people different perspectives when observing the Earth with sensors from the Moon. Moon-based InSAR (SAR interferometry), one of the important Earth observation technologies, has all-day, all-weather observation ability, but its uniqueness still needs analysis. This article discusses key issues of geometric positioning and baseline parameters of Moon-based InSAR. Based on the ephemeris data, the position, libration and attitude of the Earth and Moon are obtained, and the position of the Moon-based SAR sensor can be obtained by coordinate transformation from the fixed selenocentric coordinate system to the terrestrial coordinate system; together with the Distance-Doppler equation, the positioning model is analyzed. After establishing the Moon-based InSAR baseline equation, the different baseline errors are analyzed, and the influence of the Moon-based InSAR baseline on Earth observation applications is obtained.

  20. Wavelet-based multiresolution analysis of Wivenhoe Dam water temperatures

    Science.gov (United States)

    Percival, D. B.; Lennox, S. M.; Wang, Y.-G.; Darnell, R. E.

    2011-05-01

    Water temperature measurements from Wivenhoe Dam offer a unique opportunity for studying fluctuations of temperatures in a subtropical dam as a function of time and depth. Cursory examination of the data indicate a complicated structure across both time and depth. We propose simplifying the task of describing these data by breaking the time series at each depth into physically meaningful components that individually capture daily, subannual, and annual (DSA) variations. Precise definitions for each component are formulated in terms of a wavelet-based multiresolution analysis. The DSA components are approximately pairwise uncorrelated within a given depth and between different depths. They also satisfy an additive property in that their sum is exactly equal to the original time series. Each component is based upon a set of coefficients that decomposes the sample variance of each time series exactly across time and that can be used to study both time-varying variances of water temperature at each depth and time-varying correlations between temperatures at different depths. Each DSA component is amenable for studying a certain aspect of the relationship between the series at different depths. The daily component in general is weakly correlated between depths, including those that are adjacent to one another. The subannual component quantifies seasonal effects and in particular isolates phenomena associated with the thermocline, thus simplifying its study across time. The annual component can be used for a trend analysis. The descriptive analysis provided by the DSA decomposition is a useful precursor to a more formal statistical analysis.
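
    The additive decomposition described here can be approximated with an ordinary discrete wavelet transform from PyWavelets, as sketched below; the paper's MODWT-based multiresolution analysis and its exact mapping of levels to daily, subannual and annual bands are replaced by assumed, illustrative choices.

```python
import numpy as np
import pywt

def mra_components(series, wavelet="db4", level=None):
    """Additive multiresolution components: one detail series per level plus
    the final smooth. Their sum reconstructs the original series."""
    coeffs = pywt.wavedec(series, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet)[:len(series)])
    return comps  # comps[0] = smooth, comps[1:] = details (coarse to fine)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(24 * 365)                                  # one year of hourly samples
    temp = (20 + 5 * np.sin(2 * np.pi * t / (24 * 365))
              + 1.5 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 0.3, t.size))
    comps = mra_components(temp, level=10)
    print("reconstruction error:", np.abs(sum(comps) - temp).max())
    # e.g. group fine details as 'daily', mid levels as 'subannual', smooth as 'annual'
```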

  1. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite, forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL flow convergence and specific humidity, lifted condensation level (LCL, convective available potential energy (CAPE, convective inhibition (CIN, and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.

  2. Weather data analysis based on typical weather sequence analysis. Application: energy building simulation

    CERN Document Server

    David, Mathieu; Garde, Francois; Boyer, Harry

    2014-01-01

    In building studies dealing with energy efficiency and comfort, simulation software needs relevant weather files with optimal time steps. Few tools generate extreme and mean values of simultaneous hourly data including correlation between the climatic parameters. This paper presents the C++ Runeole software based on typical weather sequence analysis. It runs an analysis process of a stochastic continuous multivariable phenomenon with frequency properties applied to a climatic database. The database analysis associates basic statistics, PCA (Principal Component Analysis) and automatic classifications. Different ways of applying these methods will be presented. All the results are stored in the Runeole internal database that allows an easy selection of weather sequences. The extreme sequences are used for system and building sizing and the mean sequences are used for the determination of the annual cooling loads as proposed by Audrier-Cros (Audrier-Cros, 1984). This weather analysis was tested with the datab...

  3. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko Kozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on widespread association of signaling molecules has revealed essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior is mostly yet to be analyzed. Recent technological advances regarding mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on the phosphoproteome dynamics accelerate generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which have firstly been applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened up a gate for systems analysis of signaling networks involved in immune response and cancer.

  4. Voxel selection in FMRI data analysis based on sparse representation.

    Science.gov (United States)

    Li, Yuanqing; Namburi, Praneeth; Yu, Zhuliang; Guan, Cuntai; Feng, Jianfeng; Gu, Zhenghui

    2009-10-01

    Multivariate pattern analysis approaches toward detection of brain regions from fMRI data have been gaining attention recently. In this study, we introduce an iterative sparse-representation-based algorithm for detection of voxels in functional MRI (fMRI) data with task-relevant information. In each iteration of the algorithm, a linear programming problem is solved and a sparse weight vector is subsequently obtained. The final weight vector is the mean of those obtained in all iterations. The characteristics of our algorithm are as follows: 1) the weight vector (output) is sparse; 2) the magnitude of each entry of the weight vector represents the significance of its corresponding variable or feature in a classification or regression problem; and 3) due to the convergence of this algorithm, a stable weight vector is obtained. To demonstrate the validity of our algorithm and illustrate its application, we apply the algorithm to the Pittsburgh Brain Activity Interpretation Competition 2007 fMRI dataset for selecting the voxels that are most relevant to the tasks of the subjects. Based on this dataset, the aforementioned characteristics of our algorithm are analyzed, and a comparison of our method with the univariate general-linear-model-based statistical parametric mapping is performed. Using our method, a combination of voxels is selected based on the principle of effective/sparse representation of a task. Data analysis results in this paper show that this combination of voxels is suitable for decoding tasks and demonstrate the effectiveness of our method. PMID:19567340
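
    The per-iteration linear program can be sketched with SciPy's linprog by writing w = u - v with nonnegative u and v; the resampling used below to generate iterations, and the synthetic data, are assumptions, not the paper's exact scheme.

```python
import numpy as np
from scipy.optimize import linprog

def sparse_weights(X, y):
    """One L1-sparse fit: minimize ||w||_1 subject to X w = y,
    solved as a linear program with w = u - v, u, v >= 0."""
    n, d = X.shape
    c = np.ones(2 * d)                       # objective: sum(u) + sum(v) = ||w||_1
    A_eq = np.hstack([X, -X])                # X(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * d), method="highs")
    u, v = res.x[:d], res.x[d:]
    return u - v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_trials, n_voxels = 40, 200
    X = rng.normal(size=(n_trials, n_voxels))           # toy voxel time courses
    w_true = np.zeros(n_voxels); w_true[[3, 50, 120]] = [1.0, -0.5, 0.8]
    y = X @ w_true                                      # toy task ratings

    # average the sparse weights over a few iterations on resampled trials
    ws = [sparse_weights(X[idx], y[idx])
          for idx in (rng.choice(n_trials, n_trials, replace=True) for _ in range(5))]
    w_mean = np.mean(ws, axis=0)
    print("largest-weight voxels:", np.argsort(np.abs(w_mean))[-3:])
```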

  5. Direct DNA Analysis with Paper-Based Ion Concentration Polarization.

    Science.gov (United States)

    Gong, Max M; Nosrati, Reza; San Gabriel, Maria C; Zini, Armand; Sinton, David

    2015-11-01

    DNA analysis is essential for diagnosis and monitoring of many diseases. Conventional DNA testing is generally limited to the laboratory. Increasing access to relevant technologies can improve patient care and outcomes in both developed and developing regions. Here, we demonstrate direct DNA analysis in paper-based devices, uniquely enabled by ion concentration polarization at the interface of patterned nanoporous membranes in paper (paper-based ICP). Hepatitis B virus DNA targets in human serum are simultaneously preconcentrated, separated, and detected in a single 10 min operation. A limit of detection of 150 copies/mL is achieved without prior viral load amplification, sufficient for early diagnosis of hepatitis B. We clinically assess the DNA integrity of sperm cells in raw human semen samples. The percent DNA fragmentation results from the paper-based ICP devices strongly correlate (R(2) = 0.98) with the sperm chromatin structure assay. In all cases, agreement was 100% with respect to the clinical decision. Paper-based ICP can provide inexpensive and accessible advanced molecular diagnostics.

  6. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    Science.gov (United States)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  7. Voxel-based texture analysis of the brain.

    Science.gov (United States)

    Maani, Rouzbeh; Yang, Yee Hong; Kalra, Sanjay

    2015-01-01

    This paper presents a novel voxel-based method for texture analysis of brain images. Texture analysis is a powerful quantitative approach for analyzing voxel intensities and their interrelationships, but has been thus far limited to analyzing regions of interest. The proposed method provides a 3D statistical map comparing texture features on a voxel-by-voxel basis. The validity of the method was examined on artificially generated effects as well as on real MRI data in Alzheimer's Disease (AD). The artificially generated effects included hyperintense and hypointense signals added to T1-weighted brain MRIs from 30 healthy subjects. The AD dataset included 30 patients with AD and 30 age/sex matched healthy control subjects. The proposed method detected artificial effects with high accuracy and revealed statistically significant differences between the AD and control groups. This paper extends the usage of texture analysis beyond the current region of interest analysis to voxel-by-voxel 3D statistical mapping and provides a hypothesis-free analysis tool to study cerebral pathology in neurological diseases.

  8. Reliability analysis method for slope stability based on sample weight

    Directory of Open Access Journals (Sweden)

    Zhi-gang YANG

    2009-09-01

    Full Text Available The single safety factor criteria for slope stability evaluation, derived from the rigid limit equilibrium method or finite element method (FEM, may not include some important information, especially for steep slopes with complex geological conditions. This paper presents a new reliability method that uses sample weight analysis. Based on the distribution characteristics of random variables, the minimal sample size of every random variable is extracted according to a small sample t-distribution under a certain expected value, and the weight coefficient of each extracted sample is considered to be its contribution to the random variables. Then, the weight coefficients of the random sample combinations are determined using the Bayes formula, and different sample combinations are taken as the input for slope stability analysis. According to one-to-one mapping between the input sample combination and the output safety coefficient, the reliability index of slope stability can be obtained with the multiplication principle. Slope stability analysis of the left bank of the Baihetan Project is used as an example, and the analysis results show that the present method is reasonable and practicable for the reliability analysis of steep slopes with complex geological conditions.
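
    A toy version of the workflow, assuming a three-point sample rule with fixed weights, product weighting of combinations, and an infinite-slope safety-factor function as the limit state; the paper's t-distribution-based sample extraction and its FEM-based safety factors are not reproduced.

```python
import itertools
import numpy as np
from scipy.stats import norm

# Representative samples and weights per random variable (assumed 3-point rule;
# the paper derives sample sizes and weights from a small-sample t-distribution).
def three_point(mean, std):
    return [(mean - std, 0.25), (mean, 0.5), (mean + std, 0.25)]

def safety_factor(c, phi_deg, gamma=18.0, h=10.0, slope_deg=35.0):
    """Infinite-slope safety factor (placeholder limit-state function)."""
    theta, phi = np.radians(slope_deg), np.radians(phi_deg)
    resisting = c + gamma * h * np.cos(theta) ** 2 * np.tan(phi)
    driving = gamma * h * np.sin(theta) * np.cos(theta)
    return resisting / driving

cohesion = three_point(mean=25.0, std=5.0)        # kPa
friction = three_point(mean=30.0, std=4.0)        # degrees

sf_vals, weights = [], []
for (c, wc), (phi, wp) in itertools.product(cohesion, friction):
    sf_vals.append(safety_factor(c, phi))
    weights.append(wc * wp)                        # combination weight (product rule)

sf_vals, weights = np.array(sf_vals), np.array(weights)
p_fail = weights[sf_vals < 1.0].sum()
mean_sf = np.average(sf_vals, weights=weights)
std_sf = np.sqrt(np.average((sf_vals - mean_sf) ** 2, weights=weights))
beta = (mean_sf - 1.0) / std_sf                    # simple reliability index
print(f"P(failure) = {p_fail:.3f}, reliability index beta = {beta:.2f}")
print("normal-approximation check:", norm.cdf(-beta).round(4))
```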

  9. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    DEFF Research Database (Denmark)

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna;

    2013-01-01

    Biodiesel as a promising alternative energy resource has been a hot spot in chemical engineering nowadays, but there is also an argument about the sustainability of biodiesel. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options, are studied by emergy analysis; the soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. The DEA method is used to evaluate the sustainability efficiencies of these options, and the biodiesel production systems based on soybean, sunflower, and palm are considered DEA efficient, whereas the rapeseed-based and jatropha-based scenarios need to be improved, and the improvement methods have also been specified.

  10. Least Dependent Component Analysis Based on Mutual Information

    CERN Document Server

    Stögbauer, H; Astakhov, S A; Grassberger, P; St\\"ogbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-01-01

    We propose to use precise estimators of mutual information (MI) to find least dependent components in a linearly mixed signal. On the one hand this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand it has the advantage, compared to other implementations of `independent' component analysis (ICA) some of which are based on crude approximations for MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based ...

  11. Adaptive Human aware Navigation based on Motion Pattern Analysis

    DEFF Research Database (Denmark)

    Tranberg, Søren; Svenstrup, Mikael; Andersen, Hans Jørgen;

    2009-01-01

    Respecting people’s social spaces is an important prerequisite for acceptable and natural robot navigation in human environments. In this paper, we describe an adaptive system for mobile robot navigation based on estimates of whether a person seeks to interact with the robot or not. The estimates are based on run-time motion pattern analysis compared to stored experience in a database. Using a potential field centered around the person, the robot positions itself at the most appropriate place relative to the person and the interaction status. The system is validated through qualitative tests in a real-world setting. The results demonstrate that the system is able to learn to navigate based on past interaction experiences, and to adapt to different behaviors over time.

  12. Depth-based selective image reconstruction using spatiotemporal image analysis

    Science.gov (United States)

    Haga, Tetsuji; Sumi, Kazuhiko; Hashimoto, Manabu; Seki, Akinobu

    1999-03-01

    In industrial plants, a remote monitoring system which removes physical tour inspection is often considered desirable. However the image sequence given from the mobile inspection robot is hard to see because interested objects are often partially occluded by obstacles such as pillars or fences. Our aim is to improve the image sequence that increases the efficiency and reliability of remote visual inspection. We propose a new depth-based image processing technique, which removes the needless objects from the foreground and recovers the occluded background electronically. Our algorithm is based on spatiotemporal analysis that enables fine and dense depth estimation, depth-based precise segmentation, and accurate interpolation. We apply this technique to a real time sequence given from the mobile inspection robot. The resulted image sequence is satisfactory in that the operator can make correct visual inspection with less fatigue.

  13. FLDA: Latent Dirichlet Allocation Based Unsteady Flow Analysis.

    Science.gov (United States)

    Hong, Fan; Lai, Chufan; Guo, Hanqi; Shen, Enya; Yuan, Xiaoru; Li, Sikun

    2014-12-01

    In this paper, we present a novel feature extraction approach called FLDA for unsteady flow fields based on Latent Dirichlet allocation (LDA) model. Analogous to topic modeling in text analysis, in our approach, pathlines and features in a given flow field are defined as documents and words respectively. Flow topics are then extracted based on Latent Dirichlet allocation. Different from other feature extraction methods, our approach clusters pathlines with probabilistic assignment, and aggregates features to meaningful topics at the same time. We build a prototype system to support exploration of unsteady flow field with our proposed LDA-based method. Interactive techniques are also developed to explore the extracted topics and to gain insight from the data. We conduct case studies to demonstrate the effectiveness of our proposed approach. PMID:26356968
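
    The document/word analogy maps directly onto an off-the-shelf LDA implementation, as sketched below with scikit-learn; the quantized flow-feature vocabulary and the count matrix are synthetic placeholders.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
n_pathlines, vocab = 300, 50          # 50 quantized flow "words" (assumed vocabulary)

# toy document-term matrix: counts of flow features observed along each pathline
counts = rng.poisson(lam=1.0, size=(n_pathlines, vocab))

lda = LatentDirichletAllocation(n_components=5, random_state=0)
doc_topic = lda.fit_transform(counts)             # soft topic assignment per pathline

print("topic mixture of pathline 0:", doc_topic[0].round(2))
top_words = np.argsort(lda.components_, axis=1)[:, -5:]
print("top feature ids per flow topic:\n", top_words)
```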

  14. Forming Teams for Teaching Programming based on Static Code Analysis

    Directory of Open Access Journals (Sweden)

    Davis Arosemena-Trejos

    2012-03-01

    Full Text Available The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time; but if these groups are formed without taking certain aspects into account, they may have an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, you will have a record of students with the information extracted; it evaluates the best formation of teams in a given course. The team formations are based on programming styles, skills, pair programming or working with a leader.

  15. Forming Teams for Teaching Programming based on Static Code Analysis

    CERN Document Server

    Arosemena-Trejos, Davis; Clunie, Clifton

    2012-01-01

    The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time; but if these groups are formed without taking certain aspects into account, they may have an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, you will have a record of students with the information extracted; it evaluates the best formation of teams in a given course. The team formations are based on programming styles, skills, pair programming or working with a leader.

  16. DNA sequence analysis using hierarchical ART-based classification networks

    Energy Technology Data Exchange (ETDEWEB)

    LeBlanc, C.; Hruska, S.I. [Florida State Univ., Tallahassee, FL (United States); Katholi, C.R.; Unnasch, T.R. [Univ. of Alabama, Birmingham, AL (United States)

    1994-12-31

    Adaptive resonance theory (ART) describes a class of artificial neural network architectures that act as classification tools which self-organize, work in real-time, and require no retraining to classify novel sequences. We have adapted ART networks to provide support to scientists attempting to categorize tandem repeat DNA fragments from Onchocerca volvulus. In this approach, sequences of DNA fragments are presented to multiple ART-based networks which are linked together into two (or more) tiers; the first provides coarse sequence classification while the subsequent tiers refine the classifications as needed. The overall rating of the resulting classification of fragments is measured using statistical techniques based on those introduced to validate results from traditional phylogenetic analysis. Tests of the Hierarchical ART-based Classification Network, or HABclass network, indicate its value as a fast, easy-to-use classification tool which adapts to new data without retraining on previously classified data.

  17. Clustering analysis of ancient celadon based on SOM neural network

    Institute of Scientific and Technical Information of China (English)

    ZHOU ShaoHuai; FU Lue; LIANG BaoLiu

    2008-01-01

    In the study, chemical compositions of 48 fragments of ancient ceramics excavated in 4 archaeological kiln sites which were located in 3 cities (Hangzhou, Cixi and Longquan in Zhejiang Province, China) have been examined by energy-dispersive X-ray fluorescence (EDXRF) technique. Then the method of SOM was introduced into the clustering analysis based on the major and minor element compositions of the bodies, the results manifested that 48 samples could be perfectly distributed into 3 locations, Hangzhou, Cixi and Longquan. Because the major and minor element compositions of two Royal Kilns were similar to each other, the classification accuracy over them was merely 76.92%. In view of this, the authors have made a SOM clustering analysis again based on the trace element compositions of the bodies, the classification accuracy rose to 84.61%. These results indicated that discrepancies in the trace element compositions of the bodies of the ancient ceramics excavated in two Royal Kiln sites were more distinct than those in the major and minor element compositions, which was in accordance with the fact. We argued that SOM could be employed in the clustering analysis of ancient ceramics.

  18. Clustering analysis of ancient celadon based on SOM neural network

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In the study, chemical compositions of 48 fragments of ancient ceramics excavated in 4 archaeological kiln sites which were located in 3 cities (Hangzhou, Cixi and Longquan in Zhejiang Province, China) have been examined by energy-dispersive X-ray fluorescence (EDXRF) technique. Then the method of SOM was introduced into the clustering analysis based on the major and minor element compositions of the bodies, the results manifested that 48 samples could be perfectly distributed into 3 locations, Hangzhou, Cixi and Longquan. Because the major and minor element compositions of two Royal Kilns were similar to each other, the classification accuracy over them was merely 76.92%. In view of this, the authors have made a SOM clustering analysis again based on the trace element compositions of the bodies, the classification accuracy rose to 84.61%. These results indicated that discrepancies in the trace element compositions of the bodies of the ancient ceramics excavated in two Royal Kiln sites were more distinct than those in the major and minor element compositions, which was in accordance with the fact. We argued that SOM could be employed in the clustering analysis of ancient ceramics.
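
    A sketch of the clustering step, assuming the third-party MiniSom package as a stand-in SOM and synthetic composition vectors for three kiln sites; the EDXRF element lists and map size are not those of the study.

```python
import numpy as np
from minisom import MiniSom   # third-party package, used here as a stand-in SOM

rng = np.random.default_rng(0)
# toy element-composition vectors for three kiln sites (placeholder data)
site_a = rng.normal([65, 15, 3], 0.5, size=(16, 3))
site_b = rng.normal([62, 18, 4], 0.5, size=(16, 3))
site_c = rng.normal([68, 12, 2], 0.5, size=(16, 3))
X = np.vstack([site_a, site_b, site_c])
X = (X - X.mean(axis=0)) / X.std(axis=0)          # standardize compositions

som = MiniSom(6, 6, input_len=3, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train_random(X, 2000)

# best-matching unit per sample; samples from the same site should share map regions
labels = ["A"] * 16 + ["B"] * 16 + ["C"] * 16
for lab, x in zip(labels[::8], X[::8]):
    print(lab, "->", som.winner(x))
```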

  19. Automatic analysis of a skull fracture based on image content

    Science.gov (United States)

    Shao, Hong; Zhao, Hong

    2003-09-01

    Automatic analysis based on image content is a hot research topic with a bright future in medical image diagnosis technology. Analysis of skull fractures can help doctors diagnose. In this paper, a new approach is proposed to automatically detect skull fractures based on CT image content. First, a region-growing method, whose seeds and growing rules are chosen dynamically by k-means clustering, is applied for automatic image segmentation. The segmented region boundary is found by boundary tracing. Then the shape of the boundary is analyzed, and the circularity measure is taken as the description parameter. Finally, the rules for automatic computer diagnosis of skull fractures are derived with an entropy function. This method is used to analyze the images from the layer below the third ventricle to the top layer of the cerebral cortex. Experimental results show that the recognition rate is 100% for the 100 images, which were chosen randomly from a medical image database and are not included in the training examples. This method integrates color and shape features and isn't affected by image size and position. This research achieves a high recognition rate and sets a basis for automatic analysis of brain images.
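
    The shape-analysis step can be illustrated in isolation: compute a circularity measure 4πA/P² for the traced region and compare intact versus notched outlines. The sketch below assumes scikit-image for labeling and omits the region-growing, k-means seeding and entropy-based rule derivation; the decision threshold mentioned in the last comment is an assumption.

```python
import numpy as np
from skimage import measure

def circularity(mask):
    """Circularity 4*pi*A / P^2 of the largest region in a binary mask
    (1.0 for a perfect disc, smaller for irregular or fractured outlines)."""
    labeled = measure.label(mask)
    props = max(measure.regionprops(labeled), key=lambda p: p.area)
    return 4.0 * np.pi * props.area / (props.perimeter ** 2)

if __name__ == "__main__":
    yy, xx = np.mgrid[:128, :128]
    disc = (xx - 64) ** 2 + (yy - 64) ** 2 < 40 ** 2        # intact, near-circular outline
    broken = disc.copy(); broken[60:68, 20:70] = False       # crude "fracture" notch
    print("intact circularity   :", round(circularity(disc), 3))
    print("fractured circularity:", round(circularity(broken), 3))
    # a decision rule might flag a fracture below an assumed threshold, e.g. 0.9
```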

  20. Techno-Economic Analysis of Biofuels Production Based on Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Swanson, R. M.; Platon, A.; Satrio, J. A.; Brown, R. C.; Hsu, D. D.

    2010-11-01

    This study compares capital and production costs of two biomass-to-liquid production plants based on gasification. The first biorefinery scenario is an oxygen-fed, low-temperature (870°C), non-slagging, fluidized bed gasifier. The second scenario is an oxygen-fed, high-temperature (1,300°C), slagging, entrained flow gasifier. Both are followed by catalytic Fischer-Tropsch synthesis and hydroprocessing to naphtha-range (gasoline blend stock) and distillate-range (diesel blend stock) liquid fractions. Process modeling software (Aspen Plus) is utilized to organize the mass and energy streams and cost estimation software is used to generate equipment costs. Economic analysis is performed to estimate the capital investment and operating costs. Results show that the total capital investment required for nth plant scenarios is $610 million and $500 million for high-temperature and low-temperature scenarios, respectively. Product value (PV) for the high-temperature and low-temperature scenarios is estimated to be $4.30 and $4.80 per gallon of gasoline equivalent (GGE), respectively, based on a feedstock cost of $75 per dry short ton. Sensitivity analysis is also performed on process and economic parameters. This analysis shows that total capital investment and feedstock cost are among the most influential parameters affecting the PV.

  1. Analysis of Host-Based and Network-Based Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Amrit Pal Singh

    2014-07-01

    Full Text Available Intrusion-detection systems (IDS) aim at detecting attacks against computer systems and networks or, in general, against information systems. Their basic aim is to protect the system against malware and unauthorized access to a network or a system. Intrusion detection is of two types: network-based IDS (NIDS) and host-based IDS (HIDS). This paper covers the scope of both types and their result analysis along with their comparison. OSSEC (HIDS) is a free, open-source host-based intrusion detection system. It performs log analysis, integrity checking, Windows registry monitoring, rootkit detection, time-based alerting and active response. Snort (NIDS) is a lightweight intrusion detection system that can log packets coming across your network and can alert the user regarding any attack. Both are efficient in their own distinct fields.

  2. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow.
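
    Two of the steps described, converting a fluorescence signal to calcium concentration and measuring the decay by regression, can be sketched as follows; the single-wavelength calibration equation, the Kd value and the mono-exponential decay model are assumptions, not necessarily the program's exact formulas.

```python
import numpy as np
from scipy.optimize import curve_fit

KD = 1.1   # assumed indicator dissociation constant, micromolar

def fluorescence_to_ca(f, f_min, f_max, kd=KD):
    """Single-wavelength indicator equation: [Ca] = Kd*(F - Fmin)/(Fmax - F)."""
    return kd * (f - f_min) / (f_max - f)

def decay_rate(t, ca):
    """Rate constant of the transient's decay phase from a mono-exponential fit."""
    peak = np.argmax(ca)
    def model(tt, a, k, c):
        return a * np.exp(-k * tt) + c
    popt, _ = curve_fit(model, t[peak:] - t[peak], ca[peak:],
                        p0=[ca[peak] - ca[-1], 5.0, ca[-1]])
    return popt[1]

if __name__ == "__main__":
    t = np.linspace(0, 1.0, 500)                    # seconds
    f = 1.0 + 2.0 * np.exp(-6.0 * np.clip(t - 0.1, 0, None)) * (t > 0.1)  # toy signal
    ca = fluorescence_to_ca(f, f_min=0.8, f_max=6.0)
    print("resting [Ca] (uM):", round(ca[0], 3))
    print("decay rate k (1/s):", round(decay_rate(t, ca), 2))
```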

  3. Finite element analysis of osteoporosis models based on synchrotron radiation

    International Nuclear Information System (INIS)

    With the growing pressure of social aging, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study characteristic changes in different stages of primary osteoporosis, this research focused on the different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the structure of the trabeculae in osteoporosis degraded as the bone volume decreased. For femurs, the bone volume fraction (bone volume/total volume, BV/TV) decreased from 69% to 43%. That led to an increase of the trabecular separation (from 45.05 μm to 97.09 μm) and a reduction of the trabecular number (from 7.99 mm-1 to 5.97 mm-1). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' ability to resist compression, bending and torsion gradually became weaker. The compression stiffness of femurs decreased from 1770.96 Fμm-1 to 697.41 Fμm-1, and the bending and torsion stiffness decreased from 1390.80 Fμm-1 to 566.11 Fμm-1 and from 2957.28 N·m/° to 691.31 N·m/° respectively, indicating the decrease of bone strength, which matched the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength

  4. Finite element analysis of osteoporosis models based on synchrotron radiation

    Science.gov (United States)

    Xu, W.; Xu, J.; Zhao, J.; Sun, J.

    2016-04-01

    With the growing pressure of social aging, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently, synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study characteristic changes in different stages of primary osteoporosis, this research focused on the different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the trabecular structure degraded in osteoporosis as the bone volume decreased. For femurs, the bone volume fraction (bone volume/total volume, BV/TV) decreased from 69% to 43%. This led to an increase of trabecular separation (from 45.05 μm to 97.09 μm) and a reduction of trabecular number (from 7.99 mm⁻¹ to 5.97 mm⁻¹). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' resistance to compression, bending and torsion gradually became weaker. The compression stiffness of femurs decreased from 1770.96 F·μm⁻¹ to 697.41 F·μm⁻¹, while the bending and torsion stiffness decreased from 1390.80 F·μm⁻¹ to 566.11 F·μm⁻¹ and from 2957.28 N·m/° to 691.31 N·m/° respectively, indicating the decrease of bone strength and matching the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength.

  5. Principle-based concept analysis: Caring in nursing education

    Science.gov (United States)

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Moonaghi, Hossein Karimi; Mazloom, Seyed Reza

    2016-01-01

    Introduction: The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. Methods: A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, retrieved through the PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed using MAXQDA 10 software. Results: The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as “caring pedagogy,” “value-based education,” and “teaching excellence,” caring in nursing education does not have exact and clear conceptual boundaries (logical principle). Conclusion: Caring in nursing education was identified as an approach to teaching and learning, formed on the basis of teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development.

  6. Whole-transcriptome analysis of chordoma of the skull base.

    Science.gov (United States)

    Bell, Diana; Raza, Shaan M; Bell, Achim H; Fuller, Gregory N; DeMonte, Franco

    2016-10-01

    Fourteen skull base chordoma specimens and three normal specimens were microdissected from paraffin-embedded tissue. Pools of RNA from highly enriched preparations of these cell types were subjected to expression profiling using whole-transcriptome shotgun sequencing. Using strict criteria, 294 differentially expressed transcripts were found, with 28% upregulated and 72% downregulated. The transcripts were annotated using NCBI Entrez Gene and computationally analyzed with the Ingenuity Pathway Analysis program. From these significantly changed expressions, the analysis identified 222 cancer-related transcripts. These 294 differentially expressed genes and non-coding RNA transcripts provide a set that specifically defines skull base chordomas and identifies novel and potentially important targets for diagnosis, prognosis, and therapy of this cancer. Significance: Genomic profiling to subtype skull base chordoma reveals potential candidates for specific biomarkers, with validation by IHC for selected candidates. The highly expressed developmental genes T, LMX1A, ZIC4, LHX4, and HOXA1 may be potential drivers of this disease.

  7. Weight measurement using image-based pose analysis

    Institute of Scientific and Technical Information of China (English)

    Hong Zhang; Kui Zhang; Ying Mu; Ning Yao; Robert J. Sclabassi; Mingui Sun

    2008-01-01

    Image-based gait analysis as a means of biometric identification has attracted much research attention. Most of the existing methods focus on human identification, posture analysis and movement tracking. There have been few investigations on measuring a carried load based on the carrier's gait characteristics by automatic image processing. Nevertheless, this measurement is very useful in a number of applications, such as the study of the effect of carried load on the postural development of children and adolescents. In this paper, we investigate how to automatically estimate the carried weight from a sequence of images. We present a method to extract the human gait silhouette based on the observation that humans tend to minimize energy during motion. We compute several angles of body leaning and determine the relationship between the carried weight, the leaning angles and the centroid location according to a human kinetic study. Our weight determination method has been verified successfully by experiments.

  8. Medical diagnostics by laser-based analysis of exhaled breath

    Science.gov (United States)

    Giubileo, Gianfranco

    2002-08-01

    Many trace gases can be found in the exhaled breath, some of them offering the possibility of a non-invasive diagnosis of related diseases or allowing the monitoring of a disease in the course of its therapy. In the present lecture the principle of medical diagnosis based on breath analysis will be introduced and the detection of trace gases in exhaled breath by high-resolution molecular spectroscopy in the IR spectral region will be discussed. A number of substrates and the optical systems for their laser detection will be reported. The following laser-based experimental systems have been realised in the Molecular Spectroscopy Laboratory at ENEA in Frascati for the analysis of specific substances in the exhaled breath: a tuneable diode laser absorption spectroscopy (TDLAS) apparatus for the measurement of the 13C/12C isotopic ratio in carbon dioxide, a TDLAS apparatus for the detection of CH4, and a CO2 laser based photoacoustic system to detect trace ethylene at atmospheric pressure. The experimental set-up for each of the above-mentioned optical systems will be shown and the related medical applications will be illustrated. The concluding remarks will focus on the chemical species that are of major interest for medical people today and their diagnostic ability.

  9. [Determination of body fluid based on analysis of nucleic acids].

    Science.gov (United States)

    Korabečná, Marie

    2015-01-01

    Recent methodological approaches of molecular genetics allow the isolation of nucleic acids (DNA and RNA) from negligible forensic samples. Analysis of these molecules may be used not only for individual identification based on DNA profiling but also for the detection of the origin of the body fluid which (alone or in a mixture with other body fluids) forms the examined biological trace. Such an examination can contribute to the evaluation of the procedural, technical and tactical value of the trace. The molecular genetic approaches discussed in this review offer new possibilities in comparison with the traditional spectrum of chemical, immunological and spectroscopic tests, especially with regard to the interpretation of mixtures of biological fluids and to the confirmatory character of the tests. Approaches based on reverse transcription of tissue-specific mRNA and their subsequent polymerase chain reaction (PCR) and fragmentation analysis are applicable to samples containing minimal amounts of biological material. Methods for body fluid discrimination based on the examination of microRNA have so far provided confusing results; therefore, further development in this field is needed. The examination of tissue-specific methylation of nucleotides in selected gene sequences seems to represent a promising enrichment of the methodological spectrum. The detection of DNA sequences of tissue-related bacteria has been established and provides satisfactory results, mainly in combination with the above-mentioned methodological approaches. PMID:26419517

  10. ALGORITHMS FOR TENNIS RACKET ANALYSIS BASED ON MOTION DATA

    Directory of Open Access Journals (Sweden)

    Maria Skublewska-Paszkowska

    2016-09-01

    Full Text Available Modern technologies, such as motion capture systems (both optical and markerless), are more and more frequently used for athlete performance analysis due to their great precision. Optical systems based on retro-reflective markers allow for tracking the motion of multiple objects of various types. These systems compute human kinetic and kinematic parameters based on biomechanical models. Tracking additional objects like a tennis racket is also a very important aspect for analysing the player's technique and precision. The motion data gathered by motion capture systems may be used for analysing various aspects that may not be recognised by the human eye or a video camera. This paper presents algorithms for the analysis of tennis racket motion during two of the most important tennis strokes: forehand and backhand. An optical Vicon system was used for obtaining the motion data which was the input for the algorithms. They compute the velocity of the tennis racket's head and handle, based on the trajectories of attached markers, as well as the racket's orientation. The algorithms were implemented and tested on data obtained from a professional trainer who participated in the research and performed a series of ten strikes, separately for: (1) forehand without a ball, (2) backhand without a ball, (3) forehand with a ball and (4) backhand with a ball. The computed parameters are gathered in tables and visualised in a graph.
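    The core computation (marker trajectory to velocity and orientation) is simple enough to sketch. The fragment below is a hypothetical illustration of that step, not the authors' Vicon pipeline; the frame rate, marker names and synthetic trajectories are assumptions.

    ```python
    import numpy as np

    fs = 200.0                                       # Hz, assumed capture rate
    # hypothetical 3-D trajectories of two markers, shape (n_frames, 3), in metres
    head = np.cumsum(np.random.normal(0, 0.005, (500, 3)), axis=0)
    handle = head + np.array([0.0, 0.0, -0.4])       # handle marker 0.4 m below the head

    def marker_speed(traj, fs):
        """Frame-by-frame speed (m/s) from a marker trajectory."""
        vel = np.gradient(traj, 1.0 / fs, axis=0)    # finite-difference velocity
        return np.linalg.norm(vel, axis=1)

    head_speed = marker_speed(head, fs)
    print("peak head speed: %.2f m/s" % head_speed.max())

    # racket orientation: unit vector from the handle marker towards the head marker
    axis = head - handle
    axis /= np.linalg.norm(axis, axis=1, keepdims=True)
    ```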

  11. Beyond the GUM: variance-based sensitivity analysis in metrology

    Science.gov (United States)

    Lira, I.

    2016-07-01

    Variance-based sensitivity analysis is a well established tool for evaluating the contribution of the uncertainties in the inputs to the uncertainty in the output of a general mathematical model. While the literature on this subject is quite extensive, it has not found widespread use in metrological applications. In this article we present a succinct review of the fundamentals of sensitivity analysis, in a form that should be useful to most people familiar with the Guide to the Expression of Uncertainty in Measurement (GUM). Through two examples, it is shown that in linear measurement models, no new knowledge is gained by using sensitivity analysis that is not already available after the terms in the so-called ‘law of propagation of uncertainties’ have been computed. However, if the model behaves non-linearly in the neighbourhood of the best estimates of the input quantities—and if these quantities are assumed to be statistically independent—sensitivity analysis is definitely advantageous for gaining insight into how they can be ranked according to their importance in establishing the uncertainty of the measurand.
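    For readers who want to see how first-order variance-based indices are estimated in practice, the sketch below uses a plain Monte Carlo "pick-freeze" estimator on a toy nonlinear measurement model. The model, input distributions and sample size are invented for illustration and are not taken from the article.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N = 100_000

    def model(x1, x2):
        # toy nonlinear measurement model y = f(x1, x2)
        return x1 * np.exp(0.5 * x2)

    # two independent samples per input quantity (the "pick-freeze" scheme)
    a1, b1 = rng.normal(10.0, 0.1, N), rng.normal(10.0, 0.1, N)
    a2, b2 = rng.normal(0.0, 0.2, N), rng.normal(0.0, 0.2, N)

    y = model(a1, a2)
    var_y = y.var()

    # first-order index of x_i: Cov(f(A), f(A with the other input resampled)) / Var(y)
    s1 = np.cov(y, model(a1, b2))[0, 1] / var_y   # x1 kept, x2 resampled
    s2 = np.cov(y, model(b1, a2))[0, 1] / var_y   # x2 kept, x1 resampled
    print(f"S1 = {s1:.2f}, S2 = {s2:.2f}")
    ```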

  12. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems—critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  13. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  14. fMRI time series analysis based on stationary wavelet and spectrum analysis

    Institute of Scientific and Technical Information of China (English)

    ZHI Lianhe; ZHAO Xia; SHAN Baoci; PENG Silong; YAN Qiang; YUAN Xiuli; TANG Xiaowei

    2006-01-01

    The low signal-to-noise ratio (SNR) of functional MRI (fMRI) calls for more sensitive data analysis methods. Based on the stationary wavelet transform and spectrum analysis, a new method with high detection sensitivity was developed for analyzing fMRI time series, which does not require any prior assumption about the noise characteristics. In the proposed method, every component of the fMRI time series in the different time-frequency scales of the stationary wavelet transform was discerned by spectrum analysis, then the noise components were removed using the stationary wavelet transform, and finally the components of real brain activation were detected by cross-correlation analysis. The results obtained from both simulated and in vivo visual experiments illustrated that the proposed method has much higher sensitivity than the traditional cross-correlation method.
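    The general flow (stationary wavelet decomposition, suppression of noise-dominated scales, reconstruction, cross-correlation with the paradigm) can be sketched with PyWavelets. The example below is a simplification: it zeroes the finest-scale detail coefficients instead of selecting scales by spectrum analysis as the paper does, and the time series, TR and wavelet choice are assumptions.

    ```python
    import numpy as np
    import pywt

    fs = 0.5                                     # Hz, hypothetical TR of 2 s
    n = 256
    t = np.arange(n) / fs
    paradigm = (np.sin(2 * np.pi * t / 60.0) > 0).astype(float)   # block design
    voxel = 0.5 * paradigm + np.random.normal(0, 1.0, n)          # noisy time series

    # stationary (undecimated) wavelet decomposition
    level = 4
    coeffs = pywt.swt(voxel, 'db4', level=level)   # [(cA4, cD4), ..., (cA1, cD1)]

    # crude denoising: drop the two finest-scale detail bands, assumed noise-dominated
    cleaned = [(cA, np.zeros_like(cD)) if i >= level - 2 else (cA, cD)
               for i, (cA, cD) in enumerate(coeffs)]
    denoised = pywt.iswt(cleaned, 'db4')

    # detect activation by cross-correlation with the stimulus paradigm
    r = np.corrcoef(denoised, paradigm)[0, 1]
    print(f"correlation with paradigm: {r:.2f}")
    ```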

  15. Kernel-based fisher discriminant analysis for hyperspectral target detection

    Institute of Scientific and Technical Information of China (English)

    GU Yan-feng; ZHANG Ye; YOU Di

    2007-01-01

    A new method based on kernel Fisher discriminant analysis (KFDA) is proposed for target detection of hyperspectral images. The KFDA combines kernel mapping derived from support vector machine and the classical linear Fisher discriminant analysis (LFDA), and it possesses good ability to process nonlinear data such as hyperspectral images. According to the Fisher rule that the ratio of the between-class and within-class scatters is maximized, the KFDA is used to obtain a set of optimal discriminant basis vectors in high dimensional feature space. All pixels in the hyperspectral images are projected onto the discriminant basis vectors and the target detection is performed according to the projection result. The numerical experiments are performed on hyperspectral data with 126 bands collected by Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). The experimental results show the effectiveness of the proposed detection method and prove that this method has good ability to overcome small sample size and spectral variability in the hyperspectral target detection.

  16. Real Time Engineering Analysis Based on a Generative Component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses...... the geometry, material properties and fixed point characteristics to calculate the dimensions and subsequent feasibility of any architectural design. The proposed conceptual design tool provides the possibility for the architect to work with both the aesthetic as well as the structural aspects of architecture...... without jumping from aesthetics to structural digital design tools and back, but to work with both simultaneously and in real time. The engineering level of knowledge is incorporated at a conceptual thinking level, i.e. qualitative information is used instead of quantitative information. An example...

  17. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background: In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results: The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion: Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.

  18. Psychoacoustic Music Analysis Based on the Discrete Wavelet Packet Transform

    Directory of Open Access Journals (Sweden)

    Xing He

    2008-01-01

    Full Text Available Psychoacoustical computational models are necessary for the perceptual processing of acoustic signals and have contributed significantly to the development of highly efficient audio analysis and coding. In this paper, we present an approach for the psychoacoustic analysis of musical signals based on the discrete wavelet packet transform. The proposed method mimics the multiresolution properties of the human ear more closely than other techniques, and it includes simultaneous and temporal auditory masking. Experimental results show that this method provides better masking capabilities and reduces the signal-to-masking ratio substantially more than other approaches, without introducing audible distortion. This model can lead to greater audio compression by permitting further bit rate reduction, and to more secure watermarking by providing greater signal space for information hiding.
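    As a rough illustration of the underlying decomposition (not the authors' psychoacoustic model, which also includes masking), the sketch below performs a discrete wavelet packet decomposition of a synthetic tone mixture with PyWavelets and reports the energy of the frequency-ordered terminal sub-bands; the wavelet, depth and signal are arbitrary assumptions.

    ```python
    import numpy as np
    import pywt

    fs = 44100
    t = np.arange(0, 0.2, 1 / fs)
    signal = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 2000 * t)

    # full wavelet packet decomposition to a fixed depth
    wp = pywt.WaveletPacket(data=signal, wavelet='db8', mode='symmetric', maxlevel=5)

    # energy per terminal node approximates the energy in each sub-band
    energies = {node.path: float(np.sum(node.data ** 2))
                for node in wp.get_level(5, order='freq')}
    top = sorted(energies, key=energies.get, reverse=True)[:3]
    print("dominant sub-bands:", top)
    ```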

  19. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the instantaneous rockburst process in granite. Based on the PIV (Particle Image Velocimetry) technique, quantitative analysis of a rockburst is possible: images of the tracer particles, displacement and strain fields can be obtained, and the debris trajectory described. According to the observation of on-site tests, the dynamic rockburst is actually a gas-solid high speed flow process, which is caused by the interaction of rock fragments and the surrounding air. With the help of analysis of high speed video and PIV images, the granite rockburst failure process is found to be composed of six stages of platy fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as: an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for our further understanding of the rockburst mechanism.
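    At the heart of any PIV analysis is the cross-correlation of interrogation windows between successive frames to estimate local displacement. The following is a minimal, generic sketch of that step (FFT-based correlation on a synthetic image pair), not the rockburst measurement setup itself.

    ```python
    import numpy as np

    def window_displacement(win_a, win_b):
        """Estimate the integer pixel shift between two interrogation windows
        by locating the peak of their FFT-based cross-correlation."""
        a = win_a - win_a.mean()
        b = win_b - win_b.mean()
        corr = np.fft.irfft2(np.conj(np.fft.rfft2(a)) * np.fft.rfft2(b), s=a.shape)
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # correlation shifts are periodic; map them to the range [-N/2, N/2)
        dy = peak[0] if peak[0] < a.shape[0] // 2 else peak[0] - a.shape[0]
        dx = peak[1] if peak[1] < a.shape[1] // 2 else peak[1] - a.shape[1]
        return dy, dx

    rng = np.random.default_rng(0)
    frame1 = rng.random((64, 64))
    frame2 = np.roll(frame1, shift=(3, -2), axis=(0, 1))   # synthetic known shift
    print(window_displacement(frame1, frame2))              # -> (3, -2)
    ```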

  20. Multiwave velocity analysis based on Gaussian beam prestack depth migration

    Institute of Scientific and Technical Information of China (English)

    Han Jian-Guang; Wang Yun; Han Ning; Xing Zhan-Tao; Lu Jun

    2014-01-01

    Prestack depth migration of multicomponent seismic data improves the imaging accuracy of subsurface complex geological structures. An accurate velocity field is critical to accurate imaging. Gaussian beam migration was used to perform multicomponent migration velocity analysis of PP- and PS-waves. First, PP- and PS-wave Gaussian beam prestack depth migration algorithms that operate on common-offset gathers are presented to extract offset-domain common-image gathers of PP- and PS-waves. Second, based on the residual moveout equation, the migration velocity fields of P- and S-waves are updated. Depth matching is used to ensure that the depths of the target layers in the PP- and PS-wave migration profiles are consistent, and high-precision P- and S-wave velocities are obtained. Finally, synthetic and field seismic data suggest that the method can be used effectively in multiwave migration velocity analysis.

  1. Analysis and design of a smart card based authentication protocol

    Institute of Scientific and Technical Information of China (English)

    Kuo-Hui YEH; Kuo-Yu TSAI; Jia-Li HOU

    2013-01-01

    Numerous smart card based authentication protocols have been proposed to provide strong system security and robust individual privacy for communication between parties. Nevertheless, most of them do not provide a formal analysis proof, and their security robustness is doubtful. Chang and Cheng (2011) proposed an efficient remote authentication protocol with smart cards and claimed that their proposed protocol could support secure communication in a multi-server environment. Unfortunately, there are opportunities for security enhancement in current schemes. In this paper, we identify the major weakness, i.e., session key disclosure, of a recently published protocol. We consequently propose a novel authentication scheme for a multi-server environment and give formal analysis proofs for security guarantees.

  2. Carbon nanotube based VLSI interconnects analysis and design

    CERN Document Server

    Kaushik, Brajesh Kumar

    2015-01-01

    The brief primarily focuses on the performance analysis of CNT based interconnects in current research scenario. Different CNT structures are modeled on the basis of transmission line theory. Performance comparison for different CNT structures illustrates that CNTs are more promising than Cu or other materials used in global VLSI interconnects. The brief is organized into five chapters which mainly discuss: (1) an overview of current research scenario and basics of interconnects; (2) unique crystal structures and the basics of physical properties of CNTs, and the production, purification and applications of CNTs; (3) a brief technical review, the geometry and equivalent RLC parameters for different single and bundled CNT structures; (4) a comparative analysis of crosstalk and delay for different single and bundled CNT structures; and (5) various unique mixed CNT bundle structures and their equivalent electrical models.

  3. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job-shop manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, the use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
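    SIMMEK itself is not publicly distributed, but the kind of stochastic job-shop analysis described above maps naturally onto modern discrete-event simulation libraries. Below is a minimal, hypothetical sketch using the SimPy library: jobs arrive at random, queue for two machines with exponential processing times, and shift throughput is reported. All parameters and machine names are illustrative assumptions.

    ```python
    import random
    import simpy

    def job(env, machines, routing, finished):
        """A single job visiting machines in routing order."""
        for machine, mean_time in routing:
            with machines[machine].request() as req:      # queue for the machine
                yield req
                yield env.timeout(random.expovariate(1.0 / mean_time))
        finished.append(env.now)

    def source(env, machines, finished):
        """Generate jobs with exponential inter-arrival times (mean 5 min)."""
        while True:
            yield env.timeout(random.expovariate(1.0 / 5.0))
            env.process(job(env, machines,
                            routing=[("lathe", 4.0), ("mill", 3.0)],
                            finished=finished))

    random.seed(1)
    env = simpy.Environment()
    machines = {"lathe": simpy.Resource(env, capacity=1),
                "mill": simpy.Resource(env, capacity=1)}
    finished = []
    env.process(source(env, machines, finished))
    env.run(until=8 * 60)                                 # one 8-hour shift, in minutes
    print("jobs completed:", len(finished))
    ```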

  4. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA

  5. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
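    To make the rank-based selection idea concrete, the fragment below computes Shannon entropy and information gain for discretised features against a class label, the same kind of criterion listed for IMMAN (information gain, gain ratio, symmetrical uncertainty). It is a generic illustration with synthetic data, not IMMAN's implementation.

    ```python
    import numpy as np

    def entropy(labels):
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return -np.sum(p * np.log2(p))

    def information_gain(feature, labels):
        """IG(class; feature) for a discretised feature, as used in rank-based selection."""
        gain = entropy(labels)
        for value, count in zip(*np.unique(feature, return_counts=True)):
            gain -= (count / len(feature)) * entropy(labels[feature == value])
        return gain

    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, 200)
    informative = np.where(y == 1, rng.integers(1, 3, 200), rng.integers(0, 2, 200))
    noise = rng.integers(0, 3, 200)
    print("IG informative:", round(information_gain(informative, y), 3))
    print("IG noise:      ", round(information_gain(noise, y), 3))
    ```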

  6. Statistical mechanics of light elements at high pressure. IV - A model free energy for the metallic phase. [for Jovian type planet interiors

    Science.gov (United States)

    Dewitt, H. E.; Hubbard, W. B.

    1976-01-01

    A large quantity of data on the thermodynamic properties of hydrogen-helium metallic liquids has been obtained in extended computer calculations in which a Monte Carlo code essentially identical to that described by Hubbard (1972) was used. A model free energy for metallic hydrogen with a relatively small mass fraction of helium is discussed, taking into account the definition of variables, a procedure for choosing the free energy, values for the fitting parameters, and the evaluation of the entropy constants. Possibilities concerning the use of the obtained data in studies of the interiors of the outer planets are briefly considered.

  7. Pressure Control in Distillation Columns: A Model-Based Analysis

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Bisgaard, Thomas; Kristensen, Henrik;

    2014-01-01

    A comprehensive assessment of pressure control in distillation columns is presented, including the consequences for composition control and energy consumption. Two types of representative control structures are modeled, analyzed, and benchmarked. A detailed simulation test, based on a real...... industrial distillation column, is used to assess the differences between the two control structures and to demonstrate the benefits of pressure control in the operation. In the second part of the article, a thermodynamic analysis is carried out to establish the influence of pressure on relative volatility...

  8. Differentiation-Based Analysis of Environmental Management and Corporate Performance

    Institute of Scientific and Technical Information of China (English)

    SHAN Dong-ming; MU Xin

    2007-01-01

    By building a duopoly model based on product differentiation, the performances of both the clean firm and the dirty firm are studied under the assumptions that consumers have different preferences for the product's environmental attributes, and that the product cost increases with the environmental attribute. The analysis results show that, under either the case with no environmental regulation or that with a tariff levied on the dirty product, the clean firm would always obtain more profit. In addition, the stricter the regulation is, the more profit the clean firm would obtain. This verifies that, from the perspective of product differentiation, a firm can improve its corporate competitiveness through environmental management.

  9. A Developed Algorithm of Apriori Based on Association Analysis

    Institute of Scientific and Technical Information of China (English)

    LI Pingxiang; CHEN Jiangping; BIAN Fuling

    2004-01-01

    A method for mining frequent itemsets by evaluating the probability of their supports, based on association analysis, is presented. The method obtains the probability of every 1-itemset by scanning the database, then evaluates the probability of every 2-itemset, every 3-itemset, ..., every k-itemset from the frequent 1-itemsets, and thus obtains all the candidate frequent itemsets. The database is then scanned again to verify the support of the candidate frequent itemsets. Finally, the frequent itemsets are mined. The method greatly reduces the time spent scanning the database and shortens the computation time of the algorithm.
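    For reference, the classical Apriori candidate-generation loop that the above method builds on can be written compactly as follows; this is the standard algorithm on a toy basket dataset, not the probabilistic variant proposed in the paper.

    ```python
    def apriori(transactions, min_support):
        """Classical Apriori: grow candidate k-itemsets from frequent (k-1)-itemsets,
        then keep those whose support meets the threshold."""
        n = len(transactions)
        items = {frozenset([i]) for t in transactions for i in t}
        frequent = {}
        k_frequent = {s for s in items
                      if sum(s <= t for t in transactions) / n >= min_support}
        k = 1
        while k_frequent:
            frequent.update({s: sum(s <= t for t in transactions) / n for s in k_frequent})
            k += 1
            candidates = {a | b for a in k_frequent for b in k_frequent if len(a | b) == k}
            k_frequent = {c for c in candidates
                          if sum(c <= t for t in transactions) / n >= min_support}
        return frequent

    baskets = [frozenset(t) for t in
               [{"milk", "bread"}, {"milk", "bread", "eggs"},
                {"bread", "eggs"}, {"milk", "eggs"}]]
    for itemset, support in apriori(baskets, 0.5).items():
        print(set(itemset), support)
    ```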

  10. Image edge detection based on multi-fractal spectrum analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-yuan; WANG Yao-nan

    2006-01-01

    In this paper, an image edge detection method based on multi-fractal spectrum analysis is presented. The coarse-grain Hölder exponent of the image pixels is first computed; then its multi-fractal spectrum is estimated by the kernel estimation method. Finally, the image edge detection is done by means of different multi-fractal spectrum values. Simulation results show that this method is efficient and has better locality compared with traditional edge detection methods such as the Sobel method.

  11. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems...... On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described...

  12. Rural Power System Load Forecast Based on Principal Component Analysis

    Institute of Scientific and Technical Information of China (English)

    Fang Jun-long; Xing Yu; Fu Yu; Xu Yang; Liu Guo-liang

    2015-01-01

    Power load forecasting accuracy is related to the development of the power system. Many factors influence the power load, but their effects are not the same, and which factors play a leading role cannot be determined empirically. Based on principal component analysis, the paper forecasts power load demand with a multivariate linear regression model. Taking a rural power grid load as an example, the paper analyses the impacts of different factors on the power load, selects the forecast methods appropriate for this area, forecasts its 2014-2018 electricity load, and provides a reliable basis for grid planning.
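    A minimal sketch of this pipeline (standardise the drivers, reduce them with PCA, then fit a multivariate linear regression) is given below using scikit-learn; the drivers, data and component count are synthetic assumptions, not the paper's dataset.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    # hypothetical yearly drivers: population, irrigated area, GDP, temperature, price
    X = rng.normal(size=(20, 5))
    load = 3.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(0, 0.3, 20)   # synthetic annual load

    model = make_pipeline(StandardScaler(), PCA(n_components=2), LinearRegression())
    model.fit(X[:-5], load[:-5])                      # fit on the earlier years
    print("forecast for the last 5 years:", model.predict(X[-5:]).round(2))
    ```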

  13. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    Science.gov (United States)

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés and, Luis G.; García Beltrán, Carlos Daniel

    2013-01-01

    This article proposes a virtual sensor for piecewise linear systems based on observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents a detector of the active mode when the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their outputs to be estimated. A methodology to test the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the obtained results. PMID:23447007

  14. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by four force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of a patient. The data are then transferred to the computer in real time and data analysis is conducted. The article explains the principle of operation as well as the algorithm for the measurement uncertainty of the COP (Centre of Pressure) coordinates (x, y).
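    The COP itself is simply the force-weighted average of the sensor positions. A small sketch of that computation, with a hypothetical sensor layout and readings, is given below; the uncertainty propagation discussed in the article is not reproduced here.

    ```python
    import numpy as np

    # sensor coordinates (m) at the four corners of the glass plate (hypothetical layout)
    sensor_xy = np.array([[-0.2, -0.2], [0.2, -0.2], [0.2, 0.2], [-0.2, 0.2]])

    def centre_of_pressure(forces):
        """COP as the force-weighted mean of the sensor positions."""
        forces = np.asarray(forces, dtype=float)
        return (forces[:, None] * sensor_xy).sum(axis=0) / forces.sum()

    print(centre_of_pressure([180.0, 190.0, 210.0, 200.0]))   # -> approx. (0.005, 0.010) m
    ```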

  15. Machine Learning for Vision-Based Motion Analysis

    CERN Document Server

    Wang, Liang; Cheng, Li; Pietikainen, Matti

    2011-01-01

    Techniques of vision-based motion analysis aim to detect, track, identify, and generally understand the behavior of objects in image sequences. With the growth of video data in a wide range of applications from visual surveillance to human-machine interfaces, the ability to automatically analyze and understand object motions from video footage is of increasing importance. Among the latest developments in this field is the application of statistical machine learning algorithms for object tracking, activity modeling, and recognition. Developed from expert contributions to the first and second In

  16. Selecting supplier combination based on fuzzy multicriteria analysis

    Science.gov (United States)

    Han, Zhi-Qiu; Luo, Xin-Xing; Chen, Xiao-Hong; Yang, Wu-E.

    2015-07-01

    Existing multicriteria analysis (MCA) methods are probably ineffective in selecting a supplier combination. Thus, an MCA-based fuzzy 0-1 programming method is introduced. The programming relates to a simple MCA matrix that is used to select a single supplier. By solving the programming, the most feasible combination of suppliers is selected. Importantly, this result differs from selecting suppliers one by one according to a single-selection order, which is used to rank sole suppliers in existing MCA methods. An example highlights such difference and illustrates the proposed method.

  17. Supermarket Analysis Based On Product Discount and Statistics

    Directory of Open Access Journals (Sweden)

    Komal Kumawat

    2014-03-01

    Full Text Available E-commerce has been growing rapidly. Its domain can provide all the right ingredients for successful data mining, and it is a significant domain for data mining. E-commerce refers to the buying and selling of products or services over electronic systems such as the internet. Various e-commerce systems give discounts on products and allow users to buy products online. The basic idea used here is to predict product sales based on the discount applied to the product. Our analysis concentrates on how customers behave when a discount is offered to them. We have developed a model which captures customer behaviour when a discount is applied to a product. This paper elaborates upon how techniques such as sessions and clickstreams are used to collect user data online based on the discount applied to the product, and how statistics are applied to the data set to see the variation in the data.

  18. BASE-9: Bayesian Analysis for Stellar Evolution with nine variables

    Science.gov (United States)

    Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David

    2016-08-01

    The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).

  19. Single base pair mutation analysis by PNA directed PCR clamping

    DEFF Research Database (Denmark)

    Ørum, H.; Nielsen, P.E.; Egholm, M.;

    1993-01-01

    A novel method that allows direct analysis of single base mutation by the polymerase chain reaction (PCR) is described. The method utilizes the finding that PNAs (peptide nucleic acids) recognize and bind to their complementary nucleic acid sequences with higher thermal stability and specificity...... than the corresponding deoxyribooligonucleotides and that they cannot function as primers for DNA polymerases. We show that a PNA/DNA complex can effectively block the formation of a PCR product when the PNA is targeted against one of the PCR primer sites. Furthermore, we demonstrate that this blockage...... allows selective amplification/suppression of target sequences that differ by only one base pair. Finally we show that PNAs can be designed in such a way that blockage can be accomplished when the PNA target sequence is located between the PCR primers....

  20. Support vector classifier based on principal component analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Support vector classifiers (SVCs) have superior advantages for small-sample learning problems with high dimensions, with especially better generalization ability. However, there is some redundancy among the high dimensions of the original samples, and the main features of the samples may be picked out first to improve the performance of the SVC. A principal component analysis (PCA) is employed to reduce the feature dimensions of the original samples and pre-select the main features efficiently, and an SVC is constructed in the selected feature space to improve the learning speed and identification rate of the SVC. Furthermore, a heuristic genetic algorithm-based automatic model selection is proposed to determine the hyperparameters of the SVC and to evaluate the performance of the learning machines. Experiments performed on the Heart and Adult benchmark data sets demonstrate that the proposed PCA-based SVC not only reduces the test time drastically, but also improves the identification rates effectively.
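    The PCA-plus-SVC part of this idea is straightforward to sketch with scikit-learn. The example below compares an SVC with and without a PCA step on a stand-in public dataset (breast cancer, not the Heart/Adult sets used in the paper) and omits the genetic-algorithm hyperparameter search; all settings are illustrative assumptions.

    ```python
    from sklearn.datasets import load_breast_cancer
    from sklearn.decomposition import PCA
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    svc_only = make_pipeline(StandardScaler(), SVC(C=1.0, gamma="scale"))
    pca_svc = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(C=1.0, gamma="scale"))

    for name, clf in [("SVC", svc_only), ("PCA+SVC", pca_svc)]:
        clf.fit(X_tr, y_tr)
        print(name, round(clf.score(X_te, y_te), 3))
    ```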

  1. BLAT-based comparative analysis for transposable elements: BLATCAT.

    Science.gov (United States)

    Lee, Sangbum; Oh, Sumin; Kang, Keunsoo; Han, Kyudong

    2014-01-01

    The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program can compare specific regions of six representative primate genome sequences (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) on the basis of BLAT and simultaneously carry out RepeatMasker and/or Censor functions, which are widely used Windows-based web-server functions to detect TEs. All results can be stored as a HTML file for manual inspection of a specific locus. BLATCAT will be very convenient and efficient for comparative analyses of TEs in various primate genomes. PMID:24959585

  2. BLAT-based comparative analysis for transposable elements: BLATCAT.

    Science.gov (United States)

    Lee, Sangbum; Oh, Sumin; Kang, Keunsoo; Han, Kyudong

    2014-01-01

    The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program can compare specific regions of six representative primate genome sequences (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) on the basis of BLAT and simultaneously carry out RepeatMasker and/or Censor functions, which are widely used Windows-based web-server functions to detect TEs. All results can be stored as a HTML file for manual inspection of a specific locus. BLATCAT will be very convenient and efficient for comparative analyses of TEs in various primate genomes.

  3. BLAT-Based Comparative Analysis for Transposable Elements: BLATCAT

    Directory of Open Access Journals (Sweden)

    Sangbum Lee

    2014-01-01

    Full Text Available The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program can compare specific regions of six representative primate genome sequences (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) on the basis of BLAT and simultaneously carry out RepeatMasker and/or Censor functions, which are widely used Windows-based web-server functions to detect TEs. All results can be stored as a HTML file for manual inspection of a specific locus. BLATCAT will be very convenient and efficient for comparative analyses of TEs in various primate genomes.

  4. State Inspection for Transmission Lines Based on Independent Component Analysis

    Institute of Scientific and Technical Information of China (English)

    REN Li-jia; JIANG Xiu-chen; SHENG Ge-hao; YANG Wei-wei

    2009-01-01

    Monitoring transmission towers is of great importance to prevent severe thefts on them and to ensure the reliability and safety of power grid operation. Independent component analysis (ICA) is a method for finding underlying factors or components from multivariate statistical data based on dimension reduction methods, and it is applicable to the extraction of non-stationary signals. In this paper, FastICA based on negentropy is presented to effectively extract and separate the vibration signals caused by human activity. A new method combining the empirical mode decomposition (EMD) technique with an adaptive threshold method is applied to extract the vibration pulses and suppress the interference signals. Practical tests demonstrate that the method proposed in this paper is effective in separating and extracting the vibration signals.
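    The separation step can be illustrated generically: given two mixed channels, FastICA recovers the underlying independent sources. The sketch below uses scikit-learn's FastICA on a synthetic two-source mixture; the signals and mixing matrix are invented for illustration, and the negentropy/EMD details of the paper are not reproduced.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 10, 2000)
    vibration = np.sign(np.sin(3 * t)) * np.exp(-((t % 2) ** 2))   # impulsive, non-Gaussian
    hum = np.sin(2 * np.pi * 50 * t)                                # power-line interference
    S = np.c_[vibration, hum]
    A = np.array([[1.0, 0.6], [0.4, 1.0]])                          # unknown mixing matrix
    X = S @ A.T                                                     # two observed channels

    ica = FastICA(n_components=2, random_state=0)
    recovered = ica.fit_transform(X)          # estimated independent sources
    print(recovered.shape)                    # -> (2000, 2)
    ```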

  5. Least-squares deconvolution based analysis of stellar spectra

    CERN Document Server

    Van Reeth, T; Tsymbal, V

    2013-01-01

    In recent years, astronomical photometry has been revolutionised by space missions such as MOST, CoRoT and Kepler. However, despite this progress, high-quality spectroscopy is still required as well. Unfortunately, high-resolution spectra can only be obtained using ground-based telescopes, and since many interesting targets are rather faint, the spectra often have a relatively low S/N. Consequently, we have developed an algorithm based on the least-squares deconvolution profile, which allows an observed spectrum to be reconstructed with a higher S/N. We have successfully tested the method using both synthetic and observed data, and in combination with several common spectroscopic applications, such as the determination of atmospheric parameter values, and frequency analysis and mode identification of stellar pulsations.

  6. Sensitivity analysis of GSI based mechanical characterization of rock mass

    CERN Document Server

    Ván, P

    2012-01-01

    Recently, rock mechanical and rock engineering designs and calculations have frequently been based on the Geological Strength Index (GSI) method, because it is the only system that provides a complete set of mechanical properties for design purposes. Both the failure criteria and the deformation moduli of the rock mass can be calculated with GSI-based equations, which include the disturbance factor as well. The aim of this paper is a sensitivity analysis of the GSI- and disturbance-factor-dependent equations that characterize the mechanical properties of rock masses. A survey of the GSI system is not our purpose. The results show that the rock mass strength calculated by the Hoek-Brown failure criteria and both the Hoek-Diederichs and modified Hoek-Diederichs deformation moduli are highly sensitive to changes of both the GSI and the D factor, hence their exact determination is important for rock engineering design.
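    As an illustration of why this sensitivity matters, the sketch below evaluates the commonly cited Hoek-Brown (2002) constants and a Hoek-Diederichs deformation modulus and differentiates the modulus numerically with respect to GSI and D. The expressions and input values are standard forms quoted from memory and should be checked against the paper; they are not lifted from it.

    ```python
    import numpy as np

    def hoek_brown(gsi, d, mi=10.0, ei=20000.0):
        """Hoek-Brown constants mb, s, a and a Hoek-Diederichs modulus (assumed forms)."""
        mb = mi * np.exp((gsi - 100.0) / (28.0 - 14.0 * d))
        s = np.exp((gsi - 100.0) / (9.0 - 3.0 * d))
        a = 0.5 + (np.exp(-gsi / 15.0) - np.exp(-20.0 / 3.0)) / 6.0
        erm = ei * (0.02 + (1.0 - d / 2.0) / (1.0 + np.exp((60.0 + 15.0 * d - gsi) / 11.0)))
        return mb, s, a, erm

    # finite-difference sensitivity of the deformation modulus to GSI and D
    gsi0, d0, h = 50.0, 0.5, 1e-3
    erm0 = hoek_brown(gsi0, d0)[3]
    d_erm_d_gsi = (hoek_brown(gsi0 + h, d0)[3] - erm0) / h
    d_erm_d_d = (hoek_brown(gsi0, d0 + h)[3] - erm0) / h
    print(f"Erm = {erm0:.0f} MPa, dErm/dGSI = {d_erm_d_gsi:.0f} MPa, dErm/dD = {d_erm_d_d:.0f} MPa")
    ```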

  7. Sensitivity Analysis of a Bioinspired Refractive Index Based Gas Sensor

    Institute of Scientific and Technical Information of China (English)

    Yang Gao; Qi Xia; Guanglan Liao; Tielin Shi

    2011-01-01

    It was found that a change in the refractive index of the ambient gas can lead to an obvious change in the color of a Morpho butterfly's wing. This phenomenon has been employed as a sensing principle for detecting gas. In the present study, Rigorous Coupled-Wave Analysis (RCWA) was described briefly, and the partial derivative of the optical reflection efficiency with respect to the refractive index of the ambient gas, i.e., the sensitivity of the sensor, was derived based on RCWA. A bioinspired grating model was constructed by mimicking the nanostructure on the ground scale of the Morpho didius butterfly's wing. The analytical sensitivity was verified, and the effect of the grating shape on the reflection spectra and the sensitivity was discussed. The results show that by tuning the shape parameters of the grating, we can obtain the desired reflection spectra and sensitivity, which can be applied to the design of bioinspired refractive index based gas sensors.

  8. Echo-waveform classification using model and model free techniques: Experimental study results from central western continental shelf of India

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Navelkar, G.S.; Desai, R.G.P.; Janakiraman, G.; Mahale, V.; Fernandes, W.A.; Rao, N.

    seafloor of India, but unable to provide a suitable means for seafloor classification. This paper also suggests a hybrid artificial neural network (ANN) architecture i.e. Learning Vector Quantisation (LVQ) for seafloor classification. An analysis...

  9. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides an intuitive view of the survivability situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player whose desire is the strongest initiates the move, and the overall state transition matrix of the information system can be obtained. In addition, the process of modeling and the stability analysis of the conflict can be converted into a Markov analysis process; thus the obtained results, with the occurrence probability of each feasible situation, will help the players quantitatively judge the probability of reaching the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of the system's survivability situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model can be effectively applied to quantitative analysis of survivability. Moreover, there is a good application prospect in practice.

  10. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928
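    Many of the reviewed detection algorithms reduce, at their simplest, to thresholding the accelerometer magnitude: a high-g impact followed by a period of inactivity. The sketch below shows such a generic baseline detector on synthetic data; the thresholds and window length are arbitrary assumptions and do not correspond to any particular system in the review.

    ```python
    import numpy as np

    def detect_falls(acc, fs, impact_g=2.5, rest_g=0.4, rest_window=1.0):
        """Flag times where a high-g impact is followed by ~1 s of inactivity (near 1 g)."""
        mag = np.linalg.norm(acc, axis=1) / 9.81            # acceleration magnitude in g
        win = int(rest_window * fs)
        falls = []
        for i in np.flatnonzero(mag > impact_g):            # candidate impact samples
            after = mag[i + 1:i + 1 + win]
            if after.size == win and np.all(np.abs(after - 1.0) < rest_g):
                falls.append(i / fs)                        # time of suspected fall, seconds
        return falls

    fs = 50.0                                               # Hz, assumed sampling rate
    acc = np.tile([0.0, 0.0, 9.81], (int(10 * fs), 1))      # 10 s of standing still
    acc[200] = [0.0, 0.0, 35.0]                             # synthetic impact at t = 4 s
    print(detect_falls(acc, fs))                            # -> [4.0]
    ```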

  11. Data Clustering Analysis Based on Wavelet Feature Extraction

    Institute of Scientific and Technical Information of China (English)

    QIANYuntao; TANGYuanyan

    2003-01-01

    A novel wavelet-based data clustering method is presented in this paper, which includes wavelet feature extraction and a cluster growing algorithm. The wavelet transform can provide rich and diversified information for representing the global and local inherent structures of a dataset; therefore, it is a very powerful tool for clustering feature extraction. As an unsupervised classification, the target of clustering analysis depends on the specific clustering criteria. Several criteria that should be considered for a general-purpose clustering algorithm are proposed, and the cluster growing algorithm is constructed to connect the clustering criteria with the wavelet features. Compared with other popular clustering methods, our clustering approach provides multi-resolution clustering results, needs few prior parameters, correctly deals with irregularly shaped clusters, and is insensitive to noise and outliers. As this wavelet-based clustering method is aimed at solving two-dimensional data clustering problems, for high-dimensional datasets the self-organizing map and U-matrix method are applied to transform them into a two-dimensional Euclidean space, so that high-dimensional data clustering analysis can also be performed. Results on some simulated data and standard test data are reported to illustrate the power of our method.

  12. Reachability analysis based transient stability design in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Licheng; Kumar, Ratnesh; Elia, Nicola [Department of Electrical and Computer Engineering, Iowa State University, Ames, IA 50010 (United States)

    2010-09-15

    This paper provides a systematic framework to determine switching control strategies that stabilize the system after a fault, if stabilization is possible. A method to compute the stability region of a stable equilibrium point for the purpose of power system stability analysis is proposed, and the validity of discrete controls in transient stability design is studied. First, a Hamilton-Jacobi-Isaacs (HJI) partial differential equation (PDE) is constructed to describe the set of backward reachable states as a function of time starting from a target set of states. The backward reachable set of a stable equilibrium point is computed by numerically solving the HJI PDE backward in time using level set methods. This backward reachable set yields the stability region of the equilibrium point. Based on such reachability analysis, a transient stability design method is presented. The validity of a discrete control is determined by examining the stability region of the power system with the said control on. If a post-fault initial state is in the stability region of the system with a control on, the control is valid. A control strategy is provided based on the validity of controls. Finally, the method is illustrated by applying it to a single machine infinite bus system with the compensation of shunt and series capacitors. (author)
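
    The paper computes the stability region by solving the HJI PDE with level set methods, which is beyond a short example. As a much simpler surrogate of the same idea, the sketch below approximates the stability region of a single machine infinite bus swing equation by forward simulation from a grid of initial states; all machine parameters are assumed values, not the paper's.

```python
import numpy as np

# Single machine infinite bus swing equation (assumed per-unit parameters):
#   d(delta)/dt = w,   M * dw/dt = Pm - Pmax*sin(delta) - D*w
M, D, Pm, Pmax = 0.1, 0.05, 0.5, 1.0
delta_eq = np.arcsin(Pm / Pmax)                      # stable equilibrium angle

def rhs(x):
    delta, w = x
    return np.array([w, (Pm - Pmax * np.sin(delta) - D * w) / M])

def converges(x0, t_end=20.0, dt=0.02):
    """RK4 integration; True if the trajectory settles at the stable equilibrium."""
    x = np.array(x0, dtype=float)
    for _ in range(int(t_end / dt)):
        k1 = rhs(x); k2 = rhs(x + 0.5 * dt * k1)
        k3 = rhs(x + 0.5 * dt * k2); k4 = rhs(x + dt * k3)
        x = x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
    return abs(x[0] - delta_eq) < 0.05 and abs(x[1]) < 0.05

# Grid the (delta, w) plane; initial states that converge approximate the stability region.
deltas = np.linspace(-np.pi, np.pi, 21)
omegas = np.linspace(-3.0, 3.0, 21)
region = [(d, w) for d in deltas for w in omegas if converges((d, w))]
print(f"{len(region)} of {deltas.size * omegas.size} grid points lie in the stability region")
```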

  13. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Science.gov (United States)

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
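
    As a generic illustration of the MCMC calibration idea described here (not the bromate PBTK/TD model itself), the sketch below uses a random-walk Metropolis sampler to estimate the volume and elimination rate of a one-compartment model from synthetic concentration data; all parameter values and priors are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measured" concentrations from a one-compartment model C(t) = Dose/V * exp(-k*t).
dose, V_true, k_true = 10.0, 5.0, 0.3
t = np.linspace(0.5, 8.0, 12)
obs = dose / V_true * np.exp(-k_true * t) * rng.lognormal(0.0, 0.1, t.size)

def log_post(theta):
    """Log posterior: lognormal measurement error plus weakly informative lognormal priors."""
    V, k = theta
    if V <= 0 or k <= 0:
        return -np.inf
    pred = dose / V * np.exp(-k * t)
    loglik = -0.5 * np.sum(((np.log(obs) - np.log(pred)) / 0.1) ** 2)
    logprior = -0.5 * ((np.log(V) - np.log(5.0)) ** 2 + (np.log(k) - np.log(0.5)) ** 2)
    return loglik + logprior

# Random-walk Metropolis sampler.
theta = np.array([3.0, 0.5])
lp = log_post(theta)
samples = []
for _ in range(20000):
    proposal = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_post(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:          # accept/reject step
        theta, lp = proposal, lp_prop
    samples.append(theta.copy())

posterior = np.array(samples[5000:])                  # discard burn-in
print("posterior mean of V and k:", posterior.mean(axis=0))
print("posterior 95% interval for k:", np.percentile(posterior[:, 1], [2.5, 97.5]))
```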

  14. Analysis of effect factors-based stochastic network planning model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Looking at all the indeterminate factors as a whole and regarding activity durations as independent random variables, traditional stochastic network planning models ignore the inevitable relationship and dependence among activity durations when more than one activity may be affected by the same indeterminate factors. On the basis of an analysis of the indeterminate factors affecting durations, the effect factors-based stochastic network planning (EFBSNP) model is proposed, which emphasizes the effects on the project period of not only logistic and organizational relationships but also the dependence among activity durations caused by shared indeterminate factors. By means of indeterminate factor analysis, the model extracts and quantitatively describes the indeterminate effect factors, and then accounts for their effect on the schedule using the Monte Carlo simulation technique. The method is flexible enough to deal with effect factors and is consistent with practice. Software has been developed in Visual Studio .NET to simplify the model-based calculation. Finally, a case study demonstrates the applicability of the proposed model, and a comparison shows some advantages over existing models.
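
    A minimal sketch of the model's central idea, under assumed numbers: activity durations draw on shared indeterminate factors (so they are dependent rather than independent), and the project period distribution is obtained by Monte Carlo simulation over a toy precedence network.

```python
import numpy as np

rng = np.random.default_rng(1)
n_sim = 20000

# Two shared indeterminate effect factors (e.g. weather, resource availability).
weather = rng.normal(0.0, 1.0, n_sim)
resource = rng.normal(0.0, 1.0, n_sim)

# Activity durations (days): base value + sensitivity to shared factors + independent noise.
# Sharing the factors makes the durations dependent, unlike the classical assumption.
dur_a = 10 + 2.0 * weather + rng.normal(0.0, 1.0, n_sim)
dur_b = 15 + 1.5 * weather + 1.0 * resource + rng.normal(0.0, 1.0, n_sim)
dur_c = 12 + 2.0 * resource + rng.normal(0.0, 1.0, n_sim)

# Toy precedence network: A precedes B, while C runs in parallel with the A-B chain.
project = np.maximum(dur_a + dur_b, dur_c)

print(f"mean project duration : {project.mean():.1f} days")
print(f"90th percentile       : {np.percentile(project, 90):.1f} days")
print(f"correlation of A and B: {np.corrcoef(dur_a, dur_b)[0, 1]:.2f}")
```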

  15. Reliability analysis of cluster-based ad-hoc networks

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Jason L. [Quality Engineering and System Assurance, Armament Research Development Engineering Center, Picatinny Arsenal, NJ (United States); Ramirez-Marquez, Jose Emmanuel [School of Systems and Enterprises, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030 (United States)], E-mail: Jose.Ramirez-Marquez@stevens.edu

    2008-10-15

    The mobile ad-hoc wireless network (MAWN) is a new and emerging network scheme that is being employed in a variety of applications. The MAWN differs from traditional networks because it is a self-forming and dynamic network. The MAWN is free of infrastructure and, as such, only the mobile nodes comprise the network. Pairs of nodes communicate either directly or through other nodes. To do so, each node acts, in turn, as a source, destination, and relay of messages. The virtue of a MAWN is the flexibility this provides; however, this unique feature also poses a challenge for reliability analyses. The variability and volatility of the MAWN configuration make typical reliability methods (e.g. reliability block diagrams) inappropriate because no single structure or configuration represents all manifestations of a MAWN. For this reason, new methods are being developed to analyze the reliability of this new networking technology. Newly published methods adapt to this feature by treating the configuration probabilistically or by including embedded mobility models. This paper joins both methods together and expands upon these works by modifying the problem formulation to address the reliability analysis of a cluster-based MAWN. The cluster-based MAWN is deployed in applications with constraints on networking resources such as bandwidth and energy. This paper presents the problem's formulation, a discussion of applicable reliability metrics for the MAWN, and an illustration of a Monte Carlo simulation method through the analysis of several example networks.
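
    The sketch below illustrates the kind of Monte Carlo reliability estimate discussed here for a MAWN, under simplifying assumptions: node positions are redrawn each trial as a crude stand-in for mobility, nodes fail independently, and two-terminal reliability is the probability that a source and destination remain connected through operational nodes. The parameters and the flat (non-clustered) topology are illustrative only.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(2)

def connected(adj, src, dst, up):
    """Breadth-first search restricted to operational ('up') nodes."""
    if not (up[src] and up[dst]):
        return False
    seen, queue = {src}, deque([src])
    while queue:
        u = queue.popleft()
        if u == dst:
            return True
        for v in adj[u]:
            if up[v] and v not in seen:
                seen.add(v)
                queue.append(v)
    return False

n_nodes, radio_range, p_node_up, n_sim = 20, 0.35, 0.9, 2000
src, dst = 0, n_nodes - 1
hits = 0
for _ in range(n_sim):
    # Redrawing node positions each trial is a crude stand-in for node mobility.
    pos = rng.uniform(0.0, 1.0, (n_nodes, 2))
    dist = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    adj = [np.flatnonzero((dist[i] < radio_range) & (np.arange(n_nodes) != i))
           for i in range(n_nodes)]
    up = rng.uniform(size=n_nodes) < p_node_up        # independent node availability
    hits += connected(adj, src, dst, up)

print(f"estimated two-terminal reliability: {hits / n_sim:.3f}")
```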

  16. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  17. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Full Text Available Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  18. An Efficient Soft Set-Based Approach for Conflict Analysis.

    Directory of Open Access Journals (Sweden)

    Edi Sutoyo

    Full Text Available Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in the conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handle conflict situations, based on some ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory.
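
    A toy illustration of the co-occurrence idea mentioned in the abstract, with made-up voting data: agents' stances on issues are encoded as a Boolean-valued table (a soft set over the issue parameters), and pairwise agreement counts indicate alliance or conflict.

```python
import numpy as np

# Hypothetical Boolean-valued soft set: rows are agents, columns are issues,
# 1 = supports the issue, 0 = opposes it.
agents = ["A1", "A2", "A3", "A4"]
table = np.array([
    [1, 0, 1, 1, 0],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [0, 1, 0, 0, 1],
])
n_issues = table.shape[1]

# Co-occurrence of parameters: count the issues on which two agents take the same stance.
for i in range(len(agents)):
    for j in range(i + 1, len(agents)):
        agree = int(np.sum(table[i] == table[j]))
        relation = "allied" if agree > n_issues / 2 else "in conflict"
        print(f"{agents[i]}-{agents[j]}: agree on {agree}/{n_issues} issues ({relation})")
```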

  19. An Efficient Soft Set-Based Approach for Conflict Analysis

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in the conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handle conflict situations, based on some ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory. PMID:26928627

  20. An Efficient Soft Set-Based Approach for Conflict Analysis.

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in the conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handle conflict situations, based on some ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves up to 3.9% lower computational time than rough set theory.

  1. Service quality measurement. A new approach based on Conjoint Analysis

    Directory of Open Access Journals (Sweden)

    Valerio Gatta

    2013-03-01

    Full Text Available This article is concerned with the measurement of service quality. The main objective is to suggest an alternative criterion for service quality definition and measurement. After a brief description of the most traditional techniques, and with the intent to overcome some critical factors pertaining to them, I focus my attention on choice-based conjoint analysis, a particular stated preferences method that estimates the structure of consumers' preferences given their choices between alternative service options. Discrete choice models and the traditional compensatory utility maximization framework are extended by the inclusion of attribute cutoffs in the decision problem formulation. The major theoretical aspects of the described approach are examined and discussed, showing that it is able to identify the relative importance of the relevant attributes, to calculate elasticities and monetary valuations, and to determine a service quality index. Simulations then enable the identification of potential service quality levels, so that marketing managers have valuable information to plan their best business strategies. We present findings from an empirical study in the public transport sector designed to gain insights into the use of choice-based conjoint analysis.
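
    A minimal sketch of the estimation step behind choice-based conjoint analysis, using synthetic choice data and a standard conditional (multinomial) logit likelihood; the attribute set, sample size and true weights are assumptions, and the attribute-cutoff extension described in the article is not included.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Synthetic stated-choice data: n_sets choice tasks, each offering 3 service
# alternatives described by 2 attributes (say, travel time and fare).
n_sets, n_alts, n_attrs = 500, 3, 2
X = rng.normal(size=(n_sets, n_alts, n_attrs))
beta_true = np.array([-1.0, -0.5])
utility = X @ beta_true + rng.gumbel(size=(n_sets, n_alts))
y = utility.argmax(axis=1)                            # chosen alternative in each task

def neg_log_lik(beta):
    """Conditional (multinomial) logit negative log-likelihood."""
    v = X @ beta
    v = v - v.max(axis=1, keepdims=True)              # numerical stability
    p = np.exp(v) / np.exp(v).sum(axis=1, keepdims=True)
    return -np.log(p[np.arange(n_sets), y]).sum()

result = minimize(neg_log_lik, x0=np.zeros(n_attrs), method="BFGS")
print("estimated attribute weights:", result.x)       # relative importance of the attributes
print("implied trade-off between the two attributes:", result.x[0] / result.x[1])
```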

  2. Emergy analysis of cassava-based fuel ethanol in China

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hui; Chen, Li; Yan, Zongcheng; Wang, Honglin [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou, Guangdong 510640 (China)

    2011-01-15

    Emergy analysis considers both energy quality and energy used in the past, and compensates for the inability of money to value non-market inputs in an objective manner. Its common unit allows all resources to be compared on a fair basis. As feedstock for fuel ethanol, cassava has some advantages over other feedstocks. The production system of cassava-based fuel ethanol (CFE) was evaluated by emergy analysis. The emergy indices for the system of cassava-based fuel ethanol (CFE) are as follows: transformity is 1.10E+5 sej/J, EYR is 1.07, ELR is 2.55, RER is 0.28, and ESI is 0.42. Compared with the emergy indices of wheat ethanol and corn ethanol, CFE is the most sustainable. CFE is a good alternative to substitute for oil in China. Non-renewable purchased emergy accounts for 71.15% of the whole input emergy. The dependence on non-renewable energy increases environmental degradation, making the system less sustainable relative to systems more dependent on renewable energies. For sustainable development, it is vital to reduce the consumption of non-renewable energy in the production of CFE. (author)
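
    The reported indices are linked by simple identities. Assuming the usual definitions (EYR = Y/F, ELR = (N+F)/R, RER = R/Y, with total emergy Y = R + N + F split into renewable R, non-renewable local N, and purchased F), the sustainability index follows directly, as the short check below shows.

```python
# Emergy indices reported in the abstract and the identities that connect them,
# assuming the usual definitions with total emergy Y = R + N + F.
EYR = 1.07   # emergy yield ratio,          Y / F
ELR = 2.55   # environmental loading ratio, (N + F) / R
RER = 0.28   # renewable emergy ratio,      R / Y

ESI = EYR / ELR                                      # emergy sustainability index
print(f"ESI = EYR / ELR = {ESI:.2f}")                # ~0.42, as reported
print(f"ELR implied by RER = (1 - RER) / RER = {(1 - RER) / RER:.2f}")  # ~2.57, consistent with 2.55
```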

  3. Emergy analysis of cassava-based fuel ethanol in China

    International Nuclear Information System (INIS)

    Emergy analysis considers both energy quality and energy used in the past, and compensates for the inability of money to value non-market inputs in an objective manner. Its common unit allows all resources to be compared on a fair basis. As feedstock for fuel ethanol, cassava has some advantages over other feedstocks. The production system of cassava-based fuel ethanol (CFE) was evaluated by emergy analysis. The emergy indices for the system of cassava-based fuel ethanol (CFE) are as follows: transformity is 1.10E+5 sej/J, EYR is 1.07, ELR is 2.55, RER is 0.28, and ESI is 0.42. Compared with the emergy indices of wheat ethanol and corn ethanol, CFE is the most sustainable. CFE is a good alternative to substitute for oil in China. Non-renewable purchased emergy accounts for 71.15% of the whole input emergy. The dependence on non-renewable energy increases environmental degradation, making the system less sustainable relative to systems more dependent on renewable energies. For sustainable development, it is vital to reduce the consumption of non-renewable energy in the production of CFE. (author)

  4. Glyph-Based Video Visualization for Semen Analysis

    KAUST Repository

    Duffy, Brian

    2015-08-01

    © 2013 IEEE. The existing efforts in computer assisted semen analysis have been focused on high speed imaging and automated image analysis of sperm motility. This results in a large amount of data, and it is extremely challenging for both clinical scientists and researchers to interpret, compare and correlate the multidimensional and time-varying measurements captured from video data. In this work, we use glyphs to encode a collection of numerical measurements taken at a regular interval and to summarize spatio-temporal motion characteristics using static visual representations. The design of the glyphs addresses the needs for (a) encoding some 20 variables using separable visual channels, (b) supporting scientific observation of the interrelationships between different measurements and comparison between different sperm cells and their flagella, and (c) facilitating the learning of the encoding scheme by making use of appropriate visual abstractions and metaphors. As a case study, we focus this work on video visualization for computer-aided semen analysis, which has a broad impact on both biological sciences and medical healthcare. We demonstrate that glyph-based visualization can serve as a means of external memorization of video data as well as an overview of a large set of spatiotemporal measurements. It enables domain scientists to make scientific observation in a cost-effective manner by reducing the burden of viewing videos repeatedly, while providing them with a new visual representation for conveying semen statistics.

  5. A Web-Based Development Environment for Collaborative Data Analysis

    International Nuclear Information System (INIS)

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable

  6. Experimental investigation of thermal neutron analysis based landmine detection technology

    International Nuclear Information System (INIS)

    Background: Recently, the prompt gamma-ray neutron activation analysis method has been widely used in coal analysis and explosive detection; however, there have been few applications of neutron methods to landmine detection, especially in domestic research. Purpose: To verify the feasibility of the Thermal Neutron Analysis (TNA) method for landmine detection and to explore the characteristics of this technology. Methods: An experimental TNA landmine detection system was built based on a LaBr3(Ce) fast scintillator detector and a 252Cf isotope neutron source. The system comprises the thermal neutron transition system, the shield system, and the detector system. Results: On the basis of the TNA, the wide-energy-area calibration method, especially for the high-energy region, was investigated, and the minimum detection time for a typical mine was determined. In this study, the 72-type anti-tank mine, a 500 g TNT sample, and several interfering objects were tested in loess, red soil, magnetic soil, and sand, respectively. Conclusions: The experimental results indicate that TNA is a reliable demining method, and it can be used to confirm the existence of Anti-Tank Mines (ATM) and large Anti-Personnel Mines (APM) in complicated conditions. (authors)

  7. Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis

    Science.gov (United States)

    Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.

    2015-01-01

    This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with the discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. The ROM exhibits excellent agreement with the full-scale analysis in spatiotemporal thermal profiles. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce development cycle times and costs.
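
    A minimal sketch of the POD step only, on synthetic snapshot data: the dominant left singular vectors of the snapshot matrix form the reduced basis, and a full-scale thermal field is projected onto a handful of modal coordinates. The DEIM and TPWL treatment of the nonlinear radiative terms is not shown, and all data are made up.

```python
import numpy as np

rng = np.random.default_rng(4)

# Snapshot matrix: each column is the full-scale temperature field at one time step
# (synthetic data standing in for the ODE/DAE thermal model output).
n_nodes, n_snap = 2000, 200
t = np.linspace(0.0, 1.0, n_snap)
shapes = rng.normal(size=(n_nodes, 3))
snapshots = (shapes[:, :1] * np.sin(2 * np.pi * t)
             + shapes[:, 1:2] * np.cos(4 * np.pi * t)
             + 0.1 * shapes[:, 2:3] * t
             + 0.01 * rng.normal(size=(n_nodes, n_snap)))

# POD: left singular vectors of the mean-centred snapshot matrix are the basis modes.
mean = snapshots.mean(axis=1, keepdims=True)
U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999)) + 1           # modes capturing 99.9% of the energy
basis = U[:, :r]

# Reduced representation: project a full field onto r modal coordinates and reconstruct.
x_full = snapshots[:, [50]]
x_reduced = basis.T @ (x_full - mean)                  # r numbers instead of n_nodes
x_rec = basis @ x_reduced + mean
err = np.linalg.norm(x_rec - x_full) / np.linalg.norm(x_full)
print(f"retained modes r = {r}, reconstruction error = {err:.2e}")
```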

  8. Graph-Based Analysis of Nuclear Smuggling Data

    International Nuclear Information System (INIS)

    Much of the data that is collected and analyzed today is structural, consisting not only of entities but also of relationships between the entities. As a result, analysis applications rely upon automated structural data mining approaches to find patterns and concepts of interest. This ability to analyze structural data has become a particular challenge in many security-related domains. In these domains, focusing on the relationships between entities in the data is critical to detect important underlying patterns. In this study we apply structural data mining techniques to automate analysis of nuclear smuggling data. In particular, we choose to model the data as a graph and use graph-based relational learning to identify patterns and concepts of interest in the data. In this paper, we identify the analysis questions that are of importance to security analysts and describe the knowledge representation and data mining approach that we adopt for this challenge. We analyze the results using the Russian nuclear smuggling event database.

  9. Kinematics Analysis Based on Screw Theory of a Humanoid Robot

    Institute of Scientific and Technical Information of China (English)

    MAN Cui-hua; FAN Xun; LI Cheng-rong; ZHAO Zhong-hui

    2007-01-01

    A humanoid robot is a complex dynamic system with its own idiosyncrasies. This paper aims to provide a mathematical and theoretical foundation for the configuration design and kinematics analysis of a novel humanoid robot, which has a simplified configuration designed for entertainment purposes. The design methods, principle and mechanism are discussed. According to the design goals of this research, there are ten degrees of freedom in the two bionic arms. Modularization, concurrent design and extension theory methods were adopted in the configuration study, and screw theory was introduced into the analysis of humanoid robot kinematics. Comparisons with other methods show that: 1) only two coordinate frames need to be established in the kinematics analysis of a humanoid robot based on screw theory; 2) the spatial manipulator Jacobian obtained by using the twist and product-of-exponentials formula is succinct and legible; 3) adopting screw theory for the kinematics of the humanoid robot's arms avoids singularities; 4) screw theory can resolve the problem of insufficient specification.
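
    A small sketch of the product-of-exponentials idea from screw theory on an assumed planar two-joint arm: each revolute joint contributes the exponential of a twist, and the end-effector pose is the product of these exponentials with the home configuration. The axes, link lengths and angles are hypothetical; for the values below the printed position matches the classical two-link forward kinematics x = cos(theta1) + cos(theta1 + theta2), y = sin(theta1) + sin(theta1 + theta2).

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix of a 3-vector."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def twist_exp(w, v, theta):
    """Exponential of a unit revolute twist (w, v) for joint angle theta -> 4x4 transform."""
    W = skew(w)
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * W @ W   # Rodrigues' formula
    p = (np.eye(3) - R) @ (W @ v) + np.outer(w, w) @ v * theta
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, p
    return T

# Planar two-joint arm: both axes along z, passing through q1 and q2 (so v = -w x q).
w1, q1 = np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 0.0])
w2, q2 = np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0])
g_home = np.eye(4)
g_home[0, 3] = 2.0                                     # end-effector home position

theta1, theta2 = np.pi / 4, np.pi / 6
g = (twist_exp(w1, -np.cross(w1, q1), theta1)
     @ twist_exp(w2, -np.cross(w2, q2), theta2)
     @ g_home)
print("end-effector position:", g[:3, 3])
```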

  10. A Web-Based Development Environment for Collaborative Data Analysis

    CERN Document Server

    Erdmann, M; Glaser, C; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steggemann, J; Urban, M; Winchen, T

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and a...

  11. Structural Optimization of Slender Robot Arm Based on Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zhong Luo

    2012-01-01

    Full Text Available An effective structural optimization method based on a sensitivity analysis is proposed to optimize the variable section of a slender robot arm. The structural mechanism and the operating principle of a polishing robot are introduced first, and its stiffness model is established. Then, a design sensitivity analysis method and a sequential linear programming (SLP) strategy are developed. At the beginning of the optimization, the design sensitivity analysis method is applied to select the sensitive design variables, which makes the optimized results more efficient and accurate. In addition, it can also be used to determine the scale of the moving step, which improves convergence during the optimization process. The design sensitivities are calculated using the finite difference method. The search for the final optimal structure is performed using the SLP method. Simulation results show that the proposed structural optimization method is effective in enhancing the stiffness of the robot arm regardless of whether the robot arm is subjected to a constant force or variable forces.
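
    A minimal sketch of the sensitivity-screening step, with a stand-in compliance function rather than the robot-arm stiffness model: central finite differences rank the design variables, and only the most sensitive ones are moved within a small SLP-style move limit. All functions and thresholds here are assumptions.

```python
import numpy as np

def compliance(x):
    """Stand-in structural response (smaller is stiffer); not the robot-arm model."""
    return 1.0 / x[0] ** 3 + 2.0 / x[1] ** 3 + 0.5 / x[2] ** 3

def fd_sensitivities(f, x, h=1e-4):
    """Central finite-difference sensitivities df/dx_i."""
    g = np.zeros_like(x)
    for i in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp[i] += h
        xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2 * h)
    return g

x = np.array([1.0, 1.0, 1.0])                 # section design variables
g = fd_sensitivities(compliance, x)
print("sensitivities:", g)

# Screen out insensitive variables and take one linearized (SLP-style) step within a move limit.
keep = np.abs(g) > 0.6 * np.abs(g).max()
step = np.zeros_like(x)
step[keep] = -0.05 * np.sign(g[keep])         # move limit of 0.05 per iteration
print("updated design:", x + step)
```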

  12. Dermoscopy analysis of RGB-images based on comparative features

    Science.gov (United States)

    Myakinin, Oleg O.; Zakharov, Valery P.; Bratchenko, Ivan A.; Artemyev, Dmitry N.; Neretin, Evgeny Y.; Kozlov, Sergey V.

    2015-09-01

    In this paper, we propose an algorithm for color and texture analysis of dermoscopic images of human skin based on Haar wavelets, Local Binary Patterns (LBP) and Histogram Analysis. This approach is a modification of the «7-point checklist» clinical method. It is therefore an "absolute" diagnostic method, because it uses only features extracted from the tumor's ROI (Region of Interest), which can be selected manually and/or using a special algorithm. We propose additional features extracted from the same image for a comparative analysis of tumor and healthy skin. We used the Euclidean distance, cosine similarity, and Tanimoto coefficient as comparison metrics between the color and texture features extracted separately from the tumor's and the healthy skin's ROI. A classifier for separating melanoma images from other tumors has been built using the SVM (Support Vector Machine) algorithm. Classification errors with and without the comparative features between skin and tumor have been analyzed, and a significant increase in recognition quality with the comparative features has been demonstrated. Moreover, we analyzed two modes (manual and automatic) for selecting the ROI on tumor and healthy skin areas. We reached 91% sensitivity using the comparative features, in contrast with 77% sensitivity using only the "absolute" method. The specificity was unchanged (94%) in both cases.
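
    The three comparison metrics named in the abstract are straightforward to compute; the sketch below applies them to hypothetical feature vectors from a tumor ROI and a healthy-skin ROI to form the comparative features.

```python
import numpy as np

def euclidean(a, b):
    return np.linalg.norm(a - b)

def cosine_similarity(a, b):
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def tanimoto(a, b):
    return np.dot(a, b) / (np.dot(a, a) + np.dot(b, b) - np.dot(a, b))

# Hypothetical colour/texture feature vectors from the tumour ROI and a healthy-skin ROI.
tumour_feats = np.array([0.42, 0.13, 0.77, 0.31, 0.55])
skin_feats = np.array([0.40, 0.10, 0.20, 0.28, 0.52])

comparative = [euclidean(tumour_feats, skin_feats),
               cosine_similarity(tumour_feats, skin_feats),
               tanimoto(tumour_feats, skin_feats)]
# These three values would be appended to the "absolute" ROI features before SVM training.
print("comparative features (distance, cosine, Tanimoto):", np.round(comparative, 3))
```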

  13. Cost Risk Analysis Based on Perception of the Engineering Process

    Science.gov (United States)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering

  14. Technoeconomic analysis of a biomass based district heating system

    International Nuclear Information System (INIS)

    This paper discussed a proposed biomass-based district heating system to be built for the Pictou Landing First Nation Community in Nova Scotia. The community centre consists of 6 buildings and a connecting arcade. The methodology used to size and design heating, ventilating and air conditioning (HVAC) systems, as well as biomass district energy systems (DES), was discussed. Annual energy requirements and biomass fuel consumption predictions were presented, along with cost estimates. A comparative assessment of the system with that of a conventional oil-fired system was also conducted. It was suggested that the design and analysis methodology could be used for any similar application. The buildings were modelled and simulated using the Hourly Analysis Program (HAP), a detailed 2-in-1 software program which can be used both for HVAC system sizing and building energy consumption estimation. A techno-economic analysis was conducted to justify the viability of the biomass combustion system. Heating load calculations were performed assuming that the thermostat was set constantly at 22 degrees C. Community centre space heating loads due to individual envelope components for 3 different scenarios were summarized, as the design architecture for the buildings was not yet finalized. It was suggested that efforts should be made to ensure air-tightness and insulation levels of the interior arcade glass wall. A hydronic distribution system with baseboard space heating units was selected, comprising a woodchip boiler, hot water distribution system, convective heating units and control systems. The community has its own logging operation which will provide the wood fuel required by the proposed system. An outline of the annual allowable harvest covered by the Pictou Landing Forestry Management Plan was presented, with details of proposed wood-chippers for the creation of biomass. It was concluded that the woodchip combustion system is economically preferable to the

  15. Dynamic chest image analysis: model-based pulmonary perfusion analysis with pyramid images

    Science.gov (United States)

    Liang, Jianming; Haapanen, Arto; Jaervi, Timo; Kiuru, Aaro J.; Kormano, Martti; Svedstrom, Erkki; Virkki, Raimo

    1998-07-01

    The aim of the study 'Dynamic Chest Image Analysis' is to develop computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected at different phases of the respiratory/cardiac cycles in a short period of time. We have proposed a framework for ventilation study with an explicit ventilation model based on pyramid images. In this paper, we extend the framework to pulmonary perfusion study. A perfusion model and the truncated pyramid are introduced. The perfusion model aims at extracting accurate, geographic perfusion parameters, and the truncated pyramid helps in understanding perfusion at multiple resolutions and speeding up the convergence process in optimization. Three cases are included to illustrate the experimental results.

  16. Web Based Image Retrieval System Using Color, Texture and Shape Analysis: Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Amol P Bhagat

    2013-09-01

    Full Text Available The internet is one of the best media for disseminating scientific and technological research results [1, 2, 6]. This work deals with the implementation of a web-based extensible architecture that is easily integrated with applications written in different languages and linkable with different data sources. The architecture developed here is expandable and modular; its client-server functionality permits easily building web applications that can be run using any Internet browser without compatibility problems regarding the platform, programs or operating system installed. This paper presents the implementation of Content Based Image Retrieval using different methods of color, texture and shape analysis. The primary objective is to compare the different methods of image analysis.

  17. Meta-Analysis of Soybean-based Biodiesel.

    Science.gov (United States)

    Sieverding, Heidi L; Bailey, Lisa M; Hengen, Tyler J; Clay, David E; Stone, James J

    2015-07-01

    Biofuel policy changes in the United States have renewed interest in soybean [Glycine max (L.) Merr.] biodiesel. Past studies with varying methodologies and functional units can provide valuable information for future work. A meta-analysis of nine peer-reviewed soybean life cycle analysis (LCA) biodiesel studies was conducted on the northern Great Plains in the United States. Results of LCA studies were assimilated into a standardized system boundary and functional units for global warming (GWP), eutrophication (EP), and acidification (AP) potentials using biodiesel conversions from peer-reviewed and government documents. Factors not fully standardized included variations in N2O accounting, mid- or end-point impacts, land use change, allocation, and statistical sampling pools. A state-by-state comparison of GWP lower and higher heating values (LHV, HHV) showed differences attributable to variations in spatial sampling and agricultural practices (e.g., tillage, irrigation). The mean GWP of LHV was 21.1 g CO2-eq per MJ including outliers, and the median EP LHV and AP LHV were 0.019 g PO4-eq per MJ and 0.17 g SO2-eq per MJ, respectively, using the limited data available. An LCA case study of South Dakota soybean-based biodiesel production resulted in GWP estimates (29 or 31 g CO2-eq per MJ; 100% mono alkyl esters [first generation] biodiesel or 100% fatty acid methyl ester [second generation] biodiesel) similar to the meta-analysis results (30.1 g CO2-eq per MJ). Meta-analysis mean results, including outliers, resemble the California Low Carbon Fuel Standard default value for soybean biodiesel without land use change of 21.25 g CO2-eq per MJ. Results were influenced by resource investment differences in water, fertilizer (e.g., type, application), and tillage. Future biofuel LCA studies should include these important factors to better define reasonable energy variations in regional agricultural management practices. PMID:26437085

  18. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  19. Analysis on electric energy measuring method based on multi-resolution analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiao-bing; CUI Jia-rui; LIANG Yuan-hua; WANG Mu-kun

    2006-01-01

    Along with the massive application of non-linear and impact loads, many non-stationary stochastic signals such as harmonics, inter-harmonics and impulse signals are introduced into the electric network, and these non-stationary stochastic signals affect the accuracy of electric energy measurement. Traditional methods such as Fourier analysis can be applied efficiently to stationary stochastic signals, but they are of little use for non-stationary stochastic signals. In light of this, the form of the electric network signals in the wavelet domain is discussed in this paper, and a method for measuring active power based on multi-resolution analysis of the stochastic process is presented. This method has a wider scope of application than traditional Fourier analysis, and it is of practical value for improving existing electric energy measurement.
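
    A minimal sketch of the idea, under the assumption of an orthonormal wavelet basis (a hand-written Haar transform here): because the transform preserves inner products, the active power computed from voltage and current samples equals the sum of per-level products of wavelet coefficients, so the contribution of each resolution level can be read off separately. The waveform parameters are made up.

```python
import numpy as np

def haar_dwt(x, levels):
    """Orthonormal multilevel Haar DWT; returns [approximation, detail_L, ..., detail_1]."""
    coeffs, a = [], np.asarray(x, dtype=float)
    for _ in range(levels):
        even, odd = a[0::2], a[1::2]
        coeffs.append((even - odd) / np.sqrt(2))        # detail coefficients
        a = (even + odd) / np.sqrt(2)                    # approximation coefficients
    coeffs.append(a)
    return coeffs[::-1]

# Synthetic voltage and current: 50 Hz fundamental, a 5th harmonic, and a short impulse.
fs, f0 = 3200, 50
n = np.arange(fs // f0 * 16)                             # 16 cycles -> 1024 samples
v = 311 * np.sin(2 * np.pi * f0 * n / fs) + 15 * np.sin(2 * np.pi * 5 * f0 * n / fs)
i = 10 * np.sin(2 * np.pi * f0 * n / fs - 0.3) + 1.0 * np.sin(2 * np.pi * 5 * f0 * n / fs)
i[500:504] += 8.0                                         # impulsive (non-stationary) disturbance

# The Haar basis is orthonormal, so sum(v*i) equals the sum of coefficient products,
# and active power can be accumulated level by level.
levels = 5
vc, ic = haar_dwt(v, levels), haar_dwt(i, levels)
p_levels = [np.dot(a, b) / v.size for a, b in zip(vc, ic)]
print("active power, time domain   :", round(np.dot(v, i) / v.size, 3))
print("active power, wavelet domain:", round(sum(p_levels), 3))
print("per-level contributions     :", np.round(p_levels, 3))
```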

  20. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  1. Aerodynamic flight evaluation analysis and data base update

    Science.gov (United States)

    Boyle, W. W.; Miller, M. S.; Wilder, G. O.; Reheuser, R. D.; Sharp, R. S.; Bridges, G. I.

    1989-01-01

    Research was conducted to determine the feasibility of replacing the Solid Rocket Boosters on the existing Space Shuttle Launch Vehicle (SSLV) with Liquid Rocket Boosters (LRB). As a part of the LRB selection process, a series of wind tunnel tests were conducted along with aero studies to determine the effects of different LRB configurations on the SSLV. Final results were tabulated into increments and added to the existing SSLV data base. The research conducted in this study was taken from a series of wind tunnel tests conducted at Marshall's 14-inch Trisonic Wind Tunnel. The effects on the axial force (CAF), normal force (CNF), pitching moment (CMF), side force (CY), wing shear force (CSR), wing torque moment (CTR), and wing bending moment (CBR) coefficients were investigated for a number of candidate LRB configurations. The aero effects due to LRB protuberances, ET/LRB separation distance, and aft skirts were also gathered from the tests. Analysis was also conducted to investigate the base pressure and plume effects due to the new booster geometries. The test results found in Phases 1 and 2 of wind tunnel testing are discussed and compared. Preliminary LRB lateral/directional data results and trends are given. The protuberance and gap/skirt effects are discussed. The base pressure/plume effects study is discussed and results are given.

  2. Subpathway Analysis based on Signaling-Pathway Impact Analysis of Signaling Pathway.

    Directory of Open Access Journals (Sweden)

    Xianbin Li

    Full Text Available Pathway analysis is a common approach to gain insight from biological experiments. Signaling-pathway impact analysis (SPIA) is one such method and combines both classical enrichment analysis and the actual perturbation on a given pathway. Because this method treats each pathway as a whole, its resolution is generally not very high, since the differentially expressed genes may be enriched in only a local region of the pathway. In the present work, to identify cancer-related pathways, we incorporated a recent subpathway analysis method into the SPIA method to form the "sub-SPIA method." The original subpathway analysis uses the k-clique structure to define a subpathway. However, it is not sufficiently flexible to capture subpathways with complex structure and usually results in many overlapping subpathways. We therefore propose using the minimal-spanning-tree structure to find a subpathway. We apply this approach to colorectal cancer and lung cancer datasets, and our results show that sub-SPIA can identify many significant pathways associated with each specific cancer that other methods miss. Based on the entire pathway network in the Kyoto Encyclopedia of Genes and Genomes, we find that the pathways identified by sub-SPIA not only have the largest average degree, but also are more closely connected than those identified by other methods. This result suggests that the abnormality signal propagating through them might be responsible for the specific cancer or disease.
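
    A toy sketch of the minimal-spanning-tree structure used to define a subpathway: given a small weighted gene-interaction graph (hypothetical genes and weights, not from the paper), Prim's algorithm returns the tree that connects all genes with minimal total edge weight.

```python
import heapq

# Hypothetical weighted gene-interaction graph within one pathway
# (edge weight ~ interaction distance between genes).
edges = {
    ("TP53", "MDM2"): 1.0, ("TP53", "CDKN1A"): 1.5, ("MDM2", "CDKN1A"): 2.5,
    ("CDKN1A", "CCND1"): 1.0, ("CCND1", "CDK4"): 0.5, ("TP53", "CDK4"): 3.0,
}
graph = {}
for (u, v), w in edges.items():
    graph.setdefault(u, []).append((w, v))
    graph.setdefault(v, []).append((w, u))

def prim_mst(graph, start):
    """Prim's algorithm: minimal-spanning-tree edges of a connected weighted graph."""
    visited, tree = {start}, []
    heap = [(w, start, v) for w, v in graph[start]]
    heapq.heapify(heap)
    while heap and len(visited) < len(graph):
        w, u, v = heapq.heappop(heap)
        if v in visited:
            continue
        visited.add(v)
        tree.append((u, v, w))
        for w2, v2 in graph[v]:
            if v2 not in visited:
                heapq.heappush(heap, (w2, v, v2))
    return tree

# The tree connects all genes of interest with minimal total weight and defines the subpathway.
print(prim_mst(graph, "TP53"))
```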

  3. X-ray Rietveld analysis with a physically based background

    International Nuclear Information System (INIS)

    On the basis of known equations for calculating X-ray diffraction intensities from a given number of unit cells of a crystal phase in polycrystalline material, as due to: (i) Bragg reflections; (ii) average diffuse scattering caused by thermal plus first-kind disorder; and (iii) incoherent scattering, a relationship has been found that ties, in the Rietveld analysis, the Bragg scale factor to a scale factor for 'disorder' as well as incoherent scattering. Instead of fitting the background with a polynomial function, it becomes possible to describe the background by physically based equations. Air scattering is included in the background simulation. By this means, the refinement can be carried out with fewer parameters (six fewer than when a fifth-order polynomial is used). The DBWS-9006PC computer program written by Sakthivel and Young [(1990), Georgia Institute of Technology, Atlanta, GA, USA] has been modified to follow this approach and it has been used to refine the crystal structures of the cubic form of Y2O3 and of α-Al2O3. Peak asymmetry has been described by a function based on an exponential approximation. The results from refinements using the polynomial and the physically based background functions are, in terms of final structural parameters and reliability indices, very close to each other and in agreement with results reported in the literature. The reconstruction and optimization of the background scattering by means of physically based equations helps the implementation in the Rietveld code of other possible specific diffuse scattering contributions, such as that due to an amorphous phase. (orig.)

  4. Partner Selection Analysis and System Development Based on Gray Relation Analysis for an Agile Virtual Enterprise

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This paper analyzes the state of the art of partner selection and enumerates the advantages of partner selection based on gray relational analysis compared with other partner selection algorithms. Furthermore, a partner selection system based on gray relational analysis for an Agile Virtual Enterprise (AVE) is analyzed and designed based on the definition and characteristics of the AVE. Following the J2EE model, the architecture of the partner selection system is put forward, and the system is developed using JSP, EJB and SQL Server. The paper emphasizes the gray relational mathematical model, the AVE evaluation infrastructure, the core algorithm of partner selection and a multi-layer gray relational selection process.
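
    A minimal sketch of the gray relational analysis step for partner selection, with a hypothetical decision matrix: criteria are normalized, gray relational coefficients are computed against the ideal reference sequence, and candidates are ranked by their gray relational grade. The criteria, weights and candidate values are made up.

```python
import numpy as np

# Hypothetical candidate-partner decision matrix (rows: candidates, columns: criteria).
# Criteria: quality, delivery reliability, cost; cost is a "smaller is better" criterion.
data = np.array([
    [0.85, 0.90, 120.0],
    [0.78, 0.95, 100.0],
    [0.92, 0.80, 140.0],
])
benefit = np.array([True, True, False])

# Normalize each criterion to [0, 1] so that larger is always better.
lo, hi = data.min(axis=0), data.max(axis=0)
norm = np.where(benefit, (data - lo) / (hi - lo), (hi - data) / (hi - lo))

# Gray relational coefficients against the ideal reference sequence (all ones).
rho = 0.5                                      # distinguishing coefficient
delta = np.abs(1.0 - norm)
coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())

# Gray relational grade: mean coefficient per candidate; the highest grade ranks first.
grade = coef.mean(axis=1)
print("gray relational grades:", np.round(grade, 3))
print("best candidate index  :", int(grade.argmax()))
```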

  5. Rasch model based analysis of the Force Concept Inventory

    Directory of Open Access Journals (Sweden)

    Maja Planinic

    2010-03-01

    Full Text Available The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables the construction of linear measures for persons and items from raw test scores and which can provide important insight into the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from the large-scale research conducted in 2006-07, which investigated Croatian high school students' conceptual understanding of mechanics on a representative sample of 1676 students (age 17–18 years). The instrument used in the research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N=141, average FCI score of 64.5%) of first-year students enrolled in the introductory physics course at the University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further

  6. GIS application on spatial landslide analysis using statistical based models

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island in Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
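
    A minimal sketch of the frequency ratio model for one categorical factor, on synthetic raster data: the ratio of a class's share of landslide cells to its share of all cells gives the class weight, and per-cell susceptibility is obtained by mapping (and, with several factors, summing) these weights. Class counts and rates are made up.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic raster: one categorical factor (e.g. four land-cover classes) and a landslide mask.
n_cells = 100_000
factor = rng.integers(0, 4, n_cells)
true_rates = np.array([0.002, 0.010, 0.020, 0.005])
landslide = rng.uniform(size=n_cells) < true_rates[factor]

# Frequency ratio per class: (class share of landslide cells) / (class share of all cells).
fr = np.zeros(4)
for c in range(4):
    in_class = factor == c
    fr[c] = (landslide[in_class].sum() / landslide.sum()) / (in_class.sum() / n_cells)
print("frequency ratios per class:", np.round(fr, 2))

# Per-cell susceptibility index: the frequency ratio of the cell's class; with several
# factors, the per-factor ratios would be summed cell by cell to build the map.
susceptibility = fr[factor]
print("example susceptibility values:", np.round(susceptibility[:5], 2))
```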

  7. Clinical gait data analysis based on Spatio-Temporal features

    CERN Document Server

    Katiyar, Rohit

    2010-01-01

    Analysing human gait has found considerable interest in recent computer vision research. So far, however, contributions to this topic exclusively dealt with the tasks of person identification or activity recognition. In this paper, we consider a different application for gait analysis and examine its use as a means of deducing the physical well-being of people. The proposed method is based on transforming the joint motion trajectories using wavelets to extract spatio-temporal features which are then fed as input to a vector quantiser; a self-organising map for classification of walking patterns of individuals with and without pathology. We show that our proposed algorithm is successful in extracting features that successfully discriminate between individuals with and without locomotion impairment.

  8. Cladistic analysis of iridoviruses based on protein and DNA sequences.

    Science.gov (United States)

    Wang, J W; Deng, R Q; Wang, X Z; Huang, Y S; Xing, K; Feng, J H; He, J G; Long, Q X

    2003-11-01

    Cladograms of iridoviruses were inferred from bootstrap analysis of molecular data sets comprising all published protein and DNA sequences of the major capsid protein, ATPase and DNA polymerase genes of members of the family Iridoviridae. All data sets yielded cladograms supporting the separation of the Iridovirus, Ranavirus and Lymphocystivirus genera, and the cladogram based on data derived from major capsid proteins further divided both the Iridovirus and Ranavirus genera into two groups. Tests of alternative hypotheses of topological constraints were also performed to further investigate relationships between infectious spleen and kidney necrosis virus (ISKNV), an unclassified fish iridovirus for which complete genome sequence data are available, and other iridoviruses. Cladograms inferred and results of Shimodaira-Hasegawa tests indicated that ISKNV is more closely related to the Ranavirus genus than it is to the other genera of the family.

  9. A Frame-Based Analysis of Synaesthetic Metaphors

    Directory of Open Access Journals (Sweden)

    Hakan Beseoglu

    2008-08-01

    Full Text Available The aim of this paper is to use a frame-based account to explain some empirical findings regarding the accessibility of synaesthetic metaphors. Therefore, some results of empirical studies will be discussed with regard to the question of how much it matters whether the concept of the source domain in a synaesthetic metaphor is a scalar or a quality concept. Furthermore, typed frames are introduced, and it is explained how the notion of a minimal upper attribute can be used in the analysis of adjective-noun compounds. Finally, frames are used to analyze synaesthetic metaphors; it turns out that they offer an adequate basis for the explanation of different accessibility rates found in empirical studies.

  10. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  11. Quaternion-based discriminant analysis method for color face recognition.

    Science.gov (United States)

    Xu, Yong

    2012-01-01

    Pattern recognition techniques have been used to automatically recognize objects and personal identities, predict the function of proteins and the category of cancers, identify lesions, perform product inspection, and so on. In this paper we propose a novel quaternion-based discriminant method. This method represents and classifies color images in a simple and mathematically tractable way. The proposed method is suitable for a large variety of real-world applications such as color face recognition and classification of ground targets shown in multispectral remote sensing images. This method first uses a quaternion number to denote each pixel in the color image and exploits a quaternion vector to represent the color image. This method then uses the linear discriminant analysis algorithm to transform the quaternion vector into a lower-dimensional quaternion vector and classifies it in this space. The experimental results show that the proposed method can obtain a very high accuracy for color face recognition. PMID:22937054

  12. Performance analysis of charge plasma based dual electrode tunnel FET

    Science.gov (United States)

    Anand, Sunny; Intekhab Amin, S.; Sarin, R. K.

    2016-05-01

    This paper proposes the charge plasma based dual electrode doping-less tunnel FET (DEDLTFET). The paper compares the device performance of the conventional doping-less TFET (DLTFET) and the doped TFET (DGTFET). DEDLTFET gives superior results, with a high ON-state current (ION ∼ 0.56 mA/μm), an ION/IOFF ratio of ∼ 9.12 × 10¹³ and an average subthreshold swing (AV-SS ∼ 48 mV/dec). Variations of different device parameters such as channel length, gate oxide material, gate oxide thickness, silicon thickness, gate work function and temperature are studied and compared with DLTFET and DGTFET. Through this extensive analysis it is found that DEDLTFET shows better performance than the other two devices, which points to an excellent future in low power applications.

  13. Maintenance management of railway infrastructures based on reliability analysis

    International Nuclear Information System (INIS)

    Railway infrastructure maintenance plays a crucial role in rail transport. It aims at guaranteeing the safety of operations and the availability of railway tracks and related equipment for traffic regulation. Moreover, it is one of the major costs of rail transport operations. Thus, the increased competition in the traffic market calls for maintenance improvements aimed at reducing maintenance expenditures while keeping operations safe. This issue is addressed by the methodology presented in the paper. The first step of the methodology consists of a family-based approach for the equipment reliability analysis; its purpose is the identification of families of railway items which can be given the same reliability targets. The second step builds the reliability model of the railway system for identifying the most critical items, given a required service level for the transportation system. The two methods have been implemented and tested in practical case studies, in the context of Rete Ferroviaria Italiana, the Italian public limited company for railway transportation.

  14. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    Directory of Open Access Journals (Sweden)

    Ilse Cervantes

    2013-02-01

    Full Text Available This article proposes a virtual sensor for piecewise linear systems based on an observability analysis that is a function of a commutation law related to the system’s output. This virtual sensor is also known as a state estimator. In addition, it presents a detector of the active mode when the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their outputs to be estimated. A methodology for testing the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the obtained results.

  15. SILAC-based comparative analysis of pathogenic Escherichia coli secretomes

    DEFF Research Database (Denmark)

    Boysen, Anders; Borch, Jonas; Krogh, Thøger Jensen;

    2015-01-01

    proteome analysis have the potential to discover both classes of proteins and hence form an important tool for discovering therapeutic targets. Adherent-invasive Escherichia coli (AIEC) and Enterotoxigenic E. coli (ETEC) are pathogenic variants of E. coli which cause intestinal disease in humans. AIEC......-term protection are still needed. In order to identify proteins with therapeutic potential, we have used mass spectrometry-based Stable Isotope Labeling with Amino acids in Cell culture (SILAC) quantitative proteomics method which allows us to compare the proteomes of pathogenic strains to commensal E. coli....... In this study, we grew the pathogenic strains ETEC H10407, AIEC LF82 and the non-pathogenic reference strain E. coli K-12 MG1655 in parallel and used SILAC to compare protein levels in OMVs and culture supernatant. We have identified well-known virulence factors from both AIEC and ETEC, thus validating our...

  16. UNRAVELING ECOTOURISM PRACTICE:PROBLEM ANALYSIS BASED ON STAKEHOLDERS

    Institute of Scientific and Technical Information of China (English)

    LIU Xue-mei; BAO Ji-gang

    2004-01-01

    Despite the considerable literature defining what Ecotourism is or should be, it is practiced in various ways with different features. The term "Ecotourism" is now applied to almost all tourism activities that are based on nature. Faced with this flood of unqualified Ecotourism, it is of great necessity to put forward a professional claim. The present writer holds that the key to realizing rigorous Ecotourism chiefly lies in the relationships among the different interest groups involved in it. The focus of this paper is therefore an analysis of the interest relations between those stakeholders, which include local government, tour operators, local residents and eco-tourists, thus helping to find out what is wrong in unqualified Ecotourism and the roots of those problems.

  17. Visual traffic jam analysis based on trajectory data.

    Science.gov (United States)

    Wang, Zuchao; Lu, Min; Yuan, Xiaoru; Zhang, Junping; van de Wetering, Huub

    2013-12-01

    In this work, we present an interactive system for visual analysis of urban traffic congestion based on GPS trajectories. For these trajectories we develop strategies to extract and derive traffic jam information. After cleaning the trajectories, they are matched to a road network. Subsequently, traffic speed on each road segment is computed and traffic jam events are automatically detected. Spatially and temporally related events are concatenated in, so-called, traffic jam propagation graphs. These graphs form a high-level description of a traffic jam and its propagation in time and space. Our system provides multiple views for visually exploring and analyzing the traffic condition of a large city as a whole, on the level of propagation graphs, and on road segment level. Case studies with 24 days of taxi GPS trajectories collected in Beijing demonstrate the effectiveness of our system.

  18. First law-based thermodynamic analysis on Kalina cycle

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Based on the first law of thermodynamics, and adopting the Peng-Robinson equation (P-R equation) as the basic equation for the properties of ammonia-water mixtures, a thermodynamic analysis of a single-stage distillation Kalina cycle is presented. A program to calculate the thermodynamic properties of ammonia-water mixtures and a program to calculate the performance of Kalina cycles were developed, with which the heat-work conversion characteristics of Kalina cycles were theoretically calculated. The influences on cycle performance of key parameters, such as the pressure and temperature at the inlet of the turbine, the back pressure of the turbine, the concentration of the working solution, the concentration of the basic solution and the cycle multiplication ratio, were analyzed.

  19. Fuzzy MCDM Based on Fuzzy Relational Degree Analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a new fuzzy multiple criteria (both qualitative and quantitative) decision-making (MCDM) method based on fuzzy relational degree analysis. The concepts of fuzzy set theory are used to construct a weighted suitability decision matrix to evaluate the weighted suitability of different alternatives versus various criteria. The positive ideal solution and negative ideal solution are then obtained by using a method of ranking fuzzy numbers, and the fuzzy relational degrees of different alternatives versus positive ideal solution and negative ideal solution are calculated by using the proposed arithmetic. Finally, the relative relational degrees of various alternatives versus positive ideal solution are ranked to determine the best alternative. A numerical example is provided to illustrate the proposed method at the end of this paper.
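
    The sketch below illustrates the ideal-solution ranking idea underlying this kind of method, in a simplified crisp form (the paper itself works with fuzzy numbers and a fuzzy relational degree); the decision matrix and criterion weights are invented for illustration.

    import numpy as np

    # Rows: alternatives, columns: criteria (all benefit-type here).
    decision = np.array([[7.0, 9.0, 6.0],
                         [8.0, 6.0, 7.0],
                         [9.0, 7.0, 5.0]])
    weights = np.array([0.5, 0.3, 0.2])

    # Weighted, normalised suitability matrix.
    norm = decision / np.linalg.norm(decision, axis=0)
    weighted = norm * weights

    # Positive and negative ideal solutions, one value per criterion.
    pis = weighted.max(axis=0)
    nis = weighted.min(axis=0)

    # Relational-degree proxy: closeness to the positive ideal relative to the negative one.
    d_pos = np.linalg.norm(weighted - pis, axis=1)
    d_neg = np.linalg.norm(weighted - nis, axis=1)
    closeness = d_neg / (d_pos + d_neg)

    print("ranking (best first):", np.argsort(-closeness))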

  20. Windows Volatile Memory Forensics Based on Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhang

    2014-03-01

    Full Text Available In this paper, we present an integrated memory forensic solution for multiple Windows memory images. By calculation, the method can find the degree of correlation among the processes in volatile memory images and the hidden clues behind computer events, which are usually difficult to obtain and easily overlooked when forensic investigators analyze a single memory image. In order to test its validity, we performed an experiment based on two hosts' memory images containing criminal incidents. According to the experimental results, we find that the event chains reconstructed by our method are similar to the actual actions in the criminal scene. Investigators can review the digital crime scenario contained in the data set by analyzing the experimental results. This paper is aimed at finding valid actions with illegal intent and making memory analysis less dependent on the operating system and relevant experts.

  1. Diversity analysis for Magnaporthe grisea by Pot2_based PCR

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    CO39, its near-isogenic lines carrying single blast resistance genes (C101LAC, C101A51, C104PKT, and C101PKT), and its resistance gene pyramid lines (BL121, BL241, and A57-119) were grown in the blast nursery at IRRI. The seeds were sown in four batches with a two-week interval between batches, using IR50 and IR72 as spreader rows and susceptible controls. To keep the leaves moist, water was sprayed every two hours from 8:30 am on sunny days. Blast disease was scored and the pathogen was isolated every two weeks. DNA samples of 310 isolates were used for diversity analysis by Pot2-based PCR (Pot2 is a dispersed retrotransposon of the fungus).

  2. Semantic analysis based forms information retrieval and classification

    Science.gov (United States)

    Saba, Tanzila; Alqahtani, Fatimah Ayidh

    2013-09-01

    Data entry forms are employed in all types of enterprises to collect hundreds of items of customer information on a daily basis. The information is filled in manually by the customers. Hence, it is laborious and time consuming to use a human operator to transfer this customer information into computers manually. Additionally, it is expensive, and human errors might cause serious flaws. The automatic interpretation of scanned forms has facilitated many real applications from the speed and accuracy point of view, such as keyword spotting, sorting of postal addresses, script matching and writer identification. This research deals with different strategies to extract customer information from these scanned forms, and with its interpretation and classification. Accordingly, extracted information is segmented into characters for classification and finally stored in the form of records in databases for further processing. This paper presents a detailed discussion of these semantic-based analysis strategies for forms processing. Finally, new directions are also recommended for future research.

  3. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, such as simplified query languages, out-of-date information and arbitrary sorting of results. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of a Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  4. Analysis of equivalent antenna based on FDTD method

    Institute of Scientific and Technical Information of China (English)

    Yun-xing YANG; Hui-chang ZHAO; Cui DI

    2014-01-01

    An equivalent microstrip antenna used in radio proximity fuse is presented. The design of this antenna is based on multilayer multi-permittivity dielectric substrate which is analyzed by finite difference time domain (FDTD) method. Equivalent iterative formula is modified in the condition of cylindrical coordinate system. The mixed substrate which contains two kinds of media (one of them is air) takes the place of original single substrate. The results of equivalent antenna simulation show that the resonant frequency of equivalent antenna is similar to that of the original antenna. The validity of analysis can be validated by means of antenna resonant frequency formula. Two antennas have same radiation pattern and similar gain. This method can be used to reduce the weight of antenna, which is significant to the design of missile-borne antenna.

  5. Similarity theory based method for MEMS dynamics analysis

    Institute of Scientific and Technical Information of China (English)

    LI Gui-xian; PENG Yun-feng; ZHANG Xin

    2008-01-01

    A new method for MEMS dynamics analysis is presented, based on the similarity theory. With this method, two systems' similarities can be captured in terms of physical quantities/governing equations among different energy fields, and then the unknown dynamic characteristics of one of the systems can be analyzed according to the similar ones of the other system. The possibility of establishing a pair of similar systems between MEMS and other energy systems is also discussed based on the equivalence between mechanics and electricity, and then the feasibility of applying this method is proven by an example, in which the squeeze-film damping force in MEMS and the current of its equivalent circuit established by this method are compared.

  6. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that the system runs in working order, the detection and diagnosis of faults play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, using the essential information of the nonlinear system extracted by KPCA, we construct a KPCA model of the nonlinear system under normal working conditions. New data are then projected onto the KPCA model. When new data are incompatible with the KPCA model, it can be concluded that the nonlinear system is out of its normal working condition. The proposed method was applied to fault diagnosis on rolling bearings. Simulation results show that the proposed method provides an effective approach for fault detection and diagnosis of nonlinear systems.
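
    A minimal sketch of the KPCA monitoring idea described above: a kernel PCA model is fitted on normal operating data and new samples are flagged when their reconstruction error exceeds an empirical limit. The data, kernel parameters and threshold are illustrative assumptions, not the paper's settings.

    import numpy as np
    from sklearn.decomposition import KernelPCA

    rng = np.random.default_rng(2)

    # "Normal" training data on a nonlinear manifold plus a faulty batch.
    theta = rng.uniform(0, 2 * np.pi, 300)
    normal = np.c_[np.cos(theta), np.sin(theta)] + 0.05 * rng.standard_normal((300, 2))
    faulty = normal[:50] + np.array([0.8, 0.0])          # shifted off the manifold

    kpca = KernelPCA(n_components=5, kernel="rbf", gamma=2.0,
                     fit_inverse_transform=True).fit(normal)

    def spe(x):
        """Squared prediction error (reconstruction error) per sample."""
        recon = kpca.inverse_transform(kpca.transform(x))
        return np.sum((x - recon) ** 2, axis=1)

    threshold = np.percentile(spe(normal), 99)            # simple empirical control limit
    print("fraction of faulty samples flagged:", np.mean(spe(faulty) > threshold))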

  7. GIS Based Spatial Data Analysis for Landslide Susceptibility Mapping

    Institute of Scientific and Technical Information of China (English)

    S.Sarkar; D.P.Kanungo; A.K.Patra; Pushpendra Kumar

    2008-01-01

    A landslide susceptibility map delineates the potential zones for landslide occurrence. The paper presents a statistical approach through spatial data analysis in GIS for landslide susceptibility mapping in parts of the Sikkim Himalaya. Six important causative factors for landslide occurrence were selected and the corresponding thematic data layers were prepared in GIS. Topographic maps, satellite imagery, field data and published maps constitute the input data for thematic layer preparation. Numerical weights for the different categories of these factors were determined based on a statistical approach, and the weighted thematic layers were integrated in the GIS environment to generate the landslide susceptibility map of the area. The landslide susceptibility map classifies the area into five different landslide susceptibility zones, i.e., very high, high, moderate, low and very low. This map was validated using the existing landslide distribution in the area.
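
    As an illustration of the weighted thematic-layer integration step, the sketch below assigns invented class weights to two synthetic factor layers, sums them into a susceptibility index and splits it into five zones by quantiles; the weights and rasters are placeholders, not the study's values.

    import numpy as np

    rng = np.random.default_rng(3)
    slope_class = rng.integers(0, 3, size=(100, 100))     # e.g. gentle/medium/steep
    lithology = rng.integers(0, 2, size=(100, 100))

    # Statistically derived weights per class (illustrative values only).
    slope_weights = {0: 0.1, 1: 0.5, 2: 0.9}
    litho_weights = {0: 0.2, 1: 0.7}

    def apply_weights(layer, table):
        """Replace each class code in the layer by its numerical weight."""
        return np.vectorize(table.get)(layer)

    susceptibility = apply_weights(slope_class, slope_weights) \
                   + apply_weights(lithology, litho_weights)

    # Classify into five zones (very low ... very high) by quantiles.
    zones = np.digitize(susceptibility,
                        np.quantile(susceptibility, [0.2, 0.4, 0.6, 0.8]))
    print(np.bincount(zones.ravel()))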

  8. Fingerprint image segmentation based on multi-features histogram analysis

    Science.gov (United States)

    Wang, Peng; Zhang, Youguang

    2007-11-01

    An effective fingerprint image segmentation method based on multi-feature histogram analysis is presented. We extract a new feature, together with three other features, to segment fingerprints. Two of these four features, each of which is related to one of the other two, are reciprocals of each other, so the features are divided into two groups. The histograms of these two features are calculated to determine which feature group is used to segment the target fingerprint. The features can also divide fingerprints into two classes of high and low quality. Experimental results show that our algorithm can classify foreground and background effectively with lower computational cost, and it can also reduce the number of pseudo-minutiae detected and improve the performance of AFIS.

  9. Analysis of equivalent antenna based on FDTD method

    Directory of Open Access Journals (Sweden)

    Yun-xing Yang

    2014-09-01

    Full Text Available An equivalent microstrip antenna used in radio proximity fuse is presented. The design of this antenna is based on multilayer multi-permittivity dielectric substrate which is analyzed by finite difference time domain (FDTD) method. Equivalent iterative formula is modified in the condition of cylindrical coordinate system. The mixed substrate which contains two kinds of media (one of them is air) takes the place of original single substrate. The results of equivalent antenna simulation show that the resonant frequency of equivalent antenna is similar to that of the original antenna. The validity of analysis can be validated by means of antenna resonant frequency formula. Two antennas have same radiation pattern and similar gain. This method can be used to reduce the weight of antenna, which is significant to the design of missile-borne antenna.

  10. Iris recognition based on robust principal component analysis

    Science.gov (United States)

    Karn, Pradeep; He, Xiao Hai; Yang, Shuai; Wu, Xiao Hong

    2014-11-01

    Iris images acquired under different conditions often suffer from blur, occlusion due to eyelids and eyelashes, specular reflection, and other artifacts. Existing iris recognition systems do not perform well on these types of images. To overcome these problems, we propose an iris recognition method based on robust principal component analysis. The proposed method decomposes all training images into a low-rank matrix and a sparse error matrix, where the low-rank matrix is used for feature extraction. The sparsity concentration index approach is then applied to validate the recognition result. Experimental results using the CASIA V4 and IIT Delhi V1 iris image databases showed that the proposed method achieved competitive performance in both recognition accuracy and computational efficiency.
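
    The sketch below shows the low-rank plus sparse decomposition at the core of robust principal component analysis, implemented with a basic alternating soft-thresholding scheme (principal component pursuit); the synthetic data and iteration count are illustrative, and this is not the authors' implementation.

    import numpy as np

    def shrink(x, tau):
        """Soft-thresholding operator."""
        return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

    def svt(x, tau):
        """Singular value thresholding."""
        u, s, vt = np.linalg.svd(x, full_matrices=False)
        return u @ np.diag(shrink(s, tau)) @ vt

    def robust_pca(d, n_iter=200):
        """Split d into a low-rank part and a sparse error part."""
        lam = 1.0 / np.sqrt(max(d.shape))
        mu = d.size / (4.0 * np.abs(d).sum())
        low, sparse, dual = np.zeros_like(d), np.zeros_like(d), np.zeros_like(d)
        for _ in range(n_iter):
            low = svt(d - sparse + dual / mu, 1.0 / mu)
            sparse = shrink(d - low + dual / mu, lam / mu)
            dual += mu * (d - low - sparse)
        return low, sparse

    rng = np.random.default_rng(4)
    # Synthetic training matrix: rank-2 structure plus sparse occlusions.
    base = rng.standard_normal((60, 2)) @ rng.standard_normal((2, 40))
    corrupt = base.copy()
    mask = rng.random(base.shape) < 0.05
    corrupt[mask] += 10 * rng.standard_normal(mask.sum())

    low, sparse = robust_pca(corrupt)
    print("recovery error:", np.linalg.norm(low - base) / np.linalg.norm(base))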

  11. Architecture Analysis of an FPGA-Based Hopfield Neural Network

    Directory of Open Access Journals (Sweden)

    Miguel Angelo de Abreu de Sousa

    2014-01-01

    Full Text Available Interconnections between electronic circuits and neural computation have been a strongly researched topic in the machine learning field in order to approach several practical requirements, including decreasing training and operation times in high performance applications and reducing cost, size, and energy consumption for autonomous or embedded developments. Field programmable gate array (FPGA) hardware shows some inherent features typically associated with neural networks, such as parallel processing, modular execution, and dynamic adaptation, and works on different types of FPGA-based neural networks have been presented in recent years. This paper aims to address different aspects of the architectural characteristics of a Hopfield Neural Network implemented in FPGA, such as maximum operating frequency and chip-area occupancy according to the network capacity. The FPGA implementation methodology, which does not employ multipliers in the architecture developed for the Hopfield neural model, is also presented in detail.

  12. Analysis of quantitative pore features based on mathematical morphology

    Institute of Scientific and Technical Information of China (English)

    QI Heng-nian; CHEN Feng-nong; WANG Hang-jun

    2008-01-01

    Wood identification is a basic technique of wood science and industry. Pore features are among the most important identification features for hardwoods. We have used a method based on the analysis of quantitative pore features, which differs from traditional qualitative methods. We apply mathematical morphology methods such as dilation and erosion, opening and closing transformations of wood cross-sections, image repair, noise filtering and edge detection to segment the pores from their background. The mean square errors (MSE) of the pores are then computed to describe the distribution of pores. Our experiment shows that it is easy to classify the pore features into three basic types, just as in traditional qualitative methods, but using the MSE of the pores. This quantitative method improves wood identification considerably.

  13. Identification and annotation of erotic film based on content analysis

    Science.gov (United States)

    Wang, Donghui; Zhu, Miaoliang; Yuan, Xin; Qian, Hui

    2005-02-01

    The paper brings forward a new method for identifying and annotating erotic films based on content analysis. First, the film is decomposed into video and audio streams. Then, the video stream is segmented into shots and key frames are extracted from each shot. We filter the shots that include potentially erotic content by finding the nude human body in key frames. A Gaussian model in YCbCr color space for detecting skin regions is presented. An external polygon covering the skin regions is used as an approximation of the human body. Finally, we estimate the degree of nudity by calculating the ratio of skin area to whole body area with weighted parameters. The result of the experiment shows the effectiveness of our method.
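
    A minimal sketch of the skin-detection step described above: skin colour is modelled by a single Gaussian in the CbCr plane and pixels are kept when their Mahalanobis distance to it is small, after which a skin ratio is reported. The mean, covariance and threshold are invented assumptions, not the paper's fitted values.

    import numpy as np

    def rgb_to_cbcr(img):
        """ITU-R BT.601 conversion of an RGB uint8 image to the Cb/Cr channels."""
        r = img[..., 0].astype(float)
        g = img[..., 1].astype(float)
        b = img[..., 2].astype(float)
        cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
        cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
        return np.stack([cb, cr], axis=-1)

    # Assumed skin-colour Gaussian in CbCr space (illustrative parameters).
    MEAN = np.array([110.0, 150.0])
    COV = np.array([[80.0, 20.0], [20.0, 120.0]])
    COV_INV = np.linalg.inv(COV)

    def skin_mask(img, max_mahalanobis=2.5):
        """True where the Mahalanobis distance to the skin Gaussian is small."""
        diff = rgb_to_cbcr(img) - MEAN
        d2 = np.einsum("...i,ij,...j->...", diff, COV_INV, diff)
        return d2 < max_mahalanobis ** 2

    rng = np.random.default_rng(5)
    frame = rng.integers(0, 256, size=(120, 160, 3), dtype=np.uint8)   # placeholder key frame
    mask = skin_mask(frame)
    print("skin ratio:", mask.mean())                     # proxy for the degree of nudity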

  14. Choosing a Commercial Broiler Strain Based on Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Hosseini SA

    2014-05-01

    Full Text Available With the complexity and amount of information in a wide variety of comparative performance reports in poultry production, making a decision is difficult. This problem is overcome only when all data can be put into a common unit. For this purpose, five different decision-making analysis approaches, including Maximin, Equally likely, Weighted average, Ordered weighted averages and the Technique for order preference by similarity to ideal solution, were used to choose the best broiler strain among three based on their comparative performance and carcass characteristics. A total of 6000 commercial broilers of three strains designated R, A, and C (2000 per strain) were randomly allocated into three treatments of five replicates. In this study, all methods showed similar results except the Maximin approach. Comparing the different methods indicated that strain C, with the highest world market share, has the best performance, followed by strains R and A.

  15. CUSUM control charts based on likelihood ratio for preliminary analysis

    Institute of Scientific and Technical Information of China (English)

    Yi DAI; Zhao-jun WANG; Chang-liang ZOU

    2007-01-01

    To detect and estimate a shift in either the mean, the deviation, or both in preliminary analysis, the statistical process control (SPC) tool, the control chart based on the likelihood ratio test (LRT), is the most popular method. Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ²(2) when the sample sizes n, n1 and n2 are very large, with n1 = 2, 3, ..., n - 2 and n2 = n - n1, so it is inevitable that n1 or n2 is not large. In this paper the limit distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for evaluating the expectation and the variance of the limit distribution are also obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains much information, and the cumulative sum (CUSUM) control chart can exploit this. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for the preliminary analysis of individual observations. One focuses on detecting shifts in location in the historical data, and the other is more general in detecting a shift in either the location, the scale, or both. Moreover, simulation results show that the two proposed control charts are, respectively, superior to their competitors not only in the detection of sustained shifts but also in the detection of some other out-of-control situations considered in this paper.
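
    For illustration, the sketch below evaluates a split-point likelihood ratio statistic of the kind discussed above: each candidate split of a historical sample compares a single normal model against two segments with their own mean and variance. The simulated data and the location of the shift are assumptions made for the example.

    import numpy as np

    def lrt(x, n1):
        """2*log likelihood ratio for a mean/variance change after observation n1."""
        n = len(x)
        v = lambda seg: np.var(seg)                       # MLE variance of a segment
        full, left, right = v(x), v(x[:n1]), v(x[n1:])
        return n * np.log(full) - n1 * np.log(left) - (n - n1) * np.log(right)

    rng = np.random.default_rng(6)
    x = np.r_[rng.normal(0, 1, 40), rng.normal(1.5, 2, 40)]   # shift after observation 40

    stats = [lrt(x, n1) for n1 in range(3, len(x) - 2)]
    best = int(np.argmax(stats)) + 3
    print("max LRT %.1f at split %d" % (max(stats), best))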

  17. Entropy-based model for miRNA isoform analysis.

    Directory of Open Access Journals (Sweden)

    Shengqin Wang

    Full Text Available MiRNAs have been widely studied due to their important post-transcriptional regulatory roles in gene expression. Many reports have demonstrated the evidence of miRNA isoform products (isomiRs) in high-throughput small RNA sequencing data. However, the biological functions of these molecules are still not well investigated. Here, we developed a Shannon entropy-based model to estimate isomiR expression profiles of high-throughput small RNA sequencing data from the miRBase web server. By using the Kolmogorov-Smirnov statistical test (KS test), we demonstrated that the 5p and 3p miRNAs present more variants than the single-arm miRNAs. We also found that the isomiR variation, except the 3' isomiR variation, is strongly correlated with the Minimum Free Energy (MFE) of the pre-miRNA, suggesting that the intrinsic features of the pre-miRNA are among the important factors in miRNA regulation. The functional enrichment analysis showed that the miRNAs with high variation, particularly 5' end variation, are enriched in a set of critical functions, supporting the idea that these molecules are not randomly produced. Our results provide a probabilistic framework for miRNA isoform analysis, and give functional insights into pre-miRNA processing.
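
    A minimal sketch of the entropy measure described above: the read counts of a miRNA's isoforms are treated as a probability distribution and their Shannon entropy is used as a variability score. The counts below are invented for illustration.

    import numpy as np

    def isomir_entropy(read_counts):
        """Shannon entropy (bits) of an isomiR read-count profile."""
        counts = np.asarray(read_counts, dtype=float)
        p = counts / counts.sum()
        p = p[p > 0]                                      # 0*log(0) treated as 0
        return float(-(p * np.log2(p)).sum())

    # One miRNA dominated by its canonical form vs. one with many 5'/3' variants.
    print(isomir_entropy([980, 10, 5, 5]))                # low entropy, low variation
    print(isomir_entropy([300, 250, 200, 150, 100]))      # high entropy, high variation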

  18. Atomic force microscopy-based shape analysis of heart mitochondria.

    Science.gov (United States)

    Lee, Gi-Ja; Park, Hun-Kuk

    2015-01-01

    Atomic force microscopy (AFM) has become an important medical and biological tool for the noninvasive imaging of cells and biomaterials in medical, biological, and biophysical research. The major advantages of AFM over conventional optical and electron microscopes for bio-imaging include the facts that no special coating is required and that imaging can be done in all environments: air, vacuum, or aqueous conditions. In addition, it can also precisely determine pico-nano Newton force interactions between the probe tip and the sample surface from force-distance curve measurements. It is widely known that mitochondrial swelling is one of the most important indicators of the opening of the mitochondrial permeability transition (MPT) pore. As mitochondrial swelling is an ultrastructural change, quantitative analysis of this change requires high-resolution microscopic methods such as AFM. Here, we describe the use of AFM-based shape analysis for the characterization of nanostructural changes in heart mitochondria resulting from myocardial ischemia-reperfusion injury. PMID:25634291

  19. GIS-BASED SURFACE ANALYSIS OF ARCHAEOLOGICAL FINDS

    Directory of Open Access Journals (Sweden)

    K. Kovács

    2012-09-01

    Full Text Available The international research project HiMAT (History of Mining Activities in the Tyrol and adjacent areas) is dedicated to the study of mining history in the Eastern Alps by various scientific disciplines. The aim of this program is the analysis of the mining activities’ impacts on environment and human societies. Unfortunately, there is only a limited number of specific regions (e.g. Mitterberg) to offer possibilities to investigate the former mining expansions. Within this multidisciplinary project, the archaeological sites and finds are analyzed by the Surveying and Geoinformation Unit at the University of Innsbruck. This paper shows data fusion of different surveying and post-processing methods to achieve a photo-realistic digital 3D model of one of these most important finds, the Bronze Age sluice box from the Mitterberg. The applied workflow consists of four steps: 1. Point cloud processing, 2. Meshing of the point clouds and editing of the models, 3. Image orientation, bundle and image adjustment, 4. Model texturing. In addition, a short range laser scanning survey was organized before the conservation process of this wooden find. More accurate research opportunities were offered after this detailed documentation of the sluice box, for example the reconstruction of the broken parts and the surface analysis of this archaeological object were implemented using these high-resolution datasets. In conclusion, various unperceived patterns of the wooden boards were visualized by the GIS-based tool marks investigation.

  1. Aroma characterization based on aromatic series analysis in table grapes

    Science.gov (United States)

    Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping

    2016-01-01

    Aroma is an important part of quality in table grapes, but the key aroma compounds and the aroma series of table grapes remain unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most obviously, the ‘Kyoho’ grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. For the aroma series, table grapes were characterized mainly by herbaceous, floral, balsamic, sweet and fruity series. Simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we could conclude that fatty and balsamic series were the preferred aromatic series, and the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes. PMID:27487935
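
    As an illustration of the HCA/PCA step mentioned above, the sketch below standardises a cultivar-by-compound matrix, projects it with PCA and cuts a Ward hierarchical clustering into five groups; the matrix is random placeholder data rather than the measured aroma profiles.

    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    profiles = rng.lognormal(size=(20, 67))               # 20 cultivars x 67 compounds

    X = StandardScaler().fit_transform(profiles)
    scores = PCA(n_components=2).fit_transform(X)         # 2-D aroma "fingerprint"

    # Hierarchical clustering (Ward linkage) cut into five groups.
    groups = fcluster(linkage(X, method="ward"), t=5, criterion="maxclust")
    print(scores[:3])
    print("group sizes:", np.bincount(groups)[1:])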

  2. Web-based analysis of the mouse transcriptome using Genevestigator

    Directory of Open Access Journals (Sweden)

    Gruissem Wilhelm

    2006-06-01

    Full Text Available Background: Gene function analysis often requires a complex and laborious sequence of laboratory and computer-based experiments. Choosing an effective experimental design generally results from hypotheses derived from prior knowledge or experimentation. Knowledge obtained from meta-analyzing compendia of expression data with annotation libraries can provide significant clues in understanding gene and network function, resulting in better hypotheses that can be tested in the laboratory. Description: Genevestigator is a microarray database and analysis system allowing context-driven queries. Simple but powerful tools allow biologists with little computational background to retrieve information about when, where and how genes are expressed. We manually curated and quality-controlled 3110 mouse Affymetrix arrays from public repositories. Data queries can be run against an annotation library comprising 160 anatomy categories, 12 developmental stage groups, 80 stimuli, and 182 genetic backgrounds or modifications. The quality of results obtained through Genevestigator is illustrated by a number of biological scenarios that are substantiated by other types of experimentation in the literature. Conclusion: The Genevestigator-Mouse database effectively provides biologically meaningful results and can be accessed at https://www.genevestigator.ethz.ch.

  3. [Environmental impacts of sewage treatment system based on emergy analysis].

    Science.gov (United States)

    Li, Min; Zhang, Xiao-Hong; Li, Yuan-Wei; Zhang, Hong; Zhao, Min; Deng, Shi-Huai

    2013-02-01

    "Integrated sewage treatment system" (ISTS) consists of sewage treatment plant system and their products (treated water and dewatered sludge) disposal facilities, which gives a holistic view of the whole sewage treatment process. During its construction and operation, ISTS has two main impacts on the environment, i.e., the consumption of resources and the damage of discharged pollutants on the environment, while the latter was usually ignored by the previous researchers when they assessed the impacts of wastewater treatment system. In order to more comprehensively understanding the impacts of sewage treatment on the environment, an analysis was made on the ISTS based on the theories of emergy analysis, and, in combining with ecological footprint theory, the sustainability of the ISTS was also analyzed. The results showed that the emergy of the impacts of water pollutants on the environment was far larger than that of the impacts of air pollutants, and NH3-N was the main responsible cause. The emergy consumption of ISTS mainly came from the emergy of wastewater and of local renewable resources. The "sewage treatment plant system + landfill system" had the highest emergy utilization efficiency, while the "sewage treatment plant system + reclaimed water reuse system + incineration system" had the lowest one. From the aspect of environmental sustainability, the "sewage treatment plant system + reclaimed water reuse system + landfill system" was the best ISTS, while the "sewage treatment plant system + incineration system" was the worst one.

  4. Web-based analysis of nasal sound spectra.

    Science.gov (United States)

    Seren, Erdal

    2005-10-01

    The spectral analysis of the nasal sound is an indicator of the nasal airflow pattern. We investigated a new technique for nasal sound analysis via the Internet. This study includes 27 patients and 22 healthy people. The patients were treated by septoplasty for septal deviation. On the 10th day after the operation, this technique was applied to follow the course of nasal airflow. The patients recorded the nasal sound with a microphone into the computer as a .wav file and sent it to us via the Internet; we evaluated all of those records and sent the results back to the patients. The 11 patients who had nasal obstruction symptoms (group A) were called to the hospital for examination. In the nasal sound analyses of those patients, the sound intensity at high frequencies (2-4 kHz, 4-6 kHz) was above 30 dB, whereas at low (500-1000 Hz) and medium (1-2 kHz) frequencies it was below 10 dB. In the patients without nasal obstruction symptoms (group B), the sound intensity at high frequencies was below 10 dB, while at low and medium frequencies it was above 20 dB. There was a statistically significant difference in sound intensity between group A and group B. In the endoscopic examination of the patients with obstruction, crusting in the nasal cavity, which narrows the nasal airway, was found. Web-based nasal sound analysis is an important method for following the postoperative course and evaluating nasal airflow. The new method will save time and money by avoiding unnecessary return visits to the hospital.
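
    For illustration, the sketch below computes the kind of band-level comparison described above: the average power of a nasal sound recording in low, medium and high frequency bands, expressed in dB relative to the total power. The signal is synthesised noise; in practice it would come from the patient's .wav file, and the band limits follow the record above.

    import numpy as np

    fs = 16000                                            # assumed sampling rate (Hz)
    t = np.arange(0, 2.0, 1.0 / fs)
    rng = np.random.default_rng(8)
    signal = rng.standard_normal(t.size)                  # placeholder nasal sound

    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, 1.0 / fs)

    def band_level_db(lo, hi):
        """Mean power in [lo, hi) Hz, expressed in dB relative to the mean power."""
        band = spectrum[(freqs >= lo) & (freqs < hi)].mean()
        return 10 * np.log10(band / spectrum.mean())

    for lo, hi in [(500, 1000), (1000, 2000), (2000, 4000), (4000, 6000)]:
        print(f"{lo}-{hi} Hz: {band_level_db(lo, hi):+.1f} dB")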

  5. Patent portfolio analysis model based on legal status information

    Institute of Scientific and Technical Information of China (English)

    Xuezhao; WANG; Yajuan; ZHAO; Jing; ZHANG; Ping; ZHAO

    2014-01-01

    Purpose: This research proposes a patent portfolio analysis model based on legal status information to chart out a competitive landscape in a particular field, enabling organizations to position themselves within the overall technology landscape. Design/methodology/approach: Three indicators were selected for the proposed model: patent grant rate, valid patent rate and patent maintenance period. The model uses legal status information to perform a qualitative evaluation of the relative values of individual patents, countries or regions’ technological capabilities and the competitiveness of patent applicants. The results are visualized by a four-quadrant bubble chart. To test the effectiveness of the model, it is used to present a competitive landscape in the lithium ion battery field. Findings: The model can be used to evaluate the values of individual patents, highlight countries or regions’ positions in the field, and rank the competitiveness of patent applicants in the field. Research limitations: The model currently takes into consideration only three legal status indicators. It is feasible to introduce more indicators, such as the reason for invalid patents and the distribution of patent maintenance time, and associate them with those in the proposed model. Practical implications: Analysis of legal status information in combination with patent application information can help an organization to spot gaps in its patent claim coverage, as well as evaluate the patent quality and maintenance situation of its granted patents. The study results can be used to support technology assessment, technology innovation and intellectual property management. Originality/value: Prior studies attempted to assess patent quality or competitiveness by using either a single patent legal status indicator or comparative analysis of the impacts of each indicator. However, they are insufficient in presenting the combined effects of the evaluation indicators. Using our model, it appears possible to get a

  6. Performance based analysis of hidden beams in reinforced concrete structures

    Directory of Open Access Journals (Sweden)

    Helou Samir H.

    2014-01-01

    Full Text Available Local and perhaps regional vernacular reinforced concrete building construction leans heavily towards designing slabs with embedded hidden beams for flooring systems in most structures, including major edifices. The practice is prevalent in both framed and shear wall structures. Hidden beams are favoured structural elements due to their many inherent features: they save on floor height clearance; they also save on formwork, labour and material cost. Moreover, hidden beams form an acceptable aesthetic appearance that does not hinder efficient interior space partitioning. Such beams have the added advantage of clearing the way for horizontal electromechanical ductwork. However, seismic considerations, in all likelihood, are seldom seriously addressed. The mentioned structural system of shallow beams is adopted in ribbed slabs, waffle slabs and at times solid slabs. Ribbed slabs and waffle slabs are more prone to hidden beam inclusion due to the added effective height of the concrete section. Due to the presence of a relatively high reinforcement ratio at the joints, the sections at such locations tend to become less ductile, with an unreliable contribution to spandrel force resistance. In the following study the structural influence of hidden beams within slabs is investigated, with the primary focus on a performance-based analysis of such elements within a structure. This is investigated with due attention to the shear wall contribution to the overall behaviour of such structures. Numerical results indicate that the function of hidden beams is not as adequate as desired. Therefore it is strongly believed that they are generally superfluous and may be eliminated altogether. Conversely, shallow beams seem to render the overall seismic capacity of the structure unreliable. Since such an argument is rarely manifested within the linear analysis domain, a pushover analysis exercise is thus mandatory for behaviour

  7. Skull base chordomas: analysis of dose-response characteristics

    International Nuclear Information System (INIS)

    Objective: To extract dose-response characteristics from dose-volume histograms and corresponding actuarial survival statistics for 115 patients with skull base chordomas. Materials and Methods: We analyzed data for 115 patients with skull base chordoma treated with combined photon and proton conformal radiotherapy to doses in the range 66.6-79.2 Gy. The data set for each patient included gender, histology, age, tumor volume, prescribed dose, overall treatment time, time to recurrence or time to last observation, target dose-volume histogram, and several dosimetric parameters (minimum/mean/median/maximum target dose, percent of the target volume receiving the prescribed dose, dose to 90% of the target volume, and the Equivalent Uniform Dose (EUD)). Data were analyzed using the Kaplan-Meier survivor function estimate, the proportional hazards (Cox) model, and parametric modeling of the actuarial probability of recurrence. Parameters of dose-response characteristics were obtained using the maximum likelihood method. Results: Local failure developed in 42 (36%) of patients, with an actuarial local control rate at 5 years of 59.2%. The proportional hazards model revealed a significant dependence of the probability of recurrence on gender, with female patients having a significantly poorer prognosis (hazard ratio of 2.3 with a p value of 0.008). The Wilcoxon and the log-rank tests of the corresponding Kaplan-Meier recurrence-free survival curves confirmed the statistical significance of this effect. The Cox model with stratification by gender showed significance of tumor volume (p=0.01), the minimum target dose (p=0.02), and the EUD (p=0.02). Other parameters were not significant at the α level of significance of 0.05, including the prescribed dose (p=0.21). Parametric analysis using a combined model of tumor control probability (to account for non-uniformity of target dose distribution) and the Weibull failure time model (to account for censoring) allowed us to estimate

  8. A Framework for Geographic Object-Based Image Analysis (GEOBIA) based on geographic ontology

    Science.gov (United States)

    Gu, H. Y.; Li, H. T.; Yan, L.; Lu, X. J.

    2015-06-01

    GEOBIA (Geographic Object-Based Image Analysis) is not only a hot topic in current remote sensing and geographical research, but is also believed to be a paradigm in remote sensing and GIScience. The lack of a systematic approach designed to conceptualize and formalize class definitions makes GEOBIA a highly subjective and difficult method to reproduce. This paper aims to put forward a framework for GEOBIA based on geographic ontology theory, which can faithfully implement the "Geographic entities - Image objects - Geographic objects" correspondence. It consists of three steps: first, geographical entities are described by a geographic ontology; second, a semantic network model is built based on OWL (Ontology Web Language); finally, geographical objects are classified with decision rules or other classifiers. A case study of a farmland ontology was conducted to illustrate the framework. The strength of this framework is that it provides interpretation strategies and a global framework for GEOBIA that is objective, comprehensive and universal, which avoids inconsistencies caused by different experts' experience and provides an objective model for image analysis.

  9. Using Metadata Analysis and Base Analysis Techniques in Data Qualities Framework for Data Warehouses

    Directory of Open Access Journals (Sweden)

    Azwa A. Aziz

    2011-01-01

    Full Text Available Information provided by the application systems in an organization is vital for decision making. For this reason, the quality of the data provided by a Data Warehouse (DW) is really important for an organization to produce the best solutions for the company to move forward. DW systems are complex systems that have to deliver highly aggregated, high quality data from heterogeneous sources to decision makers. They involve a great deal of integration of source systems to support business operations. Problem statement: Many DW projects fail because of Data Quality (DQ) problems. DQ issues have become a major concern over the past decade. Approach: This study proposes a framework for implementing DQ in DW system architecture using the Metadata Analysis Technique and the Base Analysis Technique. These techniques perform comparisons between target values and current values gained from the systems. A prototype using PHP was developed to support the Base Analysis Technique. A sample schema from an Oracle database was then used to study the differences between applying and not applying the framework. The prototype was demonstrated to the selected organizations to identify whether it would help to reduce DQ problems. Questionnaires were given to respondents. Results: The results show that users are interested in applying DQ processes in their organizations. Conclusion/Recommendation: The framework suggested here needs to be implemented in a real situation to obtain more accurate results.

  10. A LCA Based Biofuel Supply Chain Analysis Framework

    Institute of Scientific and Technical Information of China (English)

    刘喆轩; 邱彤; 陈丙珍

    2014-01-01

    This paper presents a life cycle assessment (LCA) based biofuel supply chain (SC) analysis framework which enables the study of economic, energy and environmental (3E) performances by using multi-objective optimization. The economic objective is measured by the total annual profit, the energy objective is measured by the average fossil energy (FE) input per MJ of biofuel, and the environmental objective is measured by the greenhouse gas (GHG) emissions per MJ of biofuel. A multi-objective linear fractional programming (MOLFP) model with multiple conversion pathways is formulated based on the framework and is solved by using the ε-constraint method. The MOLFP problem is turned into a mixed integer linear programming (MILP) problem by setting the total annual profit as the optimization objective and the average FE input per MJ of biofuel and the GHG emissions per MJ of biofuel as constraints. In the case study, this model is used to design an experimental biofuel supply chain in China. A set of weakly Pareto optimal solutions is obtained. Each non-inferior solution indicates the optimal locations and amount of biomass produced, the locations and capacities of conversion factories, the locations and amount of biofuel supplied to final markets, and the flow of mass through the supply chain network (SCN). As the model reveals trade-offs among the 3E criteria, we think the framework can be a good decision-support tool for the design of biofuel SCs.

  11. Perceptual security of encrypted images based on wavelet scaling analysis

    Science.gov (United States)

    Vargas-Olmos, C.; Murguía, J. S.; Ramírez-Torres, M. T.; Mejía Carlos, M.; Rosu, H. C.; González-Aguilar, H.

    2016-08-01

    The scaling behavior of the pixel fluctuations of encrypted images is evaluated by using the detrended fluctuation analysis based on wavelets, a modern technique that has been successfully used recently for a wide range of natural phenomena and technological processes. As encryption algorithms, we use the Advanced Encryption Standard (AES) in RBT mode and two versions of a cryptosystem based on cellular automata, with the encryption process applied both fully and partially by selecting different bitplanes. In all cases, the results show that the encrypted images in which no understandable information can be visually appreciated and whose pixels look totally random present a persistent scaling behavior with the scaling exponent α close to 0.5, implying no correlation between pixels when the DFA with wavelets is applied. This suggests that the scaling exponents of the encrypted images can be used as a perceptual security criterion in the sense that when their values are close to 0.5 (the white noise value) the encrypted images are more secure also from the perceptual point of view.
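
    The sketch below estimates the scaling exponent α for a pixel sequence using plain detrended fluctuation analysis rather than the wavelet-based variant used in the paper; the "encrypted image" is simulated as white noise, for which α should come out near 0.5. All parameters are illustrative assumptions.

    import numpy as np

    def dfa_alpha(x, scales=(8, 16, 32, 64, 128)):
        """Scaling exponent from the log-log slope of fluctuation vs. scale."""
        profile = np.cumsum(x - np.mean(x))
        flucts = []
        for s in scales:
            n_win = len(profile) // s
            f2 = []
            for i in range(n_win):
                seg = profile[i * s:(i + 1) * s]
                # Remove the local linear trend in each window.
                trend = np.polyval(np.polyfit(np.arange(s), seg, 1), np.arange(s))
                f2.append(np.mean((seg - trend) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

    rng = np.random.default_rng(9)
    pixels = rng.integers(0, 256, size=64 * 64).astype(float)   # "encrypted" pixel sequence
    print("alpha =", round(dfa_alpha(pixels), 2))          # expected to be near 0.5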

  12. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibit anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures is not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
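
    The core ingredient of Pareto depth analysis is the assignment of a depth to each multicriteria dissimilarity vector by peeling successive non-dominated fronts. The sketch below shows only that front-peeling step under the assumption that smaller criterion values are better; the full PDA algorithm of the paper, which forms dyads from training data and scores test points from the depths of their dyads, is not reproduced here.

```python
# Assign each point its Pareto front index (depth) by repeatedly removing the
# current non-dominated front from the remaining points.
import numpy as np

def pareto_depths(points: np.ndarray) -> np.ndarray:
    """Return the Pareto front index of each row; smaller criteria are better."""
    n = len(points)
    depths = np.zeros(n, dtype=int)
    remaining = np.arange(n)
    depth = 1
    while remaining.size:
        pts = points[remaining]
        # a point is dominated if some other remaining point is <= in all criteria
        # and strictly < in at least one
        dominated = np.array([
            any(np.all(q <= p) and np.any(q < p) for q in pts) for p in pts
        ])
        front = remaining[~dominated]
        depths[front] = depth
        remaining = remaining[dominated]
        depth += 1
    return depths

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    dissims = rng.random((200, 2))        # two dissimilarity criteria per pair
    d = pareto_depths(dissims)
    print("first front size:", np.sum(d == 1), "max depth:", d.max())
```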

  13. IMU-Based Joint Angle Measurement for Gait Analysis

    Directory of Open Access Journals (Sweden)

    Thomas Seel

    2014-04-01

    Full Text Available This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.
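
    In the spirit of the gyroscope-only flexion/extension measurement mentioned above, the sketch below projects each segment's angular rate onto the joint axis and integrates the difference. The axis vectors, sampling rate, and synthetic motion are assumptions, and the accelerometer-based drift compensation used in practice is omitted.

```python
# Gyroscope-only flexion/extension angle: integrate the relative angular rate
# about the joint axis between the two body segments.
import numpy as np

def flexion_extension_angle(gyr1, gyr2, j1, j2, fs):
    """gyr1, gyr2: (N,3) angular rates [rad/s] of the two segments in their sensor frames.
    j1, j2: joint axis expressed in each sensor frame (unit vectors).
    fs: sampling frequency [Hz]. Returns the joint angle over time [rad]."""
    rate = gyr1 @ j1 - gyr2 @ j2          # relative angular rate about the joint axis
    return np.cumsum(rate) / fs           # simple rectangular integration (drifts over time)

if __name__ == "__main__":
    fs = 100.0
    t = np.arange(0, 5, 1 / fs)
    true_angle = 0.6 * np.sin(2 * np.pi * 1.0 * t)        # synthetic knee motion
    gyr1 = np.column_stack([np.gradient(true_angle, 1 / fs),
                            np.zeros_like(t), np.zeros_like(t)])
    gyr2 = np.zeros_like(gyr1)                             # thigh segment held still
    angle = flexion_extension_angle(gyr1, gyr2,
                                    np.array([1.0, 0, 0]), np.array([1.0, 0, 0]), fs)
    print(f"RMS error vs. truth: {np.sqrt(np.mean((angle - true_angle) ** 2)):.3f} rad")
```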

  14. Active contour model based on force field analysis

    Institute of Scientific and Technical Information of China (English)

    HOU Zhi-qiang; HAN Chong-zhao

    2006-01-01

    The initial contour of the traditional snake should be close to the true boundary of the object of interest in an image; otherwise, an incorrect result is obtained. In addition, active contours have difficulty progressing into boundary concavities. Moreover, the traditional snake, as well as almost all of its improved variants, can easily become trapped in a local minimum because snake models are nonconvex. An active contour model based on force field analysis (FFA), namely the FFA snake model, is presented in this paper. Based on an analysis of the force distribution rules of the distance potential force field, a criterion is introduced to distinguish false contour points from true ones. The result is not taken as the final solution as soon as the snake energy is minimal; instead, estimation and evaluation are performed according to the established criterion, and only then is the result accepted as final. Thus, the snake is prevented from getting stuck in a local minimum. Simulation results show that the FFA snake model has a large capture range, can move a snake into boundary concavities, and is able to obtain the contour of the object of interest precisely. Compared with the gradient vector flow snake, the new model also has a lower computational cost.
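
    For readers unfamiliar with the underlying contour evolution, the sketch below shows one classic implicit update step of a parametric snake (Kass-style internal energy plus an external force). It illustrates the energy-minimizing evolution that the FFA model builds on; it does not implement the paper's force-field criterion, and the toy external force and parameters are assumptions.

```python
# One implicit Euler step of the discrete snake: invert the pentadiagonal
# internal-energy matrix and pull the contour toward the external force.
import numpy as np

def snake_step(xy, fx, fy, alpha=0.1, beta=0.1, gamma=1.0):
    """xy: (N,2) contour points; fx, fy: external force sampled at the points."""
    n = len(xy)
    a = beta
    b = -alpha - 4 * beta
    c = 2 * alpha + 6 * beta
    A = np.zeros((n, n))                      # internal energy (elasticity + rigidity)
    for i in range(n):
        A[i, i] = c
        A[i, (i - 1) % n] = A[i, (i + 1) % n] = b
        A[i, (i - 2) % n] = A[i, (i + 2) % n] = a
    inv = np.linalg.inv(A + gamma * np.eye(n))
    x_new = inv @ (gamma * xy[:, 0] + fx)
    y_new = inv @ (gamma * xy[:, 1] + fy)
    return np.column_stack([x_new, y_new])

if __name__ == "__main__":
    theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
    contour = np.column_stack([50 + 30 * np.cos(theta), 50 + 30 * np.sin(theta)])
    for _ in range(100):
        d = contour - 50                       # toy force toward a circle of radius 20
        r = np.linalg.norm(d, axis=1, keepdims=True)
        force = (20 - r) * d / r
        contour = snake_step(contour, force[:, 0], force[:, 1])
    print("mean radius after evolution:", np.linalg.norm(contour - 50, axis=1).mean())
```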

  15. Wavelet-based multifractal analysis of laser biopsy imagery

    CERN Document Server

    Jagtap, Jaidip; Panigrahi, Prasanta K; Pradhan, Asima

    2011-01-01

    In this work, we report a wavelet-based multifractal study of images of dysplastic and neoplastic HE-stained human cervical tissues captured in the transmission mode when illuminated by laser light (He-Ne, 632.8 nm). It is well known that the morphological changes occurring during the progression of diseases like cancer manifest in their optical properties, which can be probed to differentiate the various stages of cancer. Here, we use the multi-resolution properties of the wavelet transform to analyze the optical changes. For this, we have used a novel laser imagery technique which provides us with a composite image of the absorption by the different cellular organelles. As the disease progresses, due to the growth of new cells, the ratio of organelle to cellular volume changes, manifesting in the laser imagery of such tissues. In order to develop a metric that can quantify the changes in such systems, we make use of wavelet-based fluctuation analysis. The changing self-similarity during di...
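
    A rough sketch of a wavelet-based fluctuation analysis is given below: it estimates how the q-th order moments of the wavelet detail coefficients scale across decomposition levels, where a q-dependence of the fitted exponent indicates multifractality. The abstract does not specify the exact estimator, so the wavelet family, levels, and q values are illustrative assumptions. Requires the PyWavelets package.

```python
# Scaling exponents from q-th order moments of wavelet detail coefficients.
# Requires: pip install PyWavelets
import numpy as np
import pywt

def wavelet_scaling_exponents(signal, wavelet="db4", levels=6, qs=(1, 2, 3)):
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    details = coeffs[1:][::-1]                  # detail coefficients, finest scale first
    scales = [2 ** (j + 1) for j in range(levels)]
    exponents = {}
    for q in qs:
        moments = [np.mean(np.abs(d) ** q) ** (1.0 / q) for d in details]
        slope, _ = np.polyfit(np.log(scales), np.log(moments), 1)
        exponents[q] = slope                    # q-dependence of the slope signals multifractality
    return exponents

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    row = rng.normal(size=4096)                 # stand-in for one line of a tissue image
    print(wavelet_scaling_exponents(row))
```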

  16. Analysis of Orthogonal Cutting of Aluminium-based Composites

    Directory of Open Access Journals (Sweden)

    P. Ravinder Reddy

    2002-10-01

    Full Text Available A turning test on aluminium-based metal-matrix composites (MMCs) (aluminium-30% silicon carbide) was performed with K-20 carbide tool material, and the wear patterns and wear land growth rates were analysed to evaluate the wear characteristics and to clarify the relationship between the physical (mechanical) properties and the flank wear of cutting tools. The study was also extended to the machining aspects and the width of cuts on MMCs and the influence of various cutting parameters. Experiments were conducted to measure the temperature along the cutting tool edge using a thermocouple at various cutting speeds and depths of cut, keeping the feed rate constant, while turning with the K-20 carbide cutting tool. The finite-element method was used to simulate the orthogonal cutting of aluminium-based MMCs. The heat generation at the chip-tool interface, the frictional heat generation at the tool flank, and the heat generation at the work-tool interface were calculated analytically and imposed as boundary conditions. A steady-state heat transfer analysis was carried out, and the temperature distributions at the cutting edge, shear zone, and interface regions are reported.

  17. Performance of Water-Based Liquid Scintillator: An Independent Analysis

    Directory of Open Access Journals (Sweden)

    D. Beznosko

    2014-01-01

    Full Text Available The water-based liquid scintillator (WbLS) is a new material currently under development. It is based on the idea of dissolving an organic scintillator in water using special surfactants. This material aims to enable novel detection techniques by combining Cerenkov rings and scintillation light, as well as to reduce the total cost compared to pure liquid scintillator (LS). An independent analysis of the light yield measurements conducted at Brookhaven National Laboratory, USA, using three different proton beam energies (210 MeV, 475 MeV, and 2000 MeV) for water, two different WbLS formulations (0.4% and 0.99%), and pure LS is presented. The results show that a goal of ~100 optical photons/MeV, indicated by simulation to be an optimal light yield for observing both the Cerenkov ring and the scintillation light from proton decay in a large water detector, has been achieved.

  18. Code Based Analysis for Object-Oriented Systems

    Institute of Scientific and Technical Information of China (English)

    Swapan Bhattacharya; Ananya Kanjilal

    2006-01-01

    The basic features of object-oriented software make it difficult to apply traditional testing methods to object-oriented systems. The Control Flow Graph (CFG) is a well-known model used for identification of independent paths in procedural software. This paper highlights the problem of constructing CFGs in object-oriented systems and proposes a new model named the Extended Control Flow Graph (ECFG) for code-based analysis of Object-Oriented (OO) software. The ECFG is a layered CFG whose nodes refer to methods rather than statements. A new metric, Extended Cyclomatic Complexity (E-CC), is developed, which is analogous to McCabe's Cyclomatic Complexity (CC) and refers to the number of independent execution paths within the OO software. The different ways in which the CFGs of individual methods are connected in an ECFG are presented, and formulas for E-CC in these different cases are proposed. Finally, we consider an example in Java and, based on its ECFG, apply these cases to arrive at the E-CC of the total system; we also propose a methodology for calculating the basis set, i.e., the set of independent paths for the OO system, which will help in the creation of test cases for code testing.
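
    For reference, the quantity that E-CC extends is McCabe's cyclomatic complexity, CC = E - N + 2P, computed from a control-flow graph. The sketch below evaluates that formula for a hypothetical method-level CFG; the paper's formulas for combining per-method values into an E-CC are specific to the ECFG connection cases and are not reproduced here.

```python
# McCabe's cyclomatic complexity from a control-flow graph: CC = E - N + 2P.
def cyclomatic_complexity(edges, nodes, components=1):
    """edges: iterable of (src, dst); nodes: iterable of node ids;
    components: number P of connected components (1 for a single method's CFG)."""
    return len(list(edges)) - len(set(nodes)) + 2 * components

if __name__ == "__main__":
    # CFG of a method with one if/else and one loop (hypothetical example)
    nodes = ["entry", "cond", "then", "else", "loop", "exit"]
    edges = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
             ("then", "loop"), ("else", "loop"),
             ("loop", "loop"), ("loop", "exit")]
    print("CC =", cyclomatic_complexity(edges, nodes))   # 7 - 6 + 2 = 3 independent paths
```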

  19. Object-based Analysis for Extraction of Dominant Tree Species

    Institute of Scientific and Technical Information of China (English)

    Meiyun; SHAO; Xia; JING; Lu; WANG

    2015-01-01

    As forests are of great significance for overall development, and sustainable development plans focus heavily on them, it is urgent to obtain information on their distribution, stock volume, and other related attributes; hence, the forest inventory program is on our schedule. To address the problem of extracting dominant tree species, we tested object-based image analysis, a currently popular method. Based on ALOS image data, we combined multi-resolution segmentation in the eCognition software with a fuzzy classification algorithm. By analyzing the segmentation results, we extracted the spruce, pine, birch, and oak of the study area. Both spectral and spatial characteristics were derived from the image objects, and with the help of the GLCM (gray-level co-occurrence matrix) we obtained the differences between the species. A confusion matrix was used for classification accuracy assessment against the actual ground data, and the method showed a comparatively good precision of 87% with a kappa coefficient of 0.837.
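
    The accuracy assessment step amounts to computing overall agreement and Cohen's kappa from the confusion matrix of predicted versus reference species. The sketch below shows that calculation; the matrix values are made up for illustration and are not the study's results.

```python
# Overall accuracy and Cohen's kappa from a confusion matrix.
import numpy as np

def overall_accuracy_and_kappa(cm: np.ndarray):
    total = cm.sum()
    po = np.trace(cm) / total                                   # observed agreement
    pe = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / total**2     # chance agreement
    kappa = (po - pe) / (1 - pe)
    return po, kappa

if __name__ == "__main__":
    # rows: reference classes, columns: predicted classes (spruce, pine, birch, oak)
    cm = np.array([[45, 3, 1, 1],
                   [4, 38, 2, 1],
                   [2, 2, 40, 3],
                   [1, 1, 2, 44]])
    acc, kappa = overall_accuracy_and_kappa(cm)
    print(f"overall accuracy = {acc:.3f}, kappa = {kappa:.3f}")
```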

  20. An Ultrasound Image Despeckling Approach Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Jawad F. Al-Asad

    2014-07-01

    Full Text Available An approach based on principal component analysis (PCA) to filter out multiplicative noise from ultrasound images is presented in this paper. An image with speckle noise is segmented into segments of small dyadic length, depending on the original size of the image, and the global covariance matrix is found. A projection matrix is then formed by selecting the eigenvectors of the global covariance matrix with the largest eigenvalues. This projection matrix is used to filter speckle noise by projecting each segment onto the signal subspace. The approach is based on the assumption that the signal and noise are independent and that the signal subspace is spanned by a subset of a few principal eigenvectors. When applied to simulated and real ultrasound images, the proposed approach outperformed some popular nonlinear denoising techniques such as 2D wavelets, 2D total variation filtering, and 2D anisotropic diffusion filtering in terms of edge preservation and maximum cleaning of speckle noise. It also showed lower sensitivity to outliers resulting from the log transformation of the multiplicative noise.
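
    A condensed sketch of PCA-based despeckling in the spirit of the approach above is given below: log-transform the multiplicative noise, project small patches onto the leading eigenvectors of their global covariance matrix, and reconstruct. The patch size, number of retained components, and use of square patches are assumptions; this is not the authors' exact pipeline.

```python
# PCA despeckling sketch: patch-wise projection onto the signal subspace.
import numpy as np

def pca_despeckle(img: np.ndarray, patch=8, n_components=4) -> np.ndarray:
    log_img = np.log1p(img.astype(float))                # multiplicative -> additive noise
    h, w = log_img.shape
    h, w = h - h % patch, w - w % patch
    patches = (log_img[:h, :w]
               .reshape(h // patch, patch, w // patch, patch)
               .transpose(0, 2, 1, 3)
               .reshape(-1, patch * patch))
    mean = patches.mean(axis=0)
    centered = patches - mean
    cov = centered.T @ centered / len(centered)           # global covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    basis = eigvecs[:, -n_components:]                    # signal subspace (top eigenvectors)
    denoised = centered @ basis @ basis.T + mean          # project patches onto the subspace
    out = (denoised.reshape(h // patch, w // patch, patch, patch)
                   .transpose(0, 2, 1, 3)
                   .reshape(h, w))
    return np.expm1(out)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    clean = np.tile(np.linspace(50, 200, 128), (128, 1))
    speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)  # multiplicative speckle
    print("noisy std:", speckled.std(), "-> filtered std:", pca_despeckle(speckled).std())
```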