WorldWideScience

Sample records for based model-free analysis

  1. Model-free functional MRI analysis using cluster-based methods

    Science.gov (United States)

    Otto, Thomas D.; Meyer-Baese, Anke; Hurdal, Monica; Sumners, DeWitt; Auer, Dorothee; Wismuller, Axel

    2003-08-01

    Conventional model-based or statistical analysis methods for functional MRI (fMRI) are easy to implement, and are effective in analyzing data with simple paradigms. However, they are not applicable in situations in which patterns of neural response are complicated and when the fMRI response is unknown. In this paper the "neural gas" network is adapted and rigorously studied for analyzing fMRI data. The algorithm supports spatial connectivity, aiding in the identification of activation sites in functional brain imaging. A comparison of this new method with Kohonen's self-organizing map and with a minimal free energy vector quantizer is done in a systematic fMRI study showing comparative quantitative evaluations. The most important findings in this paper are: (1) the "neural gas" network outperforms the other two methods in terms of detecting small activation areas, and (2) analysis based on the computed reference function likewise shows that the "neural gas" network outperforms the other two methods. The applicability of the new algorithm is demonstrated on experimental data.
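
    To make the clustering idea above concrete, the sketch below shows a generic rank-based "neural gas" update applied to voxel time courses. It is only an illustration of the algorithm family, not the authors' implementation; the codebook size, annealing schedule, and the omission of the spatial-connectivity handling are assumptions made here for brevity.

```python
# Minimal sketch of "neural gas" clustering of fMRI voxel time courses, assuming
# the data are already preprocessed and arranged as an (n_voxels, n_timepoints)
# array. Generic rank-based neural-gas update; spatial connectivity is omitted.
import numpy as np

def neural_gas(X, n_codebook=8, n_iter=10000, lam0=4.0, eps0=0.5, seed=0):
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), n_codebook, replace=False)].astype(float)  # codebook init
    for t in range(n_iter):
        x = X[rng.integers(len(X))]                      # random voxel time course
        frac = t / n_iter
        lam = lam0 * (0.01 / lam0) ** frac               # annealed neighborhood range
        eps = eps0 * (0.01 / eps0) ** frac               # annealed learning rate
        dists = np.linalg.norm(W - x, axis=1)
        ranks = np.argsort(np.argsort(dists))            # rank 0 = closest codebook vector
        W += eps * np.exp(-ranks / lam)[:, None] * (x - W)   # rank-based update
    # Assign each voxel to its nearest prototype (fine for a sketch; memory-heavy for real data).
    labels = np.argmin(np.linalg.norm(X[:, None, :] - W[None], axis=2), axis=1)
    return W, labels
```

    In such an analysis the learned codebook vectors serve as prototypical time courses, and clusters whose prototypes resemble the task reference function are read out as candidate activation sites.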

  2. Model-free model elimination: A new step in the model-free dynamic analysis of NMR relaxation data

    International Nuclear Information System (INIS)

    Model-free analysis is a technique commonly used within the field of NMR spectroscopy to extract atomic-resolution, interpretable dynamic information on multiple timescales from the R1, R2, and steady-state NOE. Model-free approaches employ two disparate areas of data analysis: the discipline of mathematical optimisation, specifically the minimisation of a χ2 function, and the statistical field of model selection. By searching through a large number of model-free minimisations, which were set up using synthetic relaxation data in which the true underlying dynamics are known, certain model-free models have been identified to, at times, fail. This has been characterised as either the internal correlation times, τe, τf, or τs, or the global correlation time parameter, local τm, heading towards infinity, the result being that the final parameter values are far from the true values. In a number of cases the minimised χ2 value of the failed model is significantly lower than that of all other models and, hence, it will be the model which is chosen by model selection techniques. If these models are not removed prior to model selection, the final model-free results could be far from the truth. By implementing a series of empirical rules involving inequalities, these models can be specifically isolated and removed. Model-free analysis should therefore consist of three distinct steps: model-free minimisation, model-free model elimination, and finally model-free model selection. Failure has also been identified to affect the individual Monte Carlo simulations used within error analysis. Each simulation involves an independent randomised relaxation data set and model-free minimisation, thus the simulations suffer from exactly the same types of failure as the model-free models. Therefore, to prevent these outliers from causing a significant overestimation of the errors, the failed Monte Carlo simulations need to be culled prior to calculating the parameter standard deviations.
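
    The elimination step described above can be pictured as a simple filter applied between minimisation and model selection; the sketch below is one hedged way to express it. The parameter names ('te', 'tf', 'ts', 'local_tm') and the bound values are chosen purely for illustration and are not fixed conventions of the cited work.

```python
# Illustrative sketch of model-free model elimination: discard fitted models whose
# correlation-time parameters have run to (or near) the optimisation bounds,
# signalling the failure mode described above. Bound values and parameter names
# are assumptions for illustration only.
TE_MAX = 200e-9      # assumed upper bound for internal correlation times tau_e/tau_f/tau_s (s)
TM_MAX = 200e-9      # assumed upper bound for the local global correlation time tau_m (s)

def eliminate_failed_models(fits):
    """fits: list of dicts like {'model': 'm4', 'chi2': 12.3, 'params': {'te': 4e-11, ...}}."""
    kept = []
    for fit in fits:
        p = fit["params"]
        times = [p[k] for k in ("te", "tf", "ts") if p.get(k) is not None]
        failed = any(t >= 0.99 * TE_MAX for t in times)
        if p.get("local_tm") is not None and p["local_tm"] >= 0.99 * TM_MAX:
            failed = True
        if not failed:
            kept.append(fit)
    return kept   # pass only the surviving models on to model selection (e.g. AIC)
```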

  3. Model-free data analysis for source separation based on Non-Negative Matrix Factorization and k-means clustering (NMFk)

    Science.gov (United States)

    Vesselinov, V. V.; Alexandrov, B.

    2014-12-01

    The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. Source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, or the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones record the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones records a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site, identify the dominant sources as barometric-pressure variations and water-supply pumping, and estimate their respective impacts.
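
    A minimal sketch of the NMFk idea using scikit-learn is given below: NMF is run repeatedly from random restarts for a range of candidate source numbers, the resulting source signatures are clustered with k-means, and candidate ranks whose clusters are stable and reconstruct the data well are favoured. The candidate ranks, the number of restarts, and the use of the silhouette score as the stability measure are illustrative assumptions, not the authors' exact procedure.

```python
# Conceptual sketch of NMFk: factorize the observed mixtures X (observation points
# x time, non-negative) with NMF for a range of candidate source numbers r, cluster
# the source signatures from repeated random restarts with k-means, and favour r
# values with stable clusters (high silhouette) and low reconstruction error.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

def nmfk(X, r_values=(2, 3, 4, 5), n_restarts=10, seed=0):
    results = {}
    for r in r_values:
        signatures, err = [], 0.0
        for i in range(n_restarts):
            model = NMF(n_components=r, init="random", max_iter=1000,
                        random_state=seed + i)
            model.fit(X)
            signatures.append(model.components_)   # r candidate source signals
            err += model.reconstruction_err_
        S = np.vstack(signatures)                   # (n_restarts * r) candidate sources
        S = S / (np.linalg.norm(S, axis=1, keepdims=True) + 1e-12)
        labels = KMeans(n_clusters=r, n_init=10, random_state=seed).fit_predict(S)
        results[r] = {"silhouette": silhouette_score(S, labels),
                      "reconstruction_error": err / n_restarts}
    return results   # pick r with high silhouette and low reconstruction error
```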

  4. Dopamine enhances model-based over model-free choice behavior.

    OpenAIRE

    Wunderlich, K; Smittenaar, P.; Dolan, R J

    2012-01-01

    Decision making is often considered to arise out of contributions from a model-free habitual system and a model-based goal-directed system. Here, we investigated the effect of a dopamine manipulation on the degree to which either system contributes to instrumental behavior in a two-stage Markov decision task, which has been shown to discriminate model-free from model-based control. We found that increased dopamine levels promote model-based over model-free choice.

  5. A novel model-free data analysis technique based on clustering in a mutual information space: application to resting-state fMRI

    Directory of Open Access Journals (Sweden)

    Simon Benjaminsson

    2010-08-01

    Non-parametric, data-driven analysis techniques can be used to study datasets with few assumptions about the data and the underlying experiment. Variations of Independent Component Analysis (ICA) have been the methods most commonly used on fMRI data, e.g. in finding resting-state networks thought to reflect the connectivity of the brain. Here we present a novel data analysis technique and demonstrate it on resting-state fMRI data. It is a generic method with few underlying assumptions about the data. The results are built from the statistical relations between all input voxels, resulting in a whole-brain analysis at the voxel level. It has good scalability properties, and the parallel implementation is capable of handling large datasets and databases. From the mutual information between the activities of the voxels over time, a distance matrix is created for all voxels in the input space. Multidimensional scaling is used to place the voxels in a lower-dimensional space reflecting the dependency relations based on the distance matrix. By performing clustering in this space we can find the strong statistical regularities in the data, which for the resting-state data turn out to be the resting-state networks. The decomposition is performed in the last step of the algorithm and is computationally simple. This opens up rapid analysis and visualization of the data at different spatial levels, as well as automatically finding a suitable number of decomposition components.
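
    The pipeline described above (mutual information between voxel time courses, a derived distance matrix, multidimensional scaling, clustering) can be sketched as follows. The histogram-based MI estimate, the 1/(1+MI) distance, and the specific dimensionality and cluster counts are illustrative choices rather than those of the cited implementation, and the quadratic pairwise loop stands in for the parallel implementation mentioned above.

```python
# Sketch of a mutual-information clustering pipeline: MI between voxel time courses
# -> distance matrix -> multidimensional scaling -> k-means clustering.
import numpy as np
from sklearn.metrics import mutual_info_score
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

def mi_cluster(X, n_bins=16, n_dims=5, n_clusters=8, seed=0):
    """X: (n_voxels, n_timepoints) array of voxel time courses (small demo sizes only)."""
    # Discretize each time course for a simple histogram-based MI estimate.
    B = np.stack([np.digitize(x, np.histogram_bin_edges(x, bins=n_bins)[1:-1]) for x in X])
    n = len(B)
    D = np.zeros((n, n))
    for i in range(n):                      # O(n^2) pairwise loop; parallelized in practice
        for j in range(i + 1, n):
            mi = mutual_info_score(B[i], B[j])
            D[i, j] = D[j, i] = 1.0 / (1.0 + mi)   # high MI -> small distance
    emb = MDS(n_components=n_dims, dissimilarity="precomputed",
              random_state=seed).fit_transform(D)
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(emb)
```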

  6. The effect of polymer matrices on the thermal hazard properties of RDX-based PBXs by using model-free and combined kinetic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Qi-Long, E-mail: terry.well@163.com [Institute of Energetic Materials, Faculty of Chemical Technology, University of Pardubice, 53210 Pardubice (Czech Republic); Instituto de Ciencia de Materiales de Sevilla, CSIC-Universidad de Sevilla, C. Américo Vespucio No. 49, 41092 Sevilla (Spain); Zeman, Svatopluk, E-mail: svatopluk.zeman@upce.cz [Institute of Energetic Materials, Faculty of Chemical Technology, University of Pardubice, 53210 Pardubice (Czech Republic); Sánchez Jiménez, P.E. [Instituto de Ciencia de Materiales de Sevilla, CSIC-Universidad de Sevilla, C. Américo Vespucio No. 49, 41092 Sevilla (Spain); Zhao, Feng-Qi [Science and Technology on Combustion and Explosion Laboratory, Xi’an Modern Chemistry Research Institute, 710065 Xi’an (China); Pérez-Maqueda, L.A. [Instituto de Ciencia de Materiales de Sevilla, CSIC-Universidad de Sevilla, C. Américo Vespucio No. 49, 41092 Sevilla (Spain); Málek, Jiří [Department of Physical Chemistry, Faculty of Chemical Technology, University of Pardubice, 53210 Pardubice (Czech Republic)

    2014-04-01

    Highlights:
    • Non-isothermal decomposition kinetics of RDX and its PBXs have been investigated.
    • The kinetic models are determined by both master-plot and combined kinetic analysis methods.
    • Constant-rate temperature profiles and isothermal curves are predicted from the obtained kinetic triplets.
    • The storage safety parameters are simulated based on thermal explosion theory.

    Abstract: In this paper, the decomposition reaction models and thermal hazard properties of 1,3,5-trinitro-1,3,5-triazinane (RDX) and its PBXs bonded by Formex P1, Semtex 1A, C4, Viton A and Fluorel polymer matrices have been investigated based on isoconversional and combined kinetic analysis methods. The established kinetic triplets are used to predict the constant decomposition rate temperature profiles, the critical radius for thermal explosion and the isothermal behavior at a temperature of 82 °C. It has been found that the effect of the polymer matrices on the decomposition mechanism of RDX is significant, resulting in very different reaction models. Formex P1, Semtex and C4 make the decomposition process of RDX follow a phase-boundary-controlled reaction mechanism, whereas Viton A and Fluorel shift its reaction model to a two-dimensional Avrami–Erofeev nucleation-and-growth model. According to the isothermal simulations, the threshold cook-off time until loss of functionality at 82 °C is less than 500 days for RDX-C4 and RDX-FM, while it is more than 700 days for the others. In contrast to what the simulated isothermal curves alone would suggest, when the charge properties and heat of decomposition are taken into account, RDX-FM and RDX-C4 are better than RDX-SE in terms of storage safety at arbitrary surrounding temperatures.

  7. The effect of polymer matrices on the thermal hazard properties of RDX-based PBXs by using model-free and combined kinetic analysis.

    Science.gov (United States)

    Yan, Qi-Long; Zeman, Svatopluk; Sánchez Jiménez, P E; Zhao, Feng-Qi; Pérez-Maqueda, L A; Málek, Jiří

    2014-04-30

    In this paper, the decomposition reaction models and thermal hazard properties of 1,3,5-trinitro-1,3,5-triazinane (RDX) and its PBXs bonded by Formex P1, Semtex 1A, C4, Viton A and Fluorel polymer matrices have been investigated based on isoconversional and combined kinetic analysis methods. The established kinetic triplets are used to predict the constant decomposition rate temperature profiles, the critical radius for thermal explosion and the isothermal behavior at a temperature of 82 °C. It has been found that the effect of the polymer matrices on the decomposition mechanism of RDX is significant, resulting in very different reaction models. Formex P1, Semtex and C4 make the decomposition process of RDX follow a phase-boundary-controlled reaction mechanism, whereas Viton A and Fluorel shift its reaction model to a two-dimensional Avrami-Erofeev nucleation-and-growth model. According to the isothermal simulations, the threshold cook-off time until loss of functionality at 82 °C is less than 500 days for RDX-C4 and RDX-FM, while it is more than 700 days for the others. In contrast to what the simulated isothermal curves alone would suggest, when the charge properties and heat of decomposition are taken into account, RDX-FM and RDX-C4 are better than RDX-SE in terms of storage safety at arbitrary surrounding temperatures. PMID:24657941

  8. Application of model-free kinetics to the study of dehydration of fly ash-based zeolite

    International Nuclear Information System (INIS)

    In the present paper, the dehydration kinetics of zeolite Na-A synthesized from fly ash was investigated by means of thermogravimetric analysis. The Na-A zeolite was formed from coal fly ash by fusion with sodium hydroxide and subsequent hydrothermal treatment at 100 °C after an induction period. The model-free kinetic method was applied to calculate the activation energy of the dehydration process of the fly ash-based zeolite as a function of conversion and temperature. The Vyazovkin model-free kinetic method also enabled determination of the time necessary to remove water molecules from the zeolite structure at a given temperature.
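
    As an illustration of what a model-free (isoconversional) kinetic calculation involves, the sketch below estimates the activation energy as a function of conversion from thermogravimetric runs at several heating rates using the Ozawa-Flynn-Wall approximation. The cited study used the Vyazovkin method, so this is a simplified stand-in for the idea rather than a reproduction of it; the conversion grid is an arbitrary choice.

```python
# Model-free (isoconversional) activation energy vs. conversion from TGA runs at
# several heating rates, via the Ozawa-Flynn-Wall approximation
# ln(beta) ~ const - 1.052 * Ea / (R * T_alpha).
import numpy as np
from scipy.stats import linregress

R = 8.314  # gas constant, J/(mol K)

def ofw_activation_energy(runs, alphas=np.linspace(0.1, 0.9, 17)):
    """runs: list of (beta, T, alpha) tuples, with heating rate beta (K/min) and
    arrays T (K) and alpha (conversion, monotonically increasing)."""
    E = []
    for a in alphas:
        inv_T = [1.0 / np.interp(a, alpha, T) for beta, T, alpha in runs]
        log_beta = [np.log(beta) for beta, _, _ in runs]
        slope = linregress(inv_T, log_beta).slope
        E.append(-slope * R / 1.052)   # apparent activation energy (J/mol) at this conversion
    return alphas, np.array(E)
```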

  9. The use of model selection in the model-free analysis of protein dynamics

    International Nuclear Information System (INIS)

    Model-free analysis of NMR relaxation data, which is widely used for the study of protein dynamics, consists of the separation of the global rotational diffusion from internal motions relative to the diffusion frame and the description of these internal motions by amplitude and timescale. Five model-free models exist, each of which describes a different type of motion. Model-free analysis requires the selection of the model which best describes the dynamics of the NH bond. It will be demonstrated that the model selection technique currently used has two significant flaws: under-fitting, and failing to select a model when one ought to be selected. Under-fitting breaks the principle of parsimony, causing bias in the final model-free results, visible as an overestimation of S2 and an underestimation of τe and Rex. As a consequence the protein falsely appears to be more rigid than it actually is. Model selection has been extensively developed in other fields. The techniques known as Akaike's Information Criterion (AIC), small-sample-size corrected AIC (AICc), the Bayesian Information Criterion (BIC), bootstrap methods, and cross-validation will be compared to the currently used technique. To analyse the variety of techniques, synthetic noisy data covering all model-free motions were created. The data consist of two types of three-dimensional grid: the Rex grids covering single motions with chemical exchange {S2, τe, Rex}, and the Double Motion grids covering two internal motions {Sf2, Ss2, τs}. The conclusion of the comparison is that, for accurate model-free results, AIC model selection is essential. As the method neither under- nor over-fits, AIC is the best tool for applying Occam's razor and has the additional benefits of simplifying and speeding up model-free analysis.
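
    The AIC criterion advocated above is simple to state for χ2 fitting: AIC = χ2 + 2k, where k is the number of fitted parameters, and the model with the lowest value is retained. A minimal sketch with an illustrative model list follows; the particular χ2 values and parameter counts are made up for the example.

```python
# Minimal sketch of AIC-based selection among fitted model-free models:
# AIC = chi2 + 2k, lowest AIC wins. Values below are illustrative only.
def select_model_aic(fits):
    """fits: list of dicts like {'model': 'm2', 'chi2': 8.1, 'k': 2}."""
    aic = {f["model"]: f["chi2"] + 2 * f["k"] for f in fits}
    return min(aic, key=aic.get), aic

best, scores = select_model_aic([
    {"model": "m1", "chi2": 25.0, "k": 1},   # {S2}
    {"model": "m2", "chi2": 8.1,  "k": 2},   # {S2, te}
    {"model": "m4", "chi2": 7.9,  "k": 3},   # {S2, te, Rex}
])
# m2 wins: the extra Rex parameter of m4 does not buy enough chi2 reduction.
```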

  10. Pitch control of wind turbines using model free adaptive control based on wind turbine code

    DEFF Research Database (Denmark)

    Zhang, Yunqian; Chen, Zhe; Cheng, Ming;

    2011-01-01

    As the wind turbine is a nonlinear high-order system, to achieve good pitch control performance, a model free adaptive control (MFAC) approach, which does not need a mathematical model of the wind turbine, is adopted in the pitch control system in this paper. A pseudo gradient vector, whose estimated value is based only on I/O data of the wind turbine, is identified, and the wind turbine system is then replaced by a dynamic linear time-varying model. In order to verify the correctness and robustness of the proposed model free adaptive pitch controller, the wind turbine code FAST, which can predict the...
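
    For orientation, the sketch below shows a textbook compact-form MFAC loop: a scalar pseudo-partial-derivative is estimated from input/output increments only and drives an incremental control law. The gains, the reset rule, and the single-input single-output setting are assumptions made here for illustration; the pitch controller in the cited work is not necessarily this exact scheme.

```python
# Compact-form MFAC sketch: data-driven pseudo-partial-derivative estimate plus
# incremental control law. Gains and reset logic are illustrative.
class CompactFormMFAC:
    def __init__(self, eta=0.5, mu=1.0, rho=0.5, lam=1.0, phi0=1.0):
        self.eta, self.mu, self.rho, self.lam = eta, mu, rho, lam
        self.phi = phi0                 # pseudo-partial-derivative estimate
        self.u_prev = 0.0
        self.y_prev = 0.0
        self.du_prev = 0.0

    def step(self, y, y_ref):
        dy = y - self.y_prev
        # Update the pseudo-partial-derivative from I/O increments only.
        denom = self.mu + self.du_prev ** 2
        self.phi += self.eta * self.du_prev / denom * (dy - self.phi * self.du_prev)
        if abs(self.phi) < 1e-5:        # simple reset to keep the estimate usable
            self.phi = 1.0
        # Control increment driving the output toward the reference.
        du = self.rho * self.phi / (self.lam + self.phi ** 2) * (y_ref - y)
        u = self.u_prev + du
        self.u_prev, self.y_prev, self.du_prev = u, y, du
        return u
```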

  11. Gaze data reveal distinct choice processes underlying model-based and model-free reinforcement learning.

    Science.gov (United States)

    Konovalov, Arkady; Krajbich, Ian

    2016-01-01

    Organisms appear to learn and make decisions using different strategies known as model-free and model-based learning; the former is mere reinforcement of previously rewarded actions and the latter is a forward-looking strategy that involves evaluation of action-state transition probabilities. Prior work has used neural data to argue that both model-based and model-free learners implement a value comparison process at trial onset, but model-based learners assign more weight to forward-looking computations. Here using eye-tracking, we report evidence for a different interpretation of prior results: model-based subjects make their choices prior to trial onset. In contrast, model-free subjects tend to ignore model-based aspects of the task and instead seem to treat the decision problem as a simple comparison process between two differentially valued items, consistent with previous work on sequential-sampling models of decision making. These findings illustrate a problem with assuming that experimental subjects make their decisions at the same prescribed time. PMID:27511383

  12. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    Directory of Open Access Journals (Sweden)

    Anya Skatova

    2013-09-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, versus another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans’ scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes.
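
    The balance between the two strategies in this kind of task is commonly quantified with a weighting parameter that mixes model-based and model-free first-stage action values; the sketch below illustrates that standard hybrid construction. The transition matrix, learning rate, and simplified update rules are assumptions made here for illustration and may differ from the model actually fitted in the cited study.

```python
# Sketch of the standard hybrid analysis of a two-step task: a weight w mixes
# model-based first-stage values (computed from the known transition probabilities
# and learned second-stage values) with model-free values learned by a simple
# temporal-difference update.
import numpy as np

P = np.array([[0.7, 0.3],     # assumed P(second-stage state | first-stage action)
              [0.3, 0.7]])

class HybridAgent:
    def __init__(self, w=0.5, alpha=0.2):
        self.w, self.alpha = w, alpha
        self.q_mf = np.zeros(2)      # model-free values of the two first-stage actions
        self.q_stage2 = np.zeros(2)  # learned values of the two second-stage states

    def first_stage_values(self):
        q_mb = P @ self.q_stage2                     # model-based lookahead
        return self.w * q_mb + (1 - self.w) * self.q_mf

    def update(self, action, state2, reward):
        self.q_stage2[state2] += self.alpha * (reward - self.q_stage2[state2])
        self.q_mf[action] += self.alpha * (reward - self.q_mf[action])  # simplified one-step MF update
```

    Fitting w per participant is what allows a study like this one to ask whether a trait such as extraversion shifts the balance toward one strategy or the other.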

  13. Model free approach to kinetic analysis of real-time hyperpolarized 13C magnetic resonance spectroscopy data.

    Directory of Open Access Journals (Sweden)

    Deborah K Hill

    Real-time detection of the rates of metabolic flux, or exchange rates of endogenous enzymatic reactions, is now feasible in biological systems using Dynamic Nuclear Polarization Magnetic Resonance. Derivation of reaction rate kinetics from this technique typically requires multi-compartmental modeling of dynamic data, and results are therefore model-dependent and prone to misinterpretation. We present a model-free formalism based on the ratio of the total areas under the curve (AUC) of the injected and product metabolites, for example pyruvate and lactate. A theoretical framework to support this novel analysis approach is described, and demonstrates that the AUC ratio is proportional to the forward rate constant k. We show that the model-free approach strongly correlates with k for whole-cell in vitro experiments across a range of cancer cell lines, and detects response in cells treated with the pan-class I PI3K inhibitor GDC-0941 with comparable or greater sensitivity. The same result is seen in vivo with tumor xenograft-bearing mice, in control tumors and following drug treatment with dichloroacetate. An important finding is that the area under the curve is independent of both the input function and of any other metabolic pathways arising from the injected metabolite. This model-free approach provides a robust and clinically relevant alternative to kinetic model-based rate measurements in the clinical translation of hyperpolarized (13)C metabolic imaging in humans, where measurement of the input function can be problematic.
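
    The model-free readout described above reduces to a ratio of two areas under the curve; a minimal sketch follows, with a hand-rolled trapezoidal rule so that no particular integration routine is assumed.

```python
# AUC-ratio readout: area under the product curve divided by area under the
# injected-substrate curve, used as a proxy proportional to the forward rate
# constant k. Trapezoidal integration is an illustrative choice.
import numpy as np

def _auc(t, y):
    t = np.asarray(t, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.sum(np.diff(t) * (y[1:] + y[:-1]) / 2.0))   # trapezoidal rule

def auc_ratio(t, substrate, product):
    """t, substrate, product: 1D arrays of time points and signal intensities."""
    return _auc(t, product) / _auc(t, substrate)
```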

  14. Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems

    Directory of Open Access Journals (Sweden)

    Zhe Dong

    2016-01-01

    The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers a simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to their small size, MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on a multi-modular operation scheme, the inherent safety feature of the MHTGRs can be made applicable to large nuclear plants of any desired power rating. The MHTGR-based nuclear steam supply system (NSSS) is constituted by an MHTGR, a side-by-side arranged helical-coil once-through steam generator (OTSG) and some connecting pipes. Due to the side-by-side arrangement, there is a tight coupling effect between the MHTGR and the OTSG. Moreover, there are always parameter perturbations in the NSSSs. Thus, it is meaningful to study model-free coordinated control of MHTGR-based NSSSs for safe, stable, robust and efficient operation. In this paper, a new model-free coordinated control strategy that regulates the nuclear power, the MHTGR outlet helium temperature and the OTSG outlet overheated steam temperature by properly adjusting the control rod position, the helium flowrate and the feed-water flowrate is established for the MHTGR-based NSSSs. Sufficient conditions for global asymptotic closed-loop stability are given. Finally, numerical simulation results in the cases of large-range power decrease and increase illustrate the satisfactory performance of this newly-developed model-free coordinated NSSS control law.

  15. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Highlights:
    • “Plug-and-play” Building Optimization and Control (BOC) driven by building data.
    • Ability to handle the large-scale and complex nature of the BOC problem.
    • Adaptation to learn the optimal BOC policy when no building model is available.
    • Comparisons with rule-based and advanced BOC strategies.
    • Simulation and real-life experiments in a ten-office building.

    Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need for qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization (abbreviated as PCAO) can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution.

  16. Model-free prediction and regression a transformation-based approach to inference

    CERN Document Server

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...

  17. Model-free methods of analyzing domain motions in proteins from simulation: A comparison of normal mode analysis and molecular dynamics simulation of lysozyme

    NARCIS (Netherlands)

    Hayward, S.; Kitao, A.; Berendsen, H.J.C.

    1997-01-01

    Model-free methods are introduced to determine quantities pertaining to protein domain motions from normal mode analyses and molecular dynamics simulations. For the normal mode analysis, the methods are based on the assumption that in low-frequency modes, domain motions can be well approximated by m

  18. A Model-free Approach to Fault Detection of Continuous-time Systems Based on Time Domain Data

    Institute of Scientific and Technical Information of China (English)

    Ping Zhang; Steven X. Ding

    2007-01-01

    In this paper, a model-free approach is presented to design an observer-based fault detection system for linear continuous-time systems based on input and output data in the time domain. The core of the approach is to directly identify the parameters of the observer-based residual generator based on a numerically reliable data equation obtained by filtering and sampling the input and output signals.

  19. Landmark-based model-free 3D face shape reconstruction from video sequences

    NARCIS (Netherlands)

    Dam, van Chris; Veldhuis, Raymond; Spreeuwers, Luuk; Broemme, A.; Busch, C.

    2013-01-01

    In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence potentially useful video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm b

  20. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    OpenAIRE

    Skatova, Anya; Chan, Patricia A.; Daw, Nathaniel D.

    2013-01-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, vs....

  1. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    OpenAIRE

    Anya Skatova; Patricia Angie Chan; Daw, Nathaniel D.

    2013-01-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, ver...

  2. Model-free functional MRI analysis using improved fuzzy cluster analysis techniques

    Science.gov (United States)

    Lange, Oliver; Meyer-Baese, Anke; Wismueller, Axel; Hurdal, Monica; Sumners, DeWitt; Auer, Dorothee

    2004-04-01

    Conventional model-based or statistical analysis methods for functional MRI (fMRI) are easy to implement, and are effective in analyzing data with simple paradigms. However, they are not applicable in situations in which patterns of neural response are complicated and when the fMRI response is unknown. In this paper the Gath-Geva algorithm is adapted and rigorously studied for analyzing fMRI data. The algorithm supports spatial connectivity, aiding in the identification of activation sites in functional brain imaging. A comparison of this new method with the fuzzy n-means algorithm, Kohonen's self-organizing map, the fuzzy n-means algorithm with unsupervised initialization, the minimal free energy vector quantizer and the "neural gas" network is done in a systematic fMRI study showing comparative quantitative evaluations. The most important findings in the paper are: (1) the Gath-Geva algorithm outperforms all other clustering methods for a large number of codebook vectors in terms of detecting small activation areas, and (2) for a smaller number of codebook vectors the fuzzy n-means algorithm with unsupervised initialization outperforms all other techniques. The applicability of the new algorithm is demonstrated on experimental data.

  3. Kinetic analysis of the polymer burnout in ceramic thermoplastic processing of the YSZ thin electrolyte structures using model free method

    OpenAIRE

    Salehi, Mehdi; Clemens, Frank; Graule, Thomas; Grobéty, Bernard

    2012-01-01

    Polymeric binder burnout during thermoplastic processing of yttria-stabilized zirconia (YSZ) ceramics was analyzed using thermogravimetric analysis (TGA). The debinding kinetics of the stearic acid/polystyrene binder have been described using model-free methods and compared with the decomposition rate of the pure polymers. The apparent activation energy Eα as a function of debinding progress α was calculated in two atmospheres (argon and air) by three different methods: Ozawa–Flynn–Wall (OFW...

  4. Thermokinetic analysis of biomass based on a model-free multiple heating rate method

    Institute of Scientific and Technical Information of China (English)

    田宜水; 王茹

    2016-01-01

    To study the thermokinetics of typical biomass feedstocks, identify the reaction mechanism and obtain the kinetic rate parameters, pyrolysis experiments at different heating rates under a nitrogen atmosphere were carried out by thermogravimetric analysis on corn straw, wheat straw, cotton stalk, pine sawdust, peanut shell and sweet sorghum residue. The activation energy was calculated by the Friedman and Flynn-Wall-Ozawa methods, the most probable mechanism function was determined by the Malek method, a thermo-analytical kinetic model of the biomass was established, and the differences between the feedstocks were discussed. The results show that biomass pyrolysis comprises three main stages: drying and preheating, devolatilization, and charring. The activation energy of typical biomass increases with conversion; in the devolatilization stage the pyrolysis activation energy lies between 144.61 and 167.34 kJ/mol. The reaction kinetics of all samples follow the Avrami-Erofeev function, although the reaction order differs somewhat, and the pre-exponential factor lies between 26.66 and 33.97 s-1. These results provide a theoretical basis for optimizing the process conditions and for engineering scale-up of biomass thermochemical conversion.

    Thermokinetic analysis examines the relationship between the physical and chemical properties of a material and temperature by controlling the heating rate. Through thermokinetic analysis, one can study the combustion, pyrolysis and gasification reaction kinetics of biomass, determine the reaction kinetics model and calculate the reaction kinetics parameters, such as the activation energy and pre-exponential factor. In this article, 6 kinds of biomass raw materials were chosen, including corn straw, wheat straw, cotton stalk, pine sawdust, peanut shell, and sweet sorghum residue. Thermogravimetric (TG) experiments were carried out, and 8 mass-loss curves were obtained under non-isothermal conditions at linear heating rates of 5, 10, 20 and 30 °C/min. Nitrogen of 99.99% purity was passed continuously, and the temperature was raised from room temperature to 600 °C. The initial sample weight was always within the range of 3-4 mg. The method of different heating rates was applied to the non-isothermal data. The Friedman method and the

  5. Kinetics of the Thermal Degradation of Granulated Scrap Tyres: a Model-free Analysis

    Directory of Open Access Journals (Sweden)

    Félix A. LÓPEZ

    2013-12-01

    Pyrolysis is a technology with a promising future in the recycling of scrap tyres. This paper determines the thermal decomposition behaviour and kinetics of granulated scrap tyres (GST) by examining the thermogravimetric/derivative thermogravimetric (TGA/DTG) data obtained during their pyrolysis in an inert atmosphere at different heating rates. The model-free methods of Friedman, Flynn-Wall-Ozawa and Coats-Redfern were used to determine the reaction kinetics from the DTG data. The apparent activation energy and pre-exponential factor for the degradation of GST were calculated. A comparison with the results obtained by other authors was made. DOI: http://dx.doi.org/10.5755/j01.ms.19.4.2947

  7. Model-free analysis of quadruply imaged gravitationally lensed systems and substructured galaxies

    CERN Document Server

    Woldesenbet, Addishiwot Girma

    2015-01-01

    Multiple image gravitational lens systems, and especially quads are invaluable in determining the amount and distribution of mass in galaxies. This is usually done by mass modeling using parametric or free-form methods. An alternative way of extracting information about lens mass distribution is to use lensing degeneracies and invariants. Where applicable, they allow one to make conclusions about whole classes of lenses without model fitting. Here, we use approximate, but observationally useful invariants formed by the three relative polar angles of quad images around the lens center to show that many smooth elliptical+shear lenses can reproduce the same set of quad image angles within observational error. This result allows us to show in a model-free way what the general class of smooth elliptical+shear lenses looks like in the three dimensional (3D) space of image relative angles, and that this distribution does not match that of the observed quads. We conclude that, even though smooth elliptical+shear lens...

  8. Vision-Based Autonomous Underwater Vehicle Navigation in Poor Visibility Conditions Using a Model-Free Robust Control

    Directory of Open Access Journals (Sweden)

    Ricardo Pérez-Alcocer

    2016-01-01

    This paper presents a vision-based navigation system for an autonomous underwater vehicle in semistructured environments with poor visibility. In terrestrial and aerial applications, the use of visual systems mounted on robotic platforms as control sensor feedback is commonplace. However, robotic vision-based tasks for underwater applications are still not widely considered, as the images captured in this type of environment tend to be blurred and/or color depleted. To tackle this problem, we have adapted the lαβ color space to identify features of interest in underwater images even in extreme visibility conditions. To guarantee the stability of the vehicle at all times, a model-free robust control is used. We have validated the performance of our visual navigation system in real environments, showing the feasibility of our approach.

  9. Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems

    OpenAIRE

    Zhe Dong

    2016-01-01

    The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to its small size, the MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the mult...

  10. Model-Free Visualization of Suspicious Lesions in Breast MRI Based on Supervised and Unsupervised Learning.

    Science.gov (United States)

    Twellmann, Thorsten; Meyer-Baese, Anke; Lange, Oliver; Foo, Simon; Nattkemper, Tim W

    2008-03-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) has become an important tool in breast cancer diagnosis, but evaluation of multitemporal 3D image data holds new challenges for human observers. To aid the image analysis process, we apply supervised and unsupervised pattern recognition techniques for computing enhanced visualizations of suspicious lesions in breast MRI data. These techniques represent an important component of future sophisticated computer-aided diagnosis (CAD) systems and support the visual exploration of spatial and temporal features of DCE-MRI data stemming from patients with confirmed lesion diagnosis. By taking into account the heterogeneity of cancerous tissue, these techniques reveal signals with malignant, benign and normal kinetics. They also provide a regional subclassification of pathological breast tissue, which is the basis for pseudo-color presentations of the image data. Intelligent medical systems are expected to have substantial implications in healthcare politics by contributing to the diagnosis of indeterminate breast lesions by non-invasive imaging. PMID:19255616

  11. Model-free fuzzy control of a magnetorheological elastomer vibration isolation system: analysis and experimental evaluation

    Science.gov (United States)

    Fu, Jie; Li, Peidong; Wang, Yuan; Liao, Guanyao; Yu, Miao

    2016-03-01

    This paper addresses the problem of micro-vibration control of a precision vibration isolation system with a magnetorheological elastomer (MRE) isolator and a fuzzy control strategy. Firstly, a polyurethane-matrix MRE isolator working in the shear-compression mixed mode is introduced. Its dynamic characteristic is experimentally tested, and the range of the frequency shift and the model parameters of the MRE isolator are obtained from the experimental results. Secondly, a new semi-active control law is proposed, which uses the isolation structure displacement and the relative displacement between the isolation structure and the base as inputs. Considering the nonlinearity of the MRE isolator and the excitation uncertainty of an isolation system, the designed semi-active fuzzy logic controller (FLC) is independent of a system model and is robust. Finally, numerical simulations and experiments are conducted to evaluate the performance of the FLC with single-frequency and multiple-frequency excitation, respectively; the experimental results show that the acceleration transmissibility is reduced by up to 54.04%, which verifies the effectiveness of the designed semi-active FLC. Moreover, the advantages of the approach are demonstrated in comparison to passive control and ON-OFF control.

  12. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data.

    Directory of Open Access Journals (Sweden)

    Markus Helmer

    Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model, e.g. a Gaussian or other bell-shaped curve, to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even
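
    As a concrete illustration of extracting tuning-curve features without fitting a parametric model, the sketch below computes a preferred direction as the response-weighted circular average and an attentional gain index as a ratio of summed responses. These particular feature definitions are assumptions made here and are not necessarily the data-driven measures used in the cited work.

```python
# Model-free tuning-curve features extracted directly from measured responses.
import numpy as np

def preferred_direction(directions_deg, responses):
    """Circular (vector-average) estimate of the preferred stimulus direction."""
    theta = np.deg2rad(np.asarray(directions_deg, dtype=float))
    vec = np.sum(np.asarray(responses, dtype=float) * np.exp(1j * theta))  # response-weighted vector sum
    return np.rad2deg(np.angle(vec)) % 360.0

def attentional_gain(responses_attended, responses_unattended):
    """Simple gain index: ratio of summed responses across attention conditions."""
    return np.sum(responses_attended) / np.sum(responses_unattended)
```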

  13. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data

    Science.gov (United States)

    Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian

    2016-01-01

    Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model, e.g. a Gaussian or other bell-shaped curve, to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus

  15. Dynamics of GCN4 facilitate DNA interaction: a model-free analysis of an intrinsically disordered region.

    Science.gov (United States)

    Gill, Michelle L; Byrd, R Andrew; Palmer III, Arthur G

    2016-02-17

    Intrinsically disordered proteins (IDPs) and proteins with intrinsically disordered regions (IDRs) are known to play important roles in regulatory and signaling pathways. A critical aspect of these functions is the ability of IDP/IDRs to form highly specific complexes with target molecules. However, elucidation of the contributions of conformational dynamics to function has been limited by challenges associated with structural heterogeneity of IDP/IDRs. Using NMR spin relaxation parameters ((15)N R1, (15)N R2, and {(1)H}-(15)N heteronuclear NOE) collected at four static magnetic fields ranging from 14.1 to 21.1 T, we have analyzed the backbone dynamics of the basic leucine-zipper (bZip) domain of the Saccharomyces cerevisiae transcription factor GCN4, whose DNA binding domain is intrinsically disordered in the absence of DNA substrate. We demonstrate that the extended model-free analysis can be applied to proteins with IDRs such as apo GCN4 and that these results significantly extend previous NMR studies of GCN4 dynamics performed using a single static magnetic field of 11.74 T [Bracken, et al., J. Mol. Biol., 1999, 285, 2133-2146] and correlate well with molecular dynamics simulations [Robustelli, et al., J. Chem. Theory Comput., 2013, 9, 5190-5200]. In contrast to the earlier work, data at multiple static fields allows the time scales of internal dynamics of GCN4 to be reliably quantified. Large amplitude dynamic fluctuations in the DNA-binding region have correlation times (τs ≈ 1.4-2.5 ns) consistent with a two-step mechanism in which partially ordered bZip conformations of GCN4 form initial encounter complexes with DNA and then rapidly rearrange to the high affinity state with fully formed basic region recognition helices. PMID:26661739

  16. Re-evaluation of the model-free analysis of fast internal motion in proteins using NMR relaxation.

    Science.gov (United States)

    Frederick, Kendra King; Sharp, Kim A; Warischalk, Nicholas; Wand, A Joshua

    2008-09-25

    NMR spin relaxation retains a central role in the characterization of the fast internal motion of proteins and their complexes. Knowledge of the distribution and amplitude of the motion of amino acid side chains is critical for the interpretation of the dynamical proxy for the residual conformational entropy of proteins, which can potentially contribute significantly to the entropy of protein function. A popular treatment of NMR relaxation phenomena in macromolecules dissolved in liquids is the so-called model-free approach of Lipari and Szabo. The robustness of the model-free approach has recently been strongly criticized, and the remarkable range and structural context of the internal motion of proteins, characterized by such NMR relaxation techniques, attributed to artifacts arising from the model-free treatment, particularly with respect to the symmetry of the underlying motion. We develop an objective quantification of both the spatial and temporal asymmetry of motion and re-examine the foundation of the model-free treatment. Concerns regarding the robustness of the model-free approach to asymmetric motion appear to be generally unwarranted. The generalized order parameter is robustly recovered. The sensitivity of the model-free treatment to asymmetric motion is restricted to the effective correlation time, which is by definition a normalized quantity and not a true time constant and is therefore of much less interest in this context. With renewed confidence in the model-free approach, we then examine the microscopic distribution of side chain motion in the complex between calcium-saturated calmodulin and the calmodulin-binding domain of the endothelial nitric oxide synthase. Deuterium relaxation is used to characterize the motion of methyl groups in the complex. A remarkable range of Lipari-Szabo model-free generalized order parameters is seen, with little correlation with basic structural parameters such as the depth of burial. These results are contrasted with the
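
    For reference, the spectral density underlying the Lipari-Szabo model-free parameters discussed above (the generalized order parameter S2 and the effective internal correlation time τe) is usually written as below, assuming isotropic overall tumbling with correlation time τm; this standard single-internal-motion form is given only for orientation and is not specific to the cited study.

```latex
% Standard Lipari-Szabo model-free spectral density (isotropic overall tumbling),
% with generalized order parameter S^2, overall correlation time \tau_m and
% effective internal correlation time \tau_e.
\[
  J(\omega) = \frac{2}{5}\left[
      \frac{S^{2}\,\tau_m}{1+(\omega\tau_m)^{2}}
    + \frac{\bigl(1-S^{2}\bigr)\,\tau}{1+(\omega\tau)^{2}}
  \right],
  \qquad
  \frac{1}{\tau} = \frac{1}{\tau_m} + \frac{1}{\tau_e}.
\]
```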

  17. Respective Advantages and Disadvantages of Model-based and Model-free Reinforcement Learning in a Robotics Neuro-inspired Cognitive Architecture

    OpenAIRE

    Renaudo, Erwan; Girard, Benoît; Chatila, Raja; Khamassi, Mehdi

    2015-01-01

    Combining model-based and model-free reinforcement learning systems in robotic cognitive architectures appears as a promising direction to endow artificial agents with flexibility and decisional autonomy close to that of mammals. In particular, it could enable robots to build an internal model of the environment, plan within it in response to detected environmental changes, and avoid the cost and time of planning when the stability of the environment is recognized as enablin...

  18. Development of model-free analysis method on quasi-elastic neutron scattering and application to liquid water

    International Nuclear Information System (INIS)

    In general, the analysis of quasi-elastic neutron scattering (QENS) spectra requires some mathematical model in the process, and hence the obtained result is model dependent. Model-dependent analysis may lead to misunderstandings caused by inappropriate initial models, or may miss an unexpected relaxation phenomenon. We have developed an analysis method for processing QENS data without a specific model, which we call mode-distribution analysis. In this method, we suppose that all modes can be described as combinations of relaxations based on the exponential law. With this method, we can obtain a distribution function B(Q,Γ), which we call the mode-distribution function, representing the number of relaxation modes and the distributions of the relaxation times of the modes. We report the first application to experimental data for liquid water. In addition to the two known modes, the existence of a relaxation mode of water molecules with an intermediate time scale has been discovered. (author)
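
    Read literally, the decomposition described above expresses the measured relaxation as a continuous superposition of exponential modes weighted by B(Q,Γ). One natural way to write this, with normalization conventions assumed here for illustration rather than taken from the cited work, is:

```latex
% Mode-distribution reading of the abstract: the intermediate scattering function as a
% continuous superposition of exponential modes with rate Gamma, weighted by B(Q,Gamma);
% the corresponding spectrum is a superposition of Lorentzians. Conventions assumed.
\[
  I(Q,t) \;=\; \int_{0}^{\infty} B(Q,\Gamma)\, e^{-\Gamma t}\, d\Gamma ,
  \qquad
  S(Q,\omega) \;=\; \frac{1}{\pi}\int_{0}^{\infty}
      B(Q,\Gamma)\,\frac{\Gamma}{\Gamma^{2}+\omega^{2}}\, d\Gamma .
\]
```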

  19. Connectivity concordance mapping: a new tool for model-free analysis of fMRI data of the human brain

    Directory of Open Access Journals (Sweden)

    Gabriele Lohmann

    2012-03-01

    Functional magnetic resonance data acquired in a task-absent condition ("resting state") require new data analysis techniques that do not depend on an activation model. Here, we propose a new analysis method called Connectivity Concordance Mapping (CCM). The main idea is to assign a label to each voxel based on the reproducibility of its whole-brain pattern of connectivity. Specifically, we compute the correlations across measurements of each voxel's correlation-based functional connectivity map, resulting in a voxelwise map of concordance values. Regions of high inter-scan concordance can be assumed to be functionally consistent, and may thus be of specific interest for further analysis. Here we present two fMRI studies to test the algorithm. The first is an eyes-open/eyes-closed paradigm designed to highlight the potential of the method in a relatively simple state-dependent domain. The second study is a longitudinal repeated measurement of a patient following stroke. Longitudinal clinical studies such as this may represent the most interesting domain of applications for this algorithm, as it provides an exploratory means to identify changes in connectivity, such as those during post-stroke recovery.
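
    A minimal sketch of the concordance computation described above is given below; using the mean of pairwise Pearson correlations between a voxel's per-scan connectivity maps is one illustrative choice of concordance measure, not necessarily the exact statistic of the cited method.

```python
# Connectivity Concordance Mapping sketch: per-voxel whole-brain connectivity maps
# are computed for each scan, then correlated across scans; the mean cross-scan
# correlation is that voxel's concordance value.
import numpy as np

def concordance_map(scans):
    """scans: list of (n_timepoints, n_voxels) arrays from repeated measurements."""
    # Per-scan voxel-by-voxel connectivity maps (columns are voxel time courses).
    conn = [np.corrcoef(scan, rowvar=False) for scan in scans]
    n_vox = conn[0].shape[0]
    concordance = np.zeros(n_vox)
    for v in range(n_vox):
        maps = np.array([c[v] for c in conn])   # this voxel's connectivity map in each scan
        cc = np.corrcoef(maps)                  # scan-by-scan agreement of those maps
        iu = np.triu_indices(len(scans), k=1)
        concordance[v] = cc[iu].mean()
    return concordance
```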

  20. Model-free distributed learning

    Science.gov (United States)

    Dembo, Amir; Kailath, Thomas

    1990-01-01

    Model-free learning for synchronous and asynchronous quasi-static networks is presented. The network weights are continuously perturbed, while the time-varying performance index is measured and correlated with the perturbation signals; the correlation output determines the changes in the weights. The perturbation may be either via noise sources or orthogonal signals. The invariance to detailed network structure mitigates large variability between supposedly identical networks as well as implementation defects. This local, regular, and completely distributed mechanism requires no central control and involves only a few global signals. Thus it allows for integrated on-chip learning in large analog and optical networks.
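
    In spirit, the perturb-measure-correlate loop described above behaves like a simultaneous-perturbation gradient estimate; the discrete-time sketch below conveys the idea only, since the cited work concerns continuous-time analog and optical networks, and the step sizes and perturbation form are illustrative assumptions.

```python
# Discrete-time sketch in the spirit of perturbation/correlation learning: perturb
# all weights at once, measure the change in the performance index J, and correlate
# it with the perturbation to form the update (a simultaneous-perturbation estimate).
import numpy as np

def perturbation_learning(J, w, n_steps=1000, a=0.05, c=0.01, seed=0):
    """J: callable returning the scalar performance index to be minimized."""
    rng = np.random.default_rng(seed)
    w = np.array(w, dtype=float)
    for _ in range(n_steps):
        delta = rng.choice([-1.0, 1.0], size=w.shape)        # random +/-1 perturbation
        g_hat = (J(w + c * delta) - J(w - c * delta)) / (2 * c) * delta
        w -= a * g_hat                                        # correlation-driven update
    return w

# Example: minimizing a simple quadratic index without any model of its gradient.
w_opt = perturbation_learning(lambda w: np.sum((w - 3.0) ** 2), w=np.zeros(4))
```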

  1. Can model-free reinforcement learning explain deontological moral judgments?

    Science.gov (United States)

    Ayars, Alisabeth

    2016-05-01

    Dual-systems frameworks propose that moral judgments are derived from both an immediate emotional response, and controlled/rational cognition. Recently Cushman (2013) proposed a new dual-system theory based on model-free and model-based reinforcement learning. Model-free learning attaches values to actions based on their history of reward and punishment, and explains some deontological, non-utilitarian judgments. Model-based learning involves the construction of a causal model of the world and allows for far-sighted planning; this form of learning fits well with utilitarian considerations that seek to maximize certain kinds of outcomes. I present three concerns regarding the use of model-free reinforcement learning to explain deontological moral judgment. First, many actions that humans find aversive from model-free learning are not judged to be morally wrong. Moral judgment must require something in addition to model-free learning. Second, there is a dearth of evidence for central predictions of the reinforcement account, e.g., that people with different reinforcement histories will, all else equal, make different moral judgments. Finally, accounting for the effect of intention within the framework requires certain assumptions which lack support. These challenges are reasonable foci for future empirical/theoretical work on the model-free/model-based framework. PMID:26918742

  2. Is there any correlation between model-based perfusion parameters and model-free parameters of time-signal intensity curve on dynamic contrast enhanced MRI in breast cancer patients?

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)

    2014-05-15

    To find out any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and to evaluate correlations between perfusion parameters and histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analysing the time-signal intensity curve without a complicated acquisition process for the model-based parameters. (orig.)

  3. Is there any correlation between model-based perfusion parameters and model-free parameters of time-signal intensity curve on dynamic contrast enhanced MRI in breast cancer patients?

    International Nuclear Information System (INIS)

    To find out any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and to evaluate correlations between perfusion parameters and histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analysing the time-signal intensity curve without a complicated acquisition process for the model-based parameters. (orig.)

  4. Internal motions in yeast phenylalanine transfer RNA from 13C NMR relaxation rates of modified base methyl groups: a model-free approach

    International Nuclear Information System (INIS)

    Internal motions at specific locations through yeast phenylalanine tRNA were measured by using nucleic acid biosynthetically enriched in 13C at modified base methyl groups. Carbon NMR spectra of isotopically enriched tRNA(Phe) reveal 12 individual peaks for 13 of the 14 methyl groups known to be present. The two methyls of N2,N2-dimethylguanosine (m2,2G-26) have indistinguishable resonances, whereas the fourteenth methyl, bound to ring carbon-11 of the hypermodified nucleoside 3' adjacent to the anticodon, wyosine (Y-37), does not come from the [methyl-13C]methionine substrate. Assignments to individual nucleosides within the tRNA were made on the basis of chemical shifts of the mononucleosides and correlation of 13C resonances with proton NMR chemical shifts via two-dimensional heteronuclear proton-carbon correlation spectroscopy. Values of 13C longitudinal relaxation (T1) and the nuclear Overhauser enhancements (NOE) were determined at 22.5, 75.5, and 118 MHz for tRNA(Phe) in a physiological buffer solution with 10 mM MgCl2, at 22°C. These data were used to extract two physical parameters that define the system with regard to fast internal motion: the generalized order parameters (S2) and effective correlation times (τe) for internal motion of the C-H internuclear vectors. For all methyl groups the generalized order parameter varied from 0.057 to 0.108, compared with the value of 0.111 predicted for a rapidly spinning methyl group rigidly mounted on a spherical macromolecule. Values of τe ranged from 4 to 16 ps, generally shorter times than measured in other work for amino acid methyl groups in several proteins. Somewhat surprising was the finding that the two methyl esters terminating the Y-37 side chain have order parameters similar to those of other methyls in tRNA and only 25% less than that for a methyl directly bonded to the base

  5. Model-free learning from demonstration

    OpenAIRE

    Billing, Erik; Hellström, Thomas; Janlert, Lars Erik

    2010-01-01

    A novel robot learning algorithm called Predictive Sequence Learning (PSL) is presented and evaluated. PSL is a model-free prediction algorithm inspired by the dynamic temporal difference algorithm S-Learning. While S-Learning has previously been applied as a reinforcement learning algorithm for robots, PSL is here applied to a Learning from Demonstration problem. The proposed algorithm is evaluated on four tasks using a Khepera II robot. PSL builds a model from demonstrated data which is use...

  6. Model-Free Adaptive Control Algorithm with Data Dropout Compensation

    Directory of Open Access Journals (Sweden)

    Xuhui Bu

    2012-01-01

    Full Text Available The convergence of the model-free adaptive control (MFAC) algorithm can be guaranteed when the system is subject to measurement data dropout, but the convergence of the system output slows down as the dropout rate increases. This paper proposes an MFAC algorithm with data compensation. The missing data are first estimated using the dynamical linearization method, and the estimated value is then used to update the control input. The convergence analysis of the proposed MFAC algorithm is given, and its effectiveness is also validated by simulations. It is shown that the proposed algorithm can compensate for the effect of data dropout, and better output performance can be obtained.
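    A hedged sketch of the idea, assuming the standard compact-form dynamical linearization used in MFAC: when a measurement is dropped, the lost output is replaced by a one-step prediction built from the estimated pseudo-partial derivative. The plant, gains and dropout pattern below are illustrative placeholders, not the paper's simulation setup.

```python
import numpy as np

def mfac_with_dropout(plant, y_ref, received, rho=0.6, lam=1.0, eta=0.5, mu=1.0, phi0=1.0):
    """Compact-form MFAC; lost measurements are replaced by one-step predictions."""
    N = len(y_ref)
    u = np.zeros(N); y = np.zeros(N); y_used = np.zeros(N); phi = np.full(N, phi0)
    for k in range(1, N - 1):
        du_prev = u[k - 1] - u[k - 2] if k >= 2 else 0.0
        # True plant output at step k (the measurement may be lost in transmission).
        y[k] = plant(u[k - 1], y[k - 1])
        # Dropout compensation: y_hat(k) = y(k-1) + phi * du(k-1) from dynamical linearization.
        y_used[k] = y[k] if received[k] else y_used[k - 1] + phi[k - 1] * du_prev
        # Pseudo-partial-derivative (PPD) estimation from (possibly estimated) outputs.
        dy = y_used[k] - y_used[k - 1]
        phi[k] = phi[k - 1] + eta * du_prev / (mu + du_prev ** 2) * (dy - phi[k - 1] * du_prev)
        # Model-free adaptive control law driving the output toward the reference.
        u[k] = u[k - 1] + rho * phi[k] / (lam + phi[k] ** 2) * (y_ref[k + 1] - y_used[k])
    return y

# Toy first-order plant with a step reference and roughly 20% random measurement dropout.
rng = np.random.default_rng(0)
plant = lambda u_prev, y_prev: 0.7 * y_prev + 0.4 * u_prev
received = rng.random(200) > 0.2
output = mfac_with_dropout(plant, np.ones(200), received)
print(round(float(output[-2]), 3))   # close to 1.0
```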

  7. A Survey on Applications of Model-Free Strategy Learning in Cognitive Wireless Networks

    OpenAIRE

    Wang, Wenbo; Kwasinski, Andres; Niyato, Dusit; Han, Zhu

    2015-01-01

    Model-free learning has been considered as an efficient tool for designing control mechanisms when the model of the system environment or the interaction between the decision-making entities is not available as a priori knowledge. With model-free learning, the decision-making entities adapt their behaviors based on the reinforcement from their interaction with the environment and are able to (implicitly) build the understanding of the system through trial-and-error mechanisms. Such characteri...

  8. Model-free constrained data-driven iterative reference input tuning algorithm with experimental validation

    Science.gov (United States)

    Radac, Mircea-Bogdan; Precup, Radu-Emil

    2016-05-01

    This paper presents the design and experimental validation of a new model-free data-driven iterative reference input tuning (IRIT) algorithm that solves a reference trajectory tracking problem as an optimization problem with control signal saturation constraints and control signal rate constraints. The IRIT algorithm design employs an experiment-based stochastic search algorithm to exploit the advantages of iterative learning control. The experimental results validate the IRIT algorithm applied to a non-linear aerodynamic position control system. The results show that the IRIT algorithm offers a significant control system performance improvement with few iterations and experiments conducted on the real-world process, together with model-free parameter tuning.

  9. Determination of pyrolysis characteristics and kinetics of palm kernel shell using TGA–FTIR and model-free integral methods

    International Nuclear Information System (INIS)

    Highlights: • Model-free integral kinetics and analytical TGA–FTIR were applied to the pyrolysis of PKS. • The pyrolysis mechanism of PKS was elaborated. • Thermal stability was established: lignin > cellulose > xylan. • Detailed compositions of the volatiles from PKS pyrolysis were determined. • The interaction of the three biomass components led to the fluctuation of activation energy in PKS pyrolysis. - Abstract: Palm kernel shell (PKS) from palm oil production is a potential biomass source for bio-energy production. A fundamental understanding of PKS pyrolysis behavior and kinetics is essential to its efficient thermochemical conversion. The thermal degradation profile in derivative thermogravimetry (DTG) analysis showed two significant mass-loss peaks, mainly related to the decomposition of hemicellulose and cellulose respectively. This characteristic differs from that of other biomass (e.g. wheat straw and corn stover), which presents just one peak or one peak accompanied by an extra "shoulder" peak (e.g. wheat straw). According to the Fourier transform infrared spectrometry (FTIR) analysis, the prominent volatile components generated by the pyrolysis of PKS were CO2 (2400–2250 cm−1 and 586–726 cm−1), aldehydes, ketones, organic acids (1900–1650 cm−1), and alkanes, phenols (1475–1000 cm−1). The activation energy as a function of the conversion rate was estimated by two model-free integral methods, the Flynn–Wall–Ozawa (FWO) and Kissinger–Akahira–Sunose (KAS) methods, at different heating rates. The fluctuation of activation energy can be interpreted as a result of interactive reactions related to cellulose, hemicellulose and lignin degradation occurring in the pyrolysis process. Based on TGA–FTIR analysis and the model-free integral kinetics method, the pyrolysis mechanism of PKS is elaborated in this paper
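    A minimal sketch of the Flynn–Wall–Ozawa iso-conversional estimate referred to above: for each conversion level, ln(β) is regressed on 1/T across heating rates, and the slope gives the activation energy via Doyle's approximation. The heating rates and temperatures below are synthetic placeholders, not the PKS measurements.

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def fwo_activation_energy(betas, T_alpha):
    """betas: heating rates (K/min); T_alpha: (n_betas, n_alphas) temperatures (K)
    at which each conversion level is reached. Returns Ea (J/mol) per conversion."""
    ln_beta = np.log(betas)
    Ea = np.empty(T_alpha.shape[1])
    for j in range(T_alpha.shape[1]):
        slope, _ = np.polyfit(1.0 / T_alpha[:, j], ln_beta, 1)   # ln(beta) vs 1/T
        Ea[j] = -slope * R / 1.052                               # FWO (Doyle) approximation
    return Ea

# Synthetic example: three heating rates, three conversion levels.
betas = np.array([5.0, 10.0, 20.0])
T_alpha = np.array([[550.0, 580.0, 610.0],
                    [560.0, 592.0, 624.0],
                    [571.0, 604.0, 638.0]])
print(np.round(fwo_activation_energy(betas, T_alpha) / 1000.0, 1))   # kJ/mol per conversion level
```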

  10. Model-free 3D face shape reconstruction from video sequences

    NARCIS (Netherlands)

    Dam, van Chris; Veldhuis, Raymond; Spreeuwers, Luuk

    2013-01-01

    In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence much video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm based on 2D lan

  11. Totally Model-Free Learned Skillful Coping

    Science.gov (United States)

    Dreyfus, Stuart E.

    2004-01-01

    The author proposes a neural-network-based explanation of how a brain might acquire intuitive expertise. The explanation is intended merely to be suggestive and lacks many complexities found in even lower animal brains. Yet significantly, even this simplified brain model is capable of explaining the acquisition of simple skills without developing…

  12. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most of...

  13. A novel model-free approach for reconstruction of time-delayed gene regulatory networks

    Institute of Scientific and Technical Information of China (English)

    JIANG; Wei; LI; Xia; GUO; Zheng; LI; Chuanxing; WANG; Lihong

    2006-01-01

    Reconstruction of genetic networks is one of the key scientific challenges in functional genomics. This paper describes a novel approach for addressing the regulatory dependencies between genes whose activities can be delayed by multiple units of time. The aim of the proposed approach, termed TdGRN (time-delayed gene regulatory networking), is to reverse-engineer the dynamic mechanisms of gene regulation, which is realized by identifying the time-delayed gene regulations through supervised decision-tree analysis of the newly designed time-delayed gene expression matrix, derived from the original time-series microarray data. A permutation technique is used to determine the statistical classification threshold of a tree, from which a gene regulatory rule(s) is extracted. The proposed TdGRN is a model-free approach that attempts to learn the underlying regulatory rules without relying on any model assumptions. Compared with model-based approaches, it has several significant advantages: it requires neither any arbitrary threshold for discretization of gene transcriptional values nor the definition of the number of regulators (k). We have applied this novel method to publicly available data for the budding yeast cell cycle. The numerical results demonstrate that most of the identified time-delayed gene regulations are supported by current biological knowledge.

  14. Model-free adaptive control of advanced power plants

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  15. A Model-Free Method for Structural Change Detection in Multivariate Nonlinear Time Series

    Institute of Scientific and Technical Information of China (English)

    孙青华; 张世英; 梁雄健

    2003-01-01

    In this paper, we apply the recursive genetic programming (RGP) approach to the cognition of a system, and then proceed to a detection procedure for structural changes in the system whose components are of long memory. This approach is adaptive and model-free, and it can simulate the individual activities of the system's participants; it therefore has a strong ability to recognize the operating mechanism of the system. Based on the previous cognition of the system, a test statistic is developed for the detection of structural changes in the system. Furthermore, an example is presented to illustrate the validity and practical value of the proposed method.

  16. Policy improvement by a model-free Dyna architecture.

    Science.gov (United States)

    Hwang, Kao-Shing; Lo, Chia-Yue

    2013-05-01

    The objective of this paper is to accelerate the process of policy improvement in reinforcement learning. The proposed Dyna-style system combines two learning schemes, one of which utilizes a temporal difference method for direct learning; the other uses relative values for indirect learning in planning between two successive direct learning cycles. Instead of establishing a complicated world model, the approach introduces a simple predictor of average rewards into the actor-critic architecture in the simulation (planning) mode. The relative value of a state, defined as the accumulated difference between immediate reward and average reward, is used to steer the improvement process in the right direction. The proposed learning scheme is applied to control a pendulum system tracking a desired trajectory to demonstrate its adaptability and robustness. Through reinforcement signals from the environment, the system takes the appropriate action to drive an unknown dynamic system to track desired outputs in a few learning cycles. Comparisons are made between the proposed model-free method, a connectionist adaptive heuristic critic, and an advanced Dyna-Q learning method in labyrinth exploration experiments. The proposed method outperforms its counterparts in terms of elapsed time and convergence rate. PMID:24808427

  17. A generalized mean-squared displacement from inelastic fixed window scans of incoherent neutron scattering as a model-free indicator of anomalous diffusion confinement

    International Nuclear Information System (INIS)

    Elastic fixed window scans of incoherent neutron scattering are an established and frequently employed method to study dynamical changes, usually over a broad temperature range or during a process such as a conformational change in the sample. In particular, the apparent mean-squared displacement can be extracted via a model-free analysis based on a solid physical interpretation as an effective amplitude of molecular motions. Here, we provide a new account of elastic and inelastic fixed window scans, defining a generalized mean-squared displacement for all fixed energy transfers. We show that this generalized mean-squared displacement in principle contains all information on the real mean-squared displacement accessible in the instrumental time window. The derived formula provides a clear understanding of the effects of instrumental resolution on the apparent mean-squared displacement. Finally, we show that the generalized mean-squared displacement can be used as a model-free indicator of confinement effects within the instrumental time window. (authors)
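    For context, a common model-free relation (not the generalized, energy-resolved formula introduced in this record, which extends it) extracts the apparent mean-squared displacement from the elastic intensity under the Gaussian approximation; conventions for the numerical factor vary between instruments and authors.

```latex
% Standard elastic fixed-window-scan relation (Gaussian approximation), quoted as background.
\[
  \frac{S_{\mathrm{el}}(Q,T)}{S_{\mathrm{el}}(Q,T_{0})}
  \;\approx\;
  \exp\!\left(-\tfrac{1}{3}\,Q^{2}\,\langle u^{2}\rangle(T)\right)
  \qquad\Longrightarrow\qquad
  \langle u^{2}\rangle(T)\;\approx\;-3\,\frac{\partial \ln S_{\mathrm{el}}(Q,T)}{\partial (Q^{2})}.
\]
```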

  18. Optimisation of NMR dynamic models I. Minimisation algorithms and their performance within the model-free and Brownian rotational diffusion spaces

    International Nuclear Information System (INIS)

    The key to obtaining the model-free description of the dynamics of a macromolecule is the optimisation of the model-free and Brownian rotational diffusion parameters using the collected R1, R2 and steady-state NOE relaxation data. The problem of optimising the chi-squared value is often assumed to be trivial, however, the long chain of dependencies required for its calculation complicates the model-free chi-squared space. Convolutions are induced by the Lorentzian form of the spectral density functions, the linear recombinations of certain spectral density values to obtain the relaxation rates, the calculation of the NOE using the ratio of two of these rates, and finally the quadratic form of the chi-squared equation itself. Two major topological features of the model-free space complicate optimisation. The first is a long, shallow valley which commences at infinite correlation times and gradually approaches the minimum. The most severe convolution occurs for motions on two timescales in which the minimum is often located at the end of a long, deep, curved tunnel or multidimensional valley through the space. A large number of optimisation algorithms will be investigated and their performance compared to determine which techniques are suitable for use in model-free analysis. Local optimisation algorithms will be shown to be sufficient for minimisation not only within the model-free space but also for the minimisation of the Brownian rotational diffusion tensor. In addition the performance of the programs Modelfree and Dasha are investigated. A number of model-free optimisation failures were identified: the inability to slide along the limits, the singular matrix failure of the Levenberg-Marquardt minimisation algorithm, the low precision of both programs, and a bug in Modelfree. Significantly, the singular matrix failure of the Levenberg-Marquardt algorithm occurs when internal correlation times are undefined and is greatly amplified in model-free analysis by both

  19. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  20. A model-free, fully automated baseline-removal method for Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Foist, Rod B; Okuda, Kadek; Ivanov, André; Turner, Robin F B

    2011-01-01

    We present here a fully automated spectral baseline-removal procedure. The method uses a large-window moving average to estimate the baseline; thus, it is a model-free approach with a peak-stripping method to remove spectral peaks. After processing, the baseline-corrected spectrum should yield a flat baseline, and this endpoint can be verified with the χ²-statistic. The approach provides for multiple passes or iterations, based on a given χ²-statistic for convergence. If the baseline is acceptably flat given the χ²-statistic after the first pass at correction, the problem is solved. If not, the non-flat baseline (i.e., after the first effort or first pass at correction) should provide an indication of where the first pass caused too much or too little baseline to be subtracted. The second pass thus permits one to compensate for the errors incurred on the first pass. One can therefore use a very large window so as to avoid affecting spectral peaks, even if the window is so large that the baseline is inaccurately removed, because baseline-correction errors can be assessed and compensated for on subsequent passes. We start with the largest possible window and gradually reduce it until acceptable baseline correction based on the χ² statistic is achieved. Results, obtained on both simulated and measured Raman data, are presented and discussed. PMID:21211157
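    A minimal sketch of a moving-average baseline estimate with peak stripping in the spirit described above; the single fixed window, the iteration count and the toy spectrum are simplifying assumptions, and the χ²-based window schedule of the paper is not implemented.

```python
import numpy as np

def moving_average(x, window):
    # Large-window moving average with edge padding (a simplifying choice).
    pad = window // 2
    xp = np.pad(x, pad, mode="edge")
    return np.convolve(xp, np.ones(window) / window, mode="valid")

def strip_baseline(spectrum, window=201, n_iter=50):
    """Return the baseline-corrected spectrum; `stripped` converges to the baseline."""
    stripped = spectrum.astype(float).copy()
    for _ in range(n_iter):
        smoothed = moving_average(stripped, window)
        # Peak stripping: clip the signal down to the smoothed estimate so that
        # peaks do not inflate the next baseline estimate.
        stripped = np.minimum(stripped, smoothed)
    return spectrum - stripped

# Toy Raman-like spectrum: two Gaussian peaks on a slowly varying baseline.
x = np.linspace(0.0, 1.0, 1000)
peaks = np.exp(-((x - 0.3) / 0.01) ** 2) + 0.5 * np.exp(-((x - 0.7) / 0.02) ** 2)
baseline = 0.5 + 0.8 * x - 0.4 * x ** 2
corrected = strip_baseline(peaks + baseline)
print(round(float(corrected.max()), 2))   # roughly the height of the tallest peak
```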

  1. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one insures that a product works as intended, and contains no flaws.

  2. High-Frequency and Model-Free Volatility Estimators

    OpenAIRE

    Robert Ślepaczuk; Grzegorz Zakrzewski

    2009-01-01

    This paper focuses on volatility of financial markets, which is one of the most important issues in finance, especially with regard to modeling high-frequency data. Risk management, asset pricing and option valuation techniques are the areas where the concept of volatility estimators (consistent, unbiased and the most efficient) is of crucial concern. Our intention was to find the best estimator of true volatility taking into account the latest investigations in finance literature. Basing on ...

  3. Isothermal Kinetics of the Pentlandite Exsolution from mss/Pyrrhotite Using Model-Free Method

    Institute of Scientific and Technical Information of China (English)

    WANG Haipeng

    2006-01-01

    The pentlandite exsolution from monosulfide solid solution (mss)/pyrrhotite is a complex multi-step process, including nucleation, new phase growth and atomic diffusion, and lamellae coarsening. Some of these steps occur in sequence, others simultaneously. This makes its kinetic analysis difficult, as the mechanisms cannot be elucidated in detail. In mineral reactions of this type, the true functional form of the reaction model is almost never known, and the Arrhenius parameters determined by the classic Avrami method are skewed to compensate for errors in the model. The model-free kinetics allows a universal determination of the activation energy. A kinetic study of pentlandite exsolution from mss/pyrrhotite was performed over the temperature range 200 to 300℃. For mss/pyrrhotite with bulk composition (Fe0.77Ni0.19)S, the activation energy varies with the extent of reaction during the course of the solid-state reaction. The surrounding environment of the reactant atoms affects the atoms' activity and more or less accounts for the changes of activation energy Ea.

  4. Model-Free Adaptive Fuzzy Sliding Mode Controller Optimized by Particle Swarm for Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Amin Jalali

    2013-05-01

    Full Text Available The main purpose of this paper is to design a suitable control scheme that confronts the uncertainties in a robot. The sliding mode controller (SMC) is one of the most important and powerful nonlinear robust controllers and has been applied to many non-linear systems. However, this controller has some intrinsic drawbacks, namely the chattering phenomenon, the equivalent dynamic formulation, and sensitivity to noise. This paper focuses on applying artificial intelligence integrated with sliding mode control theory. The proposed adaptive fuzzy sliding mode controller optimized by a particle swarm algorithm (AFSMC-PSO) is a Mamdani error-based fuzzy logic controller (FLS) with 7 rules, integrated with the sliding mode framework to provide adaptation in order to eliminate the high-frequency oscillation (chattering) and to adjust the linear sliding surface slope in the presence of many different disturbances; the best coefficients for the sliding surface were found by offline tuning with Particle Swarm Optimization (PSO). A second fuzzy logic controller replaces the equivalent dynamic part, which is the main step in making the controller model-free: it compensates for the unknown system dynamics and obtains the desired control performance without exact information about the mathematical formulation of the model.

  5. ANALYSIS-BASED SPARSE RECONSTRUCTION WITH SYNTHESIS-BASED SOLVERS

    OpenAIRE

    Cleju, Nicolae; Jafari, Maria,; Plumbley, Mark D.

    2012-01-01

    Analysis based reconstruction has recently been introduced as an alternative to the well-known synthesis sparsity model used in a variety of signal processing areas. In this paper we convert the analysis exact-sparse reconstruction problem to an equivalent synthesis recovery problem with a set of additional constraints. We are therefore able to use existing synthesis-based algorithms for analysis-based exact-sparse recovery. We call this the Analysis-By-Synthesis (ABS) approach. We evaluate o...

  6. A generic model-free approach for lithium-ion battery health management

    International Nuclear Information System (INIS)

    Highlights: • A new ANN based battery model is developed and integrated with the Kalman filtering technique for battery health management. • The developed ANN based model can be updated along with the Kalman filtering process at the battery operating stage. • The developed model is adaptive and eliminates the dependency of expensive empirical battery models. • The developed approach enables accurate estimations of both short term SoC and long term capacity. • Experimental results demonstrated the efficacy of the developed battery health state estimation approach. - Abstract: Accurate estimation of the state-of-charge (SoC) and state-of-health (SoH) for an operating battery system, as a critical task for battery health management, greatly depends on the validity and generalizability of battery models. Due to the variability and uncertainties involved in battery design, manufacturing and operation, developing a generally applicable battery model remains as a grand challenge for battery health management. To eliminate the dependency of SoC and SoH estimation on battery physical models, this paper presents a generic data-driven approach that integrates an artificial neural network with a dual extended Kalman filter (DEKF) algorithm for lithium-ion battery health management. The artificial neural network is first trained offline to model the battery terminal voltages and the DEKF algorithm can then be employed online for SoC and SoH estimation, where voltage outputs from the trained artificial neural network model are used in DEKF state–space equations to replace the required battery models. The trained neural network model can be adaptively updated to account for the battery to battery variability, thus ensuring good SoC and SoH estimation accuracy. Experimental results are used to demonstrate the effectiveness of the developed model-free approach for battery health management

  7. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  8. Feature-based sentiment analysis with ontologies

    OpenAIRE

    Taner, Berk

    2011-01-01

    Sentiment analysis is a topic that many researchers work on. In recent years, new research directions under sentiment analysis appeared. Feature-based sentiment analysis is one such topic that deals not only with finding sentiment in a sentence but providing a more detailed analysis on a given domain. In the beginning researchers focused on commercial products and manually generated lists of features for a product. Then they tried to generate a feature-based approach to attach sentiments to th...

  9. Model-free adaptive control optimization using a chaotic particle swarm approach

    International Nuclear Information System (INIS)

    It is well known that conventional control theories are widely suited for applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying the conventional controller design methodologies. Despite the difficulty in achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts and an optimization procedure using a Particle Swarm Optimization (PSO) approach with constriction coefficient and Henon chaotic sequences (CPSOH) is presented in this paper. PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking. PSO models the exploration of a problem space by a population of particles, each with a randomized velocity associated with it, which moves through the space of the problem. Since chaotic mapping enjoys certainty, ergodicity and the stochastic property, the proposed CPSOH employs a chaotic map, which adds some flexibility to particle movements in each iteration. The chaotic sequences also allow exploration at early stages and exploitation at later stages of the CPSOH search procedure. The motivation for the CPSOH approach is to overcome the limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant has different gains over the operational range and the controller is tuned by trial and error by the user. Numerical results of the MFLAC with CPSOH
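    A hedged sketch of the chaotic-PSO ingredient only (not the MFLAC controller itself): the uniform random coefficients of a constriction-type PSO are replaced by a Henon chaotic sequence rescaled to [0, 1]. The test function, swarm size and map parameters are illustrative choices, not the tuning problem of the paper.

```python
import numpy as np

def henon_sequence(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
    """Henon map iterates, rescaled onto [0, 1] for use as PSO coefficients."""
    seq = np.empty(n)
    x, y = x0, y0
    for i in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        seq[i] = x
    return (seq - seq.min()) / (seq.max() - seq.min())

def chaotic_pso(f, dim=2, n_particles=20, iters=200, chi=0.729, c1=2.05, c2=2.05, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.apply_along_axis(f, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()
    # Pre-generate the chaotic coefficients that replace the usual uniform r1, r2.
    chaos = henon_sequence(2 * iters * n_particles * dim).reshape(iters, 2, n_particles, dim)
    for t in range(iters):
        r1, r2 = chaos[t]
        vel = chi * (vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos))
        pos = pos + vel
        vals = np.apply_along_axis(f, 1, pos)
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

# Toy objective: the sphere function, whose minimum is at the origin.
print(np.round(chaotic_pso(lambda x: float(np.sum(x ** 2))), 3))
```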

  10. A Dual Model-Free Control of Underactuated Mechanical Systems, Application to The Inertia Wheel Inverted Pendulum

    OpenAIRE

    Andary, Sébastien; Chemori, Ahmed; Benoit, Michel; Sallantin, Jean

    2012-01-01

    This paper deals with a new method allowing a recent model-free control technique to deal with underactuated mechanical systems for stable limit cycle generation. A model-free controller is designed in order to track some parametrized reference trajectories. A second model-free controller is then designed, using the trajectories' parameters as control inputs, in order to stabilize the internal dynamics. The proposed method is applied to a real underactuated mechanical system: the inertia wheel inver...

  11. Pareto analysis based on records

    CERN Document Server

    Doostparast, M

    2012-01-01

    Estimation of the parameters of an exponential distribution based on record data has been treated by Samaniego and Whitaker (1986) and Doostparast (2009). Recently, Doostparast and Balakrishnan (2011) obtained optimal confidence intervals as well as uniformly most powerful tests for one- and two-sided hypotheses concerning location and scale parameters based on record data from a two-parameter exponential model. In this paper, we derive optimal statistical procedures including point and interval estimation as well as most powerful tests based on record data from a two-parameter Pareto model. For illustrative purposes, a data set on the annual wages of a sample of production-line workers in a large industrial firm is analyzed using the proposed procedures.

  12. TEXTURE ANALYSIS BASED IRIS RECOGNITION

    OpenAIRE

    GÜRKAN, Güray; AKAN, Aydın

    2012-01-01

    In this paper, we present a new method for personal identification based on iris patterns. The method is composed of iris image acquisition, image preprocessing, feature extraction and finally decision stages. Normalized iris images are vertically log-sampled and filtered by circular symmetric Gabor filters. The outputs of the filters are windowed, and the mean absolute deviation of the pixels in each window is calculated as the feature vector. The proposed method has the desired properties of an iris reco...

  13. ROAn, a ROOT based Analysis Framework

    CERN Document Server

    Lauf, Thomas

    2013-01-01

    The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform data analysis on data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensors developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT's features like run-time type information and reflection. ROAn provides an analysis program which allows to perform configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps which produce analysis results. The dependency between results and thus the order of calculation is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and...

  14. Excel-Based Business Analysis

    CERN Document Server

    Anari, Ali

    2012-01-01

    ai"The trend is your friend"is a practical principle often used by business managers, who seek to forecast future sales, expenditures, and profitability in order to make production and other operational decisions. The problem is how best to identify and discover business trends and utilize trend information for attaining objectives of firms.This book contains an Excel-based solution to this problem, applying principles of the authors' "profit system model" of the firm that enables forecasts of trends in sales, expenditures, profits and other business variables. The program,

  15. Statistical analysis of life history calendar data

    OpenAIRE

    Eerola, Mervi; Helske, Satu

    2016-01-01

    The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare model-based probabilistic event history analysis and the model-free data mining method, sequence analysis. In event history analysis, we estimate, instead of transition hazards, the cumulative prediction probabilities of life events over the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and cont...

  16. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  17. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  18. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn. The need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses more precise in the regulatory guides should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)

  19. Curvelet Based Offline Analysis of SEM Images

    OpenAIRE

    Shirazi, Syed Hamad; Haq, Nuhman ul; Hayat, Khizar; Naz, Saeeda; Haque, Ihsan ul

    2014-01-01

    Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and effort. This paper presents an image processing based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal, in order to avoid false edges. For texture analysis, the proposed method ...

  20. Analysis of a Chaotic Memristor Based Oscillator

    Directory of Open Access Journals (Sweden)

    F. Setoudeh

    2014-01-01

    Full Text Available A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and a new chaos analysis methodology based on the energy distribution is also presented using the Discrete Wavelet Transform (DWT). Then, using Advanced Design System (ADS) software, the implementation of the chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of the paper.

  1. Analysis of a Chaotic Memristor Based Oscillator

    OpenAIRE

    F. Setoudeh; Khaki Sedigh, A.; Dousti, M

    2014-01-01

    A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and also a new chaos analysis methodology based on the energy distribution is presented using the Discrete Wavelet Transform (DWT). Then, using Advanced Design System (ADS) software, implementation of the chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of t...

  2. A Model-Free No-arbitrage Price Bound for Variance Options

    Energy Technology Data Exchange (ETDEWEB)

    Bonnans, J. Frederic, E-mail: frederic.bonnans@inria.fr [Ecole Polytechnique, INRIA-Saclay (France); Tan Xiaolu, E-mail: xiaolu.tan@polytechnique.edu [Ecole Polytechnique, CMAP (France)

    2013-08-01

    We suggest a numerical approximation for an optimization problem, motivated by its applications in finance to find the model-free no-arbitrage bound of variance options given the marginal distributions of the underlying asset. A first approximation restricts the computation to a bounded domain. Then we propose a gradient projection algorithm together with the finite difference scheme to solve the optimization problem. We prove the general convergence, and derive some convergence rate estimates. Finally, we give some numerical examples to test the efficiency of the algorithm.

  3. Improved Frechet bounds and model-free pricing of multi-asset options

    CERN Document Server

    Tankov, Peter

    2010-01-01

    We compute the improved bounds on the copula of a bivariate random vector when partial information is available, such as the values of the copula on the subset of $[0,1]^2$, or the value of a functional of the copula, monotone with respect to the concordance order. These results are then used to compute model-free bounds on the prices of two-asset options which make use of extra information about the dependence structure, such as the price of another two-asset option.

  4. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    B Yegnanarayana; Suryakanth V Gangashetty

    2011-10-01

    Speech analysis is traditionally performed using short-time analysis to extract features in time and frequency domains. The window size for the analysis is fixed somewhat arbitrarily, mainly to account for the time varying vocal tract system during production. However, speech in its primary mode of excitation is produced due to impulse-like excitation in each glottal cycle. Anchoring the speech analysis around the glottal closure instants (epochs) yields significant benefits for speech analysis. Epoch-based analysis of speech helps not only to segment the speech signals based on speech production characteristics, but also helps in accurate analysis of speech. It enables extraction of important acoustic-phonetic features such as glottal vibrations, formants, instantaneous fundamental frequency, etc. Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, and methods to extract the epoch information are reviewed. Applications of epoch extraction for some speech applications are demonstrated.

  5. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge Emil Borch Laurs; Nielsen, Mads; Lo, Pechin Chien Pau;

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based on...... subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density. The...

  6. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences organized by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems addressed by them. The majority of the research papers focused on quality (24 papers) associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas focused on by existing authors and the gaps they leave, untouched areas of cloud based development can be identified for future research work.

  7. Polyphase Order Analysis Based on Convolutional Approach

    Directory of Open Access Journals (Sweden)

    M. Drutarovsky

    1999-06-01

    Full Text Available The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal, which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of the vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational speed. The paper presents an optimised algorithm for polyphase order analysis based on a non-power-of-two DFT algorithm efficiently implemented by the chirp FFT algorithm. The proposed algorithm decreases the complexity of the digital resampling algorithm, which is the most complex part of the complete order analysis algorithm.
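    A minimal sketch of the angular-resampling step that order analysis rests on, using plain interpolation and an ordinary FFT rather than the polyphase/chirp-FFT machinery of the paper; the run-up profile and sampling parameters are illustrative.

```python
import numpy as np

def order_spectrum(t, vibration, speed_hz, samples_per_rev=64):
    """Resample onto a uniform shaft-angle grid, then take the order-domain spectrum."""
    dt = t[1] - t[0]
    angle = np.cumsum(speed_hz) * dt                          # shaft angle in revolutions
    angle_uniform = np.arange(angle[0], angle[-1], 1.0 / samples_per_rev)
    resampled = np.interp(angle_uniform, angle, vibration)    # synchronous (angular) sampling
    windowed = resampled * np.hanning(len(resampled))
    spectrum = np.abs(np.fft.rfft(windowed))
    orders = np.fft.rfftfreq(len(resampled), d=1.0 / samples_per_rev)  # cycles per revolution
    return orders, spectrum

# Toy run-up: the speed ramps from 10 Hz to 30 Hz; the response is locked to order 2.
t = np.linspace(0.0, 5.0, 50_000)
speed = np.linspace(10.0, 30.0, t.size)
revolutions = np.cumsum(speed) * (t[1] - t[0])
vibration = np.sin(2.0 * 2.0 * np.pi * revolutions)           # two cycles per revolution
orders, spectrum = order_spectrum(t, vibration, speed)
print(round(float(orders[np.argmax(spectrum)]), 1))            # close to 2.0
```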

  8. Security Analysis of Discrete Logarithm Based Cryptosystems

    Institute of Scientific and Technical Information of China (English)

    WANG Yuzhu; LIAO Xiaofeng

    2006-01-01

    Discrete logarithm based cryptosystems have subtle problems that make the schemes vulnerable. This paper gives a comprehensive listing of security issues in the systems and analyzes three classes of attacks, based respectively on the mathematical structure of the group used in the schemes, the disclosed information about the subgroup, and implementation details. The analysis will, in turn, allow us to motivate protocol design and implementation decisions.

  9. Social Network Analysis Based on Network Motifs

    OpenAIRE

    Xu Hong-lin; Yan Han-bing; Gao Cui-fang; Zhu Ping

    2014-01-01

    Based on community structure characteristics and the theory and methods of frequent subgraph mining, network motif findings are first introduced into social network analysis; a tendentiousness evaluation function and an importance evaluation function are proposed for effectiveness assessment. Compared with the traditional approach based on node centrality degree, the new approach can be used to analyze the properties of a social network more fully and to judge the roles of the nodes effectively. I...

  10. Swarm Intelligence Based Algorithms: A Critical Analysis

    OpenAIRE

    Yang, Xin-She

    2014-01-01

    Many optimization algorithms have been developed by drawing inspiration from swarm intelligence (SI). These SI-based algorithms can have some advantages over traditional algorithms. In this paper, we carry out a critical analysis of these SI-based algorithms by analyzing their ways to mimic evolutionary operators. We also analyze the ways of achieving exploration and exploitation in algorithms by using mutation, crossover and selection. In addition, we also look at algorithms using dynamic sy...

  11. What Now? Some Brief Reflections on Model-Free Data Analysis

    OpenAIRE

    Richard Berk

    2009-01-01

    David Freedman’s critique of causal modeling in the social and biomedical sciences was fundamental. In his view, the enterprise was misguided, and there was no technical fix. Far too often, there was a disconnect between what the statistical methods required and the substantive information that could be brought to bear. In this paper, I briefly consider some alternatives to causal modeling assuming that David Freedman’s perspective on modeling is correct. In addition to randomized experiments...

  12. Network Analysis of the Shanghai Stock Exchange Based on Partial Mutual Information

    Directory of Open Access Journals (Sweden)

    Tao You

    2015-06-01

    Full Text Available Analyzing social systems, particularly financial markets, using a complex network approach has become one of the most popular fields within econophysics. A similar trend is currently appearing within the econometrics and finance communities, as well. In this study, we present a state-of-the-art method for analyzing the structure and risk within stock markets, treating them as complex networks using model-free, nonlinear dependency measures based on information theory. This study is the first network analysis of the stock market in Shanghai using a nonlinear network methodology. Further, it is often assumed that markets outside the United States and Western Europe are inherently riskier. We find that the Chinese stock market is not structurally risky, contradicting this popular opinion. We use partial mutual information to create filtered networks representing the Shanghai stock exchange, comparing them to networks based on Pearson's correlation. Consequently, we discuss the structure and characteristics of both the presented methods and the Shanghai stock exchange. This paper provides an insight into the cutting edge methodology designed for analyzing complex financial networks, as well as analyzing the structure of the market in Shanghai and, as such, is of interest to both researchers and financial analysts.
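    A simplified sketch of building a nonlinear dependency matrix from return series using plain pairwise mutual information (a histogram estimate standing in for the partial mutual information used in the study); the synthetic returns and bin count are illustrative.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information between two return series."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)        # marginal of x
    py = pxy.sum(axis=0, keepdims=True)        # marginal of y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def dependency_matrix(returns):
    """Pairwise mutual-information matrix for columns of a (time, assets) array."""
    n = returns.shape[1]
    mi = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            mi[i, j] = mi[j, i] = mutual_information(returns[:, i], returns[:, j])
    return mi

# Synthetic returns for four "stocks" sharing one common factor.
rng = np.random.default_rng(0)
common = rng.standard_normal(2000)
returns = np.column_stack([common + 0.5 * rng.standard_normal(2000) for _ in range(4)])
print(np.round(dependency_matrix(returns), 2))
```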

  13. Abstraction based Analysis and Arbiter Synthesis

    DEFF Research Database (Denmark)

    Ernits, Juhan-Peep; Yi, Wang

    2004-01-01

    The work focuses on the analysis of an example of synchronous systems containing FIFO buffers, registers and memory interconnected by several private and shared busses. The example used in this work is based on a Terma radar system memory interface case study from the IST AMETIST project....

  14. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon;

    2014-01-01

    with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...

  15. Modeling Mass Spectrometry Based Protein Analysis

    OpenAIRE

    Eriksson, Jan; Fenyö, David

    2011-01-01

    The success of mass spectrometry based proteomics depends on efficient methods for data analysis. These methods require a detailed understanding of the information value of the data. Here, we describe how the information value can be elucidated by performing simulations using synthetic data.

  16. Data-Driven Control for Interlinked AC/DC Microgrids via Model-Free Adaptive Control and Dual-Droop Control

    DEFF Research Database (Denmark)

    Zhang, Huaguang; Zhou, Jianguo; Sun, Qiuye;

    2016-01-01

    terminal voltage and ac frequency. Moreover, the design of the controller is based only on input/output (I/O) measurement data and not on a model, and system stability can be guaranteed by the Lyapunov method. The detailed system architecture and proposed control strategies are presented in ...: the primary outer-loop dual-droop control method along with secondary control; the inner-loop data-driven model-free adaptive voltage control. Using the proposed scheme, the interlinking converter, just like the hierarchically controlled DG units, will have the ability to regulate and restore the dc...

  17. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    Canonical correlation analysis (CCA) is an established multi-variate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates analysis of variables with different genesis and therefore different statistical distributions and different modalities. As a proof of concept we give a toy example. We also give an example with one (weather radar based) variable in the one set and eight spectral bands of optical satellite data in the...
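
    A toy sketch of the CIA idea follows; the crude histogram MI estimator and the generic Nelder-Mead optimizer are assumptions for illustration and are not necessarily what the paper uses. The correlation objective of CCA is simply replaced by estimated mutual information between the two linear combinations:

        import numpy as np
        from scipy.optimize import minimize

        def mutual_info(u, v, bins=8):
            # Histogram estimate of mutual information between two 1-D projections.
            pxy, _, _ = np.histogram2d(u, v, bins=bins)
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

        def canonical_information_analysis(X, Y):
            # Search for projection vectors a, b maximizing estimated MI(a'X ; b'Y),
            # the information-theoretic stand-in for CCA's correlation objective.
            p, q = X.shape[1], Y.shape[1]
            neg_mi = lambda w: -mutual_info(X @ w[:p], Y @ w[p:])
            res = minimize(neg_mi, np.ones(p + q), method="Nelder-Mead")
            return res.x[:p], res.x[p:], -res.fun

        # Toy example: the two sets share one latent signal, but only nonlinearly (z vs z**2).
        rng = np.random.default_rng(1)
        z = rng.standard_normal(500)
        X = np.column_stack([z + 0.3 * rng.standard_normal(500), rng.standard_normal(500)])
        Y = np.column_stack([rng.standard_normal(500), z ** 2 + 0.3 * rng.standard_normal(500)])
        a, b, mi = canonical_information_analysis(X, Y)
        print("estimated MI of best pair:", round(mi, 2))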

  18. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels have been known to have poor correlation. On the other hand, MS-based proteomics tends to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomics profiles. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.

  19. TEST COVERAGE ANALYSIS BASED ON PROGRAM SLICING

    Institute of Scientific and Technical Information of China (English)

    Chen Zhenqiang; Xu Baowen; Guanjie

    2003-01-01

    Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept, coverage about variables, based on program slicing. By adding powers according to their importance, the users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained in the letter is bigger than that obtained by a traditional measure, because the coverage about a variable takes only the related codes into account.
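
    A hedged sketch of the variable-oriented coverage idea follows; the slice sets, executed statements and importance weights are hypothetical, and the actual computation in the letter may differ:

        # Each variable's coverage is the fraction of statements in its program slice
        # that the test suite executed; variables carry user-assigned importance weights.
        def variable_coverage(slices, executed, weights):
            # slices:   {variable: set of statement ids in its slice}
            # executed: set of statement ids hit by the test suite
            # weights:  {variable: importance weight}
            per_var = {v: len(s & executed) / len(s) for v, s in slices.items() if s}
            total_w = sum(weights[v] for v in per_var)
            weighted = sum(weights[v] * c for v, c in per_var.items()) / total_w
            return per_var, weighted

        slices = {"x": {1, 2, 5}, "y": {2, 3, 4, 6}}     # hypothetical slices
        executed = {1, 2, 3, 4}                          # statements hit by the tests
        weights = {"x": 2.0, "y": 1.0}                   # "x" matters more to the user
        print(variable_coverage(slices, executed, weights))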

  20. Structure-based analysis of Web sites

    OpenAIRE

    Yen, B

    2004-01-01

    The performance of information retrieval on the Web is heavily influenced by the organization of Web pages, user navigation patterns, and guidance-related functions. Having observed the lack of measures to reflect this factor, this paper focuses on an approach based on both structure properties and navigation data to analyze and improve the performance of Web sites. Two types of indices are defined for two major factors for analysis and improvement: "accessibility" reflects the structure property...

  1. Quantum entanglement analysis based on abstract interpretation

    OpenAIRE

    Perdrix, Simon

    2008-01-01

    Entanglement is a non local property of quantum states which has no classical counterpart and plays a decisive role in quantum information theory. Several protocols, like the teleportation, are based on quantum entangled states. Moreover, any quantum algorithm which does not create entanglement can be efficiently simulated on a classical computer. The exact role of the entanglement is nevertheless not well understood. Since an exact analysis of entanglement evolution induces an exponential sl...

  2. Particle Pollution Estimation Based on Image Analysis

    OpenAIRE

    Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian

    2016-01-01

    Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic...

  3. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    The letter emphasizes an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks in data analysis: event selection, kinematic fitting, particle identification, etc., and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML-interface file, setting up at run-time and running dynamically. An analysis coded in XML instead of C++ is easy to understand and use, effectively reduces the work load, and enables users to carry out their analyses quickly. The framework has been developed on the BESⅢ offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented with standard modules or inherited from the modules in BOSS. The interface and its framework have been tested to perform physics analysis. (authors)

  4. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  5. Key Point Based Data Analysis Technique

    Science.gov (United States)

    Yang, Su; Zhang, Yong

    In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.

  6. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.;

    2008-01-01

    This review with 239 references has as its aim to give the reader an introduction to the kinds of methods used for developing microchip based electrode systems as well as to cover the existing literature on electroanalytical systems where microchips play a crucial role for 'nondestructive' measurements of processes related to living cells, i.e., systems without lysing the cells. The focus is on chip based amperometric and impedimetric cell analysis systems, where measurements utilizing solely carbon fiber microelectrodes (CFME) and other nonchip electrode formats, such as CFME for exocytosis studies and scanning electrochemical microscopy (SECM) studies of living cells, have been omitted. Included is also a discussion about some future and emerging nano tools and considerations that might have an impact on the future of "nondestructive" chip based electroanalysis of living cells.

  7. Fault-based analysis of flexible ciphers

    Directory of Open Access Journals (Sweden)

    V.I.Korjik

    2002-07-01

    Full Text Available We consider the security of some flexible ciphers against differential fault analysis (DFA). We present a description of the fault-based attack on two kinds of flexible ciphers. The first kind is represented by the fast software-oriented cipher based on data-dependent subkey selection (DDSS), in which flexibility corresponds to the use of key-dependent operations. The second kind is represented by a DES-like cryptosystem, GOST, with secret S-boxes. In general, the use of some secret operations and procedures contributes to the security of the cryptosystem; however, the degree of this contribution depends significantly on the structure of the encryption mechanism. It is shown how to attack the DDSS-based flexible cipher using DFA, though this cipher is secure against standard variants of differential and linear cryptanalysis. We also give an outline of the ciphers RC5 and GOST, showing that they are also insecure against DFA-based attacks. We suggest a modification of the DDSS mechanism and a variant of the advanced DDSS-based flexible cipher that is secure against attacks based on random hardware faults.

  8. System based practice: a concept analysis

    Science.gov (United States)

    YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN

    2016-01-01

    Introduction Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high quality of care and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP’s components must be precisely defined in order to provide valid and reliable assessment tools. Methods Walker & Avant’s approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients’ needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion The identification of SBP attributes in this study contributes to the body of knowledge in SBP and reduces the ambiguity of this concept, making it possible to apply it in the training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198

  9. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  10. Gait correlation analysis based human identification.

    Science.gov (United States)

    Chen, Jinyan

    2014-01-01

    Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint or iris based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. Principal component analysis is used to reduce the features' dimensions. Experiments based on the CASIA database show that this method has an encouraging recognition performance. PMID:24592144

  11. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint or iris based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. Principal component analysis is used to reduce the features’ dimensions. Experiments based on the CASIA database show that this method has an encouraging recognition performance.
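
    A simplified sketch of the pipeline described in these two records is given below; as assumptions, the per-shift correlation is approximated by a 3-D FFT autocorrelation of the silhouette volume and PCA comes from scikit-learn, while background subtraction and the CASIA data themselves are outside the sketch:

        import numpy as np
        from sklearn.decomposition import PCA

        def raw_gait_feature(silhouettes):
            # One gait sequence -> raw feature: 3-D autocorrelation of the binary
            # silhouette volume (t, y, x) via FFT, then DFT magnitudes, then normalization.
            vol = silhouettes.astype(float)                      # shape (T, H, W)
            spec = np.fft.fftn(vol)
            autocorr = np.fft.ifftn(spec * np.conj(spec)).real   # correlation of the volume with itself
            feat = np.abs(np.fft.fftn(autocorr)).ravel()         # DFT-based raw features
            return (feat - feat.mean()) / (feat.std() + 1e-12)   # normalize against noise

        # Toy usage: 10 sequences of 20 frames of 32x24 silhouettes, reduced with PCA.
        rng = np.random.default_rng(0)
        sequences = rng.integers(0, 2, size=(10, 20, 32, 24))
        features = np.stack([raw_gait_feature(s) for s in sequences])
        reduced = PCA(n_components=5).fit_transform(features)    # dimensionality reduction
        print(reduced.shape)                                      # (10, 5)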

  12. Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis

    Science.gov (United States)

    Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo

    2002-03-01

    In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of the monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method which is a well-established, but more difficult procedure for this purpose. We employ the multifractal DFA method to determine if the heart rhythm during different sleep stages is characterized by different multifractal properties.
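
    For illustration, here is a minimal multifractal DFA sketch (first-order detrending, forward segmentation only, q not equal to 0); it is a stripped-down version of the generalized fluctuation analysis described above, not the authors' full implementation:

        import numpy as np

        def mfdfa(x, scales, q_values, order=1):
            # Returns h(q), the slope of log F_q(s) versus log s, for each moment q != 0.
            profile = np.cumsum(x - np.mean(x))                  # integrated, mean-removed series
            h = []
            for q in q_values:
                logF, logS = [], []
                for s in scales:
                    n_seg = len(profile) // s
                    F2 = []
                    for v in range(n_seg):
                        seg = profile[v * s:(v + 1) * s]
                        t = np.arange(s)
                        fit = np.polyval(np.polyfit(t, seg, order), t)   # local detrending
                        F2.append(np.mean((seg - fit) ** 2))
                    F2 = np.asarray(F2)
                    Fq = (np.mean(F2 ** (q / 2.0))) ** (1.0 / q)          # q-th order fluctuation
                    logF.append(np.log(Fq))
                    logS.append(np.log(s))
                h.append(np.polyfit(logS, logF, 1)[0])                    # generalized Hurst exponent
            return np.array(h)

        # Toy usage: white noise is monofractal, so h(q) should stay near 0.5 for all q.
        x = np.random.default_rng(0).standard_normal(4096)
        print(np.round(mfdfa(x, scales=[16, 32, 64, 128, 256], q_values=[-2, 2]), 2))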

  13. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW. It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four window statistical computing environment (code input, text output, graphical output, and error information through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.

  14. Electric Equipment Diagnosis based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Stavitsky Sergey A.

    2016-01-01

    Full Text Available Due to the development and growing complexity of electric equipment, precise and intensive diagnostics are necessary. Nowadays there are two basic ways of diagnosis: analog signal processing and digital signal processing. The latter is preferable. Alongside the basic methods of digital signal processing (the Fourier transform and the fast Fourier transform), one of the modern methods is based on the wavelet transform. This research is dedicated to analyzing the characteristic features and advantages of the wavelet transform. The article shows the ways of using wavelet analysis and the process of converting a test signal. To carry out this analysis, the computer software Mathcad was used and a 2D wavelet spectrum for a complex function was created.
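
    A small illustration of the idea, assuming the PyWavelets package and a synthetic test signal (the article's Mathcad analysis is not reproduced here): a discrete wavelet decomposition localizes a brief disturbance in time, which a plain Fourier spectrum cannot do:

        import numpy as np
        import pywt   # PyWavelets

        fs = 1000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        signal = np.sin(2 * np.pi * 50 * t)                      # clean 50 Hz component
        signal[500:510] += 2.0 * np.random.default_rng(0).standard_normal(10)  # brief "fault"

        coeffs = pywt.wavedec(signal, "db4", level=4)            # [approx, detail4, ..., detail1]
        d1 = coeffs[-1]                                          # finest-scale detail coefficients
        fault_index = np.argmax(np.abs(d1)) * 2                  # approximate map back to samples
        print(f"disturbance detected near t = {fault_index / fs:.2f} s")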

  15. Arabic Interface Analysis Based on Cultural Markers

    Directory of Open Access Journals (Sweden)

    Mohammadi Akheela Khanum

    2012-01-01

    Full Text Available This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from the educational, business, and media sectors. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture and that the Hofstede score for Arab countries is partially supported by the website design components examined in this study. Moderate support was also found for the long term orientation, for which Hofstede has no score.

  16. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  17. Arabic Interface Analysis Based on Cultural Markers

    CERN Document Server

    Khanum, Mohammadi Akheela; Chaurasia, Mousmi A

    2012-01-01

    This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from the educational, business, and media sectors. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture and that Hofstede's score for Arab countries is partially supported by the website design components examined in this study. Moderate support was also found for the long term orientation, for which Hofstede has no score.

  18. Video semantic content analysis based on ontology

    OpenAIRE

    Bai, Liang; Lao, Songyang; Jones, Gareth J.F.; Smeaton, Alan F.

    2007-01-01

    The rapid increase in the available amount of video data is creating a growing demand for efficient methods for understanding and managing it at the semantic level. New multimedia standards, such as MPEG-4 and MPEG-7, provide the basic functionalities in order to manipulate and transmit objects and metadata. But importantly, most of the content of video data at a semantic level is out of the scope of the standards. In this paper, a video semantic content analysis framework based on ontology i...

  19. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, where the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well known block matching methods (full search, 2D-log, and methods based on the conventional (cross) correlation (CC) function or the phase correlation (PC) function) for crowd motion estimation are also presented.

  20. Kinetic modelling of RDF pyrolysis: Model-fitting and model-free approaches.

    Science.gov (United States)

    Çepelioğullar, Özge; Haykırı-Açma, Hanzade; Yaman, Serdar

    2016-02-01

    In this study, refuse derived fuel (RDF) was selected as the solid fuel and pyrolyzed in a thermal analyzer from room temperature to 900°C at heating rates of 5, 10, 20, and 50°C/min in N2 atmosphere. The obtained thermal data were used to calculate the kinetic parameters using the Coats-Redfern, Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS) methods. As a result of the Coats-Redfern model, the decomposition process was assumed to consist of four independent reactions with different reaction orders. On the other hand, the model-free methods demonstrated that the activation energy trend had similarities for the reaction progresses of 0.1, 0.2-0.7 and 0.8-0.9. The average activation energies were found to lie between 73 and 161 kJ/mol, and the FWO and KAS models produced results closer to the average activation energies than the Friedman model. Experimental studies showed that RDF may be a sustainable and promising feedstock for alternative processes in terms of waste management strategies. PMID:26613830
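
    For illustration, a minimal sketch of the KAS model-free (isoconversional) calculation is shown below; the temperatures at a fixed conversion are hypothetical values, not the paper's data, and only the heating rates match the study:

        import numpy as np

        R = 8.314  # gas constant, J/(mol K)

        def kas_activation_energy(betas, T_alpha):
            # Kissinger-Akahira-Sunose estimate at one conversion level:
            # ln(beta / T^2) = C - Ea / (R * T), so Ea = -slope * R.
            x = 1.0 / np.asarray(T_alpha)                              # 1/T at the chosen conversion
            y = np.log(np.asarray(betas) / np.asarray(T_alpha) ** 2)
            slope = np.polyfit(x, y, 1)[0]
            return -slope * R

        # Hypothetical temperatures (K) reached at alpha = 0.5 for the four heating rates.
        betas = [5, 10, 20, 50]                  # K/min, as in the study above
        T_alpha = [585.0, 597.0, 610.0, 628.0]   # illustrative values only
        print(f"Ea ~ {kas_activation_energy(betas, T_alpha) / 1000:.0f} kJ/mol")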

  1. Stellar loci II. a model-free estimate of the binary fraction for field FGK stars

    CERN Document Server

    Yuan, Haibo; Xiang, Maosheng; Huang, Yang; Chen, Bingqiu

    2014-01-01

    We propose a Stellar Locus OuTlier (SLOT) method to determine the binary fraction of main-sequence stars statistically. The method is sensitive to neither the period nor mass-ratio distributions of binaries, and able to provide model-free estimates of the binary fraction for large numbers of stars of different populations in large survey volumes. We have applied the SLOT method to two samples of stars from the SDSS Stripe 82, constructed by combining the re-calibrated SDSS photometric data with respectively the spectroscopic information from the SDSS and LAMOST surveys. For the SDSS spectroscopic sample, we find an average binary fraction for field FGK stars of 41% ± 2%. The fractions decrease toward late spectral types, and are respectively 44% ± 5%, 43% ± 3%, 35% ± 5%, and 28% ± 6% for stars of g − i colors between 0.3–0.6, 0.6–0.9, 0.9–1.2, and 1.2–1.6 mag. A modest metallicity dependence is also found. The fraction decreases with increasing metallicity. For stars of [Fe/H] between −0.5...

  2. A model-free method for annotating on vascular structure in volume rendered images

    Science.gov (United States)

    He, Wei; Li, Yanfang; Shi, Weili; Miao, Yu; He, Fei; Yan, Fei; Yang, Huamin; Zhang, Huimao; Mori, Kensaku; Jiang, Zhengang

    2015-03-01

    The precise annotation of vessels is desired in computer-assisted systems to help surgeons identify each vessel branch. A method has been reported that annotates vessels on volume rendered images by rendering their names on them using a two-pass rendering process. In the reported method, however, cylinder surface models of the vessels must be generated for writing the vessel names. In fact, vessels are not actual cylinders, so the surfaces of the vessels cannot be simulated accurately by such models. This paper presents a model-free method for annotating vessels on volume rendered images by rendering their names on them using the two-pass rendering process: surface rendering and volume rendering. In the surface rendering process, docking points for vessel names are estimated by using properties such as centerlines, running directions, and vessel regions, which are obtained in preprocessing. Then the vessel names are pasted on the vessel surfaces at the docking points. In the volume rendering process, the volume image is rendered using a fast volume rendering algorithm with the depth buffer of the image rendered in the surface rendering process. Finally, those rendered images are blended into a single image as the result. To confirm the proposed method, a visualization system for the automated annotation of abdominal arteries was implemented. The experimental results show that vessel names can be drawn correctly on the corresponding vessels in the volume rendered images. The proposed method has enormous potential to be adopted for annotating other organs which cannot be modeled using regular geometrical surfaces.

  3. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
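
    A minimal sketch of the natural visibility graph construction that underlies this kind of analysis (brute force, for illustration only; the paper's full pipeline additionally links the graphs of successive segments):

        import numpy as np

        def visibility_graph(series):
            # Natural visibility graph: nodes are time points; (a, b) are connected if
            # every intermediate point (c, y_c) lies strictly below the line of sight,
            # i.e. y_c < y_b + (y_a - y_b) * (b - c) / (b - a).
            n = len(series)
            edges = set()
            for a in range(n):
                for b in range(a + 1, n):
                    ya, yb = series[a], series[b]
                    if all(series[c] < yb + (ya - yb) * (b - c) / (b - a)
                           for c in range(a + 1, b)):
                        edges.add((a, b))
            return edges

        # Toy usage on a short series.
        print(sorted(visibility_graph([0.8, 0.3, 0.9, 0.2, 0.5])))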

  4. Curvelet based offline analysis of SEM images.

    Directory of Open Access Journals (Sweden)

    Syed Hamad Shirazi

    Full Text Available Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and effort. This paper presents an image processing based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal, in order to avoid false edges. For texture analysis, the proposed method employs a state of the art Curvelet transform followed by segmentation through a combination of entropy filtering, thresholding and mathematical morphology (MM). The quantification is carried out by the application of a box-counting algorithm for fractal dimension (FD) calculations, with the ultimate goal of measuring parameters like surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known Watershed segmentation algorithm.
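
    A small sketch of the box-counting step used for the fractal dimension (FD) estimate, assuming a binary segmentation mask as input; the box sizes and the toy mask are illustrative only:

        import numpy as np

        def box_counting_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            # FD estimate: slope of log N(s) versus log(1/s), where N(s) is the number
            # of s-by-s boxes that contain at least one foreground pixel.
            counts = []
            for s in sizes:
                h = (mask.shape[0] // s) * s
                w = (mask.shape[1] // s) * s
                blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(blocks.any(axis=(1, 3)).sum())
            return np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)[0]

        # Toy usage: a filled square is a 2-D object, so the dimension should be close to 2.
        mask = np.zeros((128, 128), dtype=bool)
        mask[32:96, 32:96] = True
        print(round(box_counting_dimension(mask), 2))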

  5. Visual Similarity Based Document Layout Analysis

    Institute of Scientific and Technical Information of China (English)

    Di Wen; Xiao-Qing Ding

    2006-01-01

    In this paper, a visual similarity based document layout analysis (DLA) scheme is proposed which, by using a clustering strategy, can adaptively deal with documents in different languages, with different layout structures and skew angles. Aiming at a robust and adaptive DLA approach, the authors first manage to find a set of representative filters and statistics to characterize typical texture patterns in document images, through a visual similarity testing process. Texture features are then extracted from these filters and passed into a dynamic clustering procedure, which is called visual similarity clustering. Finally, text contents are located from the clustered results. Benefiting from this scheme, the algorithm demonstrates strong robustness and adaptability for a wide variety of documents, which previous traditional DLA approaches do not possess.

  6. Watermark Resistance Analysis Based On Linear Transformation

    Directory of Open Access Journals (Sweden)

    N.Karthika Devi

    2012-06-01

    Full Text Available Generally, a digital watermark can be embedded in any copyrighted image, provided the watermark is no larger than the image. Watermarking schemes can be classified into two categories: spatial domain approaches and transform domain approaches. Previous works have shown that the transform domain scheme is typically more robust to noise, common image processing, and compression when compared with the spatial domain scheme. Improvements in the performance of watermarking schemes can be obtained by exploiting the characteristics of the human visual system (HVS) in the watermarking process. We propose a linear transformation based watermarking algorithm. The watermarking bits are embedded into a cover image to produce the watermarked image. The robustness of the watermark is checked using pre-defined attacks. Attack resistance analysis is done using BER (Bit Error Rate) calculation. Finally, the quality of the watermarked image can be obtained.

  7. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images and that do not take advantage of its primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To

  8. Cognitive fusion analysis based on context

    Science.gov (United States)

    Blasch, Erik P.; Plano, Susan

    2004-04-01

    The standard fusion model includes active and passive user interaction in level 5 - "User Refinement". User refinement is more than just details of passive automation partitioning - it is the active management of information. While a fusion system can explore many operational conditions over myopic changes, the user has the ability to reason about the hyperopic "big picture." Blasch and Plano developed cognitive-fusion models that address user constraints including: intent, attention, trust, workload, and throughput to facilitate hyperopic analysis. To enhance user-fusion performance modeling (i.e., confidence, timeliness, and accuracy), we seek to explore the nature of context. Context, the interrelated conditions in which something exists, can be modeled in many ways including geographic, sensor, object, and environmental conditioning. This paper highlights user refinement actions based on context to constrain the fusion analysis for accurately representing the trade space in the real world. As an example, we explore a target identification task in which contextual information from the user"s cognitive model is imparted to a fusion belief filter.

  9. Interactive analysis of geodata based intelligence

    Science.gov (United States)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data is gathered to understand the problem, and strategies for solving the problem are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map can be the bridge to visualize the data and to arrive at the most comprehensible model for all stakeholders. For the analysis of geodata based intelligence data, software was developed as a working environment that combines geodata with optimized ergonomics, thereby substantially facilitating interaction with the common operational picture (COP). The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result a COP is generated and an interaction suitable for the specific workspace is added. This allows users to work interactively with the COP, i.e., to search with an on-board CSD client for suitable intelligence data and integrate them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management are organized.

  10. Operating cost analysis of anaesthesia: Activity based costing (ABC analysis

    Directory of Open Access Journals (Sweden)

    Majstorović Branislava M.

    2011-01-01

    Full Text Available Introduction. Anaesthesia costs represent defined measures for determining a precise profile of the expenditure of surgical treatment, which is important for planning healthcare activities, prices and budgets. Objective. In order to determine the actual value of anaesthesiological services, we started with an activity based costing (ABC) analysis. Methods. Retrospectively, for 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supplies and other costs: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure per “cost object (service or unit)” of the Republican Health-care Insurance, using summary data of the Departments of Anaesthesia documented in the database of the Clinical Centre of Serbia. The numerical data were estimated and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Health-care Insurance. Results. Of the direct costs, 40% were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Health-care Insurance. Conclusion. During surgery, the costs of anaesthesia increase the surgical treatment cost of a patient by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. Fixed elements of direct costs provide the possibility of rationalization of resources in anaesthesia.

  11. Model-free reconstruction of excitatory neuronal connectivity from calcium imaging signals.

    Directory of Open Access Journals (Sweden)

    Olav Stetter

    Full Text Available A systematic assessment of global neural network connectivity through direct electrophysiological assays has remained technically infeasible, even in simpler systems like dissociated neuronal cultures. We introduce an improved algorithmic approach based on Transfer Entropy to reconstruct structural connectivity from network activity monitored through calcium imaging. We focus in this study on the inference of excitatory synaptic links. Based on information theory, our method requires no prior assumptions on the statistics of neuronal firing and neuronal connections. The performance of our algorithm is benchmarked on surrogate time series of calcium fluorescence generated by the simulated dynamics of a network with known ground-truth topology. We find that the functional network topology revealed by Transfer Entropy depends qualitatively on the time-dependent dynamic state of the network (bursting or non-bursting. Thus by conditioning with respect to the global mean activity, we improve the performance of our method. This allows us to focus the analysis on specific dynamical regimes of the network in which the inferred functional connectivity is shaped by monosynaptic excitatory connections, rather than by collective synchrony. Our method can discriminate between actual causal influences between neurons and spurious non-causal correlations due to light scattering artifacts, which inherently affect the quality of fluorescence imaging. Compared to other reconstruction strategies such as cross-correlation or Granger Causality methods, our method based on improved Transfer Entropy is remarkably more accurate. In particular, it provides a good estimation of the excitatory network clustering coefficient, allowing for discrimination between weakly and strongly clustered topologies. Finally, we demonstrate the applicability of our method to analyses of real recordings of in vitro disinhibited cortical cultures where we suggest that excitatory connections
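
    Below is a minimal sketch of a histogram-based transfer entropy estimate with one step of memory. It is far simpler than the improved, conditioned estimator described above (no conditioning on global activity, no calcium fluorescence model), and the discretization into three bins is an assumption made only for the illustration:

        import numpy as np

        def transfer_entropy(x, y, bins=3):
            # TE(X -> Y) with one-step memory: I(y_{t+1}; x_t | y_t), from discretized signals.
            xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
            yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
            triples = np.stack([yd[1:], yd[:-1], xd[:-1]], axis=1)
            states, counts = np.unique(triples, axis=0, return_counts=True)
            p_joint = counts / counts.sum()
            te = 0.0
            for (y1, y0, x0), pxyz in zip(states, p_joint):
                p_y0 = np.mean(yd[:-1] == y0)
                p_y1y0 = np.mean((yd[1:] == y1) & (yd[:-1] == y0))
                p_y0x0 = np.mean((yd[:-1] == y0) & (xd[:-1] == x0))
                te += pxyz * np.log((pxyz / p_y0x0) / (p_y1y0 / p_y0))
            return te

        # Toy usage: y is driven by the past of x, so TE(x -> y) should exceed TE(y -> x).
        rng = np.random.default_rng(0)
        x = rng.standard_normal(5000)
        y = np.roll(x, 1) + 0.5 * rng.standard_normal(5000)
        print(round(transfer_entropy(x, y), 3), round(transfer_entropy(y, x), 3))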

  12. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and has caused billions of dollars in damages each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive in order to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regard to this new analysis framework, a series of analysis techniques for automatic malware analy

  13. Polyphase Order Analysis Based on Convolutional Approach

    OpenAIRE

    M. Drutarovsky

    1999-01-01

    The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational ...

  14. ANALYSIS OF CIRCUIT TOLERANCE BASED ON RANDOM SET THEORY

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Monte Carlo analysis has been an accepted method for circuit tolerance analysis, but its heavy computational complexity has always prevented its application. Based on random set theory, this paper presents a simple and flexible tolerance analysis method to estimate circuit yield. It is an alternative to Monte Carlo analysis, but reduces the number of calculations dramatically.
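
    For contrast, here is a minimal sketch of the Monte Carlo yield estimation that the proposed random-set method is meant to replace; the divider circuit, the 3-sigma reading of the tolerances and the acceptance window are hypothetical choices for illustration:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        R1 = rng.normal(1000, 1000 * 0.05 / 3, n)    # 1 kOhm resistor, 5% tolerance read as 3-sigma
        R2 = rng.normal(2000, 2000 * 0.05 / 3, n)    # 2 kOhm resistor, 5% tolerance read as 3-sigma
        vout = 5.0 * R2 / (R1 + R2)                  # voltage divider output for a 5 V supply
        accepted = (vout > 3.25) & (vout < 3.42)     # hypothetical acceptance window
        print(f"estimated yield: {accepted.mean():.3f}")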

  15. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analyses of caches which are hard...

  16. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.

  17. Location-based Modeling and Analysis: Tropos-based Approach

    OpenAIRE

    Ali, Raian; Dalpiaz, Fabiano; Giorgini, Paolo

    2008-01-01

    The continuous growth of interest in mobile applications makes the concept of location essential to design and develop software systems. Location-based software is supposed to be able to monitor the location and choose accordingly the most appropriate behavior. In this paper, we propose a novel conceptual framework to model and analyze location-based software. We mainly focus on the social facets of locations adopting concepts such as social actor, resource, and location-based behavior. Our a...

  18. Statistical Analysis of Nonlinear Processes Based on Penalty Factor

    OpenAIRE

    Zhang, Yingwei; Zhang, Chuanfang; Zhang, Wei

    2014-01-01

    A new process monitoring approach is proposed for handling the nonlinear monitoring problem in the electrofused magnesia furnace (EFMF). Compared to conventional methods, the contributions are as follows: (1) a new kernel principal component analysis is proposed based on a loss function in the feature space; (2) the model of kernel principal component analysis based on a forgetting factor is updated; (3) a new iterative kernel principal component analysis algorithm is proposed based on penalty fac...

  19. Web Based Distributed Coastal Image Analysis System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web based distributed image analysis system that processes Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  20. Pathway-Based Functional Analysis of Metagenomes

    Science.gov (United States)

    Bercovici, Sivan; Sharon, Itai; Pinter, Ron Y.; Shlomi, Tomer

    Metagenomic data enables the study of microbes and viruses through their DNA as retrieved directly from the environment in which they live. Functional analysis of metagenomes explores the abundance of gene families, pathways, and systems, rather than their taxonomy. Through such analysis researchers are able to identify those functional capabilities most important to organisms in the examined environment. Recently, a statistical framework for the functional analysis of metagenomes was described that focuses on gene families. Here we describe two pathway level computational models for functional analysis that take into account important, yet unaddressed issues such as pathway size, gene length and overlap in gene content among pathways. We test our models over carefully designed simulated data and propose novel approaches for performance evaluation. Our models significantly improve over current approach with respect to pathway ranking and the computations of relative abundance of pathways in environments.

  1. Analysis of Task-based Syllabus

    Institute of Scientific and Technical Information of China (English)

    马进胜

    2011-01-01

    Task-based language teaching is very popular in modern English teaching. It is based on the Task-based Syllabus. The Task-based Syllabus focuses on the learners' communicative competence, which stresses learning by doing. From the theoretical assumptions and definitions of the task, the paper analyzes the components of the task, then points out the merits and demerits of the syllabus. By this means the paper may give some tips to teachers and students when they use task-based language teaching.

  2. Market Based Analysis of Power System Interconnections

    OpenAIRE

    Obuševs, A; Turcik, M; Oļeiņikova, I; Junghāns, G

    2011-01-01

    The analysis in this article is focused on the usage of the transmission grid under a liberalized market with an implicit transmission capacity allocation method, e.g. the Nordic market. Attention is paid to fundamental changes in transmission utilization and its economically effective operation. For interconnection and power flow analysis and loss calculation, a model of the Nordic grid was developed and a transmission loss calculation method was created. The given approach will improve the economic efficiency of system oper...

  3. Numerical analysis of free-burning argon arcs based on the local thermodynamic equilibrium model at various electrical currents

    International Nuclear Information System (INIS)

    Free-burning arcs where the work piece acts as an anode were numerically analyzed using a computational domain including the arc itself and its anode region based on the local thermodynamic equilibrium model. Because the major arc parameters such as temperature, axial velocity, electric potential difference and pressure-rise from ambient atmospheric pressure are much dependent on the working current, our investigation was concerned with developing a capability to model free-burning argon arcs and considering the energy flux going into the anode at various values of the electrical current (I = 50, 100 and 200 A) by computational fluid dynamics analysis. Predicted temperatures along the z-axis between the electrodes were in fair agreement with existing experimental results. Particularly, reasonable relationships between the maximum velocity or temperature and the applied current were predicted, which matched well with other theoretical results. In addition, some discrepancies with other predictions were shown in the results about electric potential and pressure-rise. It should be related to the omission of the space-charge effect near the electrodes for a simplified unified model and the application of a turbulence model for the steep temperature gradient at the arc edges. - Highlights: • Free-burning argon arcs were investigated at various working currents numerically. • The relationships between the current, the velocity and the temperature were found. • Some discrepancies were shown in the results of pressure and electric potential. • Those should be supplemented by the non-equilibrium situation near electrodes

  4. Description-based and experience-based decisions: individual analysis

    Directory of Open Access Journals (Sweden)

    Andrey Kudryavtsev

    2012-05-01

    Full Text Available We analyze behavior in two basic classes of decision tasks: description-based and experience-based. In particular, we compare the prediction power of a number of decision learning models in both kinds of tasks. Unlike most previous studies, we focus on individual, rather than aggregate, behavioral characteristics. We carry out an experiment involving a battery of both description- and experience-based choices between two mixed binary prospects made by each of the participants, and employ a number of formal models for explaining and predicting participants' choices: Prospect theory (PT) (Kahneman and Tversky, 1979); the Expectancy-Valence model (EVL) (Busemeyer and Stout, 2002); and three combinations of these well-established models. We document that the PT and the EVL models are best for predicting people's decisions in description- and experience-based tasks, respectively, which is not surprising as these two models are designed specially for these kinds of tasks. Furthermore, we find that models involving linear weighting of gains and losses perform better in both kinds of tasks, from the point of view of generalizability and individual parameter consistency. We therefore conclude that, overall, when both prospects are mixed, the assumption of diminishing sensitivity does not improve the models' prediction power for individual decision-makers. Finally, for some of the models' parameters, we document consistency at the individual level between description- and experience-based tasks.

  5. Iris recognition based on subspace analysis

    Directory of Open Access Journals (Sweden)

    Pravin S.Patil

    2014-10-01

    Full Text Available Biometrics deals with the uniqueness of an individual arising from their physiological or behavioral characteristics for the purpose of personal identification. Among many biometric techniques, iris recognition is one of the most promising approaches. This paper presents traditional subspace analysis methods for iris recognition. Initially the eye images have been localized in circular form by using Daugman’s grid method and the circular Hough transform method. The algorithms for the subspace analysis methods, namely PCA and LDA, are implemented and experimental results are reported. The comparative performance of both algorithms has been observed in terms of recognition rate. Comprehensive experiments were completed on the UPOL and CASIA V1 iris databases.

  6. Performance Analysis Based on Timing Simulation

    DEFF Research Database (Denmark)

    Nielsen, Christian Dalsgaard; Kishinevsky, Michael

    Determining the cycle time and a critical cycle is a fundamental problem in the analysis of concurrent systems. We solve this problem using timing simulation of an underlying Signal Graph (an extension of Marked Graphs). For a Signal Graph with n vertices and m arcs our algorithm has the polynomial time complexity O(b²m), where b is the number of vertices with initially marked in-arcs (typically b≪n). The algorithm has a clear semantic and a low descriptive complexity. We illustrate the use of the algorithm by applying it to performance analysis of asynchronous circuits....

  7. Environmentally based Cost-Benefit Analysis

    International Nuclear Information System (INIS)

    The fundamentals of the basic elements of a new comprehensive economic assessment, MILA, developed in Sweden with inspiration from the Total Cost Assessment model, are presented. The core of the MILA approach is an expanded cost and benefit inventory. But MILA also includes a complementary addition of an internal waste stream analysis, a tool for evaluation of environmental conflicts in monetary terms, an extended time horizon and direct allocation of costs and revenues to products and processes. However, MILA does not ensure profitability for environmentally sound projects. Essentially, MILA is an approach for refining investment and profitability analysis of a project, investment or product. 109 refs., 38 figs

  8. Gender-Based Analysis On-Line Dialogue. Final Report.

    Science.gov (United States)

    2001

    An online dialogue on gender-based analysis (GBA) was held from February 15 to March 7, 2001. Invitations and a background paper titled "Why Gender-Based Analysis?" were sent to 350 women's organizations and individuals throughout Canada. Efforts were made to ensure that aboriginal and Metis women, visible minority women, and women with special…

  9. Science Based Governance? EU Food Regulation Submitted to Risk Analysis

    NARCIS (Netherlands)

    Szajkowska, A.; Meulen, van der B.M.J.

    2014-01-01

    Anna Szajkowska and Bernd van der Meulen analyse in their contribution, Science Based Governance? EU Food Regulation Submitted to Risk Analysis, the scope of application of risk analysis and the precautionary principle in EU food safety regulation. To what extent does this technocratic, science-base

  10. SSI Analysis for Base-Isolated Nuclear Power Plants

    International Nuclear Information System (INIS)

    NPPs are required to be much safer than other structures. Among external events, earthquakes are one of the most important factors governing the safety of NPPs. Application of a base isolation system to NPPs can reduce the seismic risk. At present, a soil-structure interaction (SSI) analysis is essential in the seismic design of NPPs in order to account for the interaction between the ground and the structure. In the seismic analysis of a base-isolated NPP, it is difficult to consider the nonlinear properties of seismic isolation bearings because SSI analysis programs such as SASSI are linear. Thus, in this study, SSI analyses are performed using an iterative approach that considers the material nonlinearity of the isolators. By performing the SSI analysis with an iterative approach, the nonlinear properties of the isolators can be taken into account. The results of the SSI analysis show that the horizontal response of the base-isolated NPP is significantly reduced

  11. Analysis of Financial Position Based on the Balance Sheet

    OpenAIRE

    Spineanu-Georgescu Luciana

    2011-01-01

    Analysis of the financial position based on the balance sheet is mainly aimed at assessing the extent to which the financial structure chosen by the firm, namely its financial resources, covers the financing needs reflected in the balance sheet. This is done through a horizontal analysis of the balance sheet, focusing on financial imbalances.

  12. Transportation Mode Choice Analysis Based on Classification Methods

    OpenAIRE

    Zeņina, N; Borisovs, A

    2011-01-01

    Mode choice analysis has received the most attention among discrete choice problems in the travel behavior literature. Most traditional mode choice models are based on the principle of random utility maximization derived from econometric theory. This paper investigates the performance of mode choice analysis with classification methods: decision trees, discriminant analysis and multinomial logit. Experimental results have demonstrated satisfactory classification quality.
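
    A minimal sketch of the comparison the abstract describes, on synthetic trip records; the features, class labels and hyperparameters are assumptions, and a multinomial logistic regression stands in for the multinomial logit.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      # Stand-in features: travel time, cost, distance; modes: 0=car, 1=bus, 2=bike.
      X = rng.normal(size=(600, 3))
      y = rng.integers(0, 3, size=600)

      models = {
          "decision tree": DecisionTreeClassifier(max_depth=4),
          "discriminant analysis": LinearDiscriminantAnalysis(),
          "multinomial logit": LogisticRegression(max_iter=1000),
      }
      for name, model in models.items():
          acc = cross_val_score(model, X, y, cv=5).mean()
          print(f"{name}: mean CV accuracy = {acc:.2f}")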

  13. Encounter-based worms: Analysis and Defense

    CERN Document Server

    Tanachaiwiwat, Sapon

    2007-01-01

    An encounter-based network is a frequently disconnected wireless ad-hoc network requiring immediate neighbors to store and forward aggregated data for information dissemination. Using traditional approaches such as gateways or firewalls for deterring worm propagation in encounter-based networks is inappropriate. We propose the worm interaction approach that relies upon automated beneficial worm generation aiming to alleviate problems of worm propagation in such networks. To understand the dynamics of worm interactions and their performance, we mathematically model worm interactions based on major worm interaction factors including worm interaction types, network characteristics, and node characteristics using ordinary differential equations, and analyze their effects on our proposed metrics. We validate our proposed model using extensive synthetic and trace-driven simulations. We find that all worm interaction factors significantly affect the pattern of worm propagation. For example, immunization linearly decrea...
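
    A minimal sketch of the general modelling style the abstract mentions: an SIR-like interaction between a malicious worm and a beneficial "predator" worm, solved with ordinary differential equations. The rate constants and the specific coupling are illustrative assumptions, not the authors' model.

      import numpy as np
      from scipy.integrate import odeint

      def worm_interaction(state, t, beta_m, beta_b, immunize):
          s, i_m, i_b = state           # susceptible, malicious-infected, beneficial-infected fractions
          new_malicious = beta_m * s * i_m
          new_beneficial = beta_b * s * i_b
          cured = immunize * i_b * i_m  # beneficial worm patches malicious-infected nodes
          ds = -new_malicious - new_beneficial
          di_m = new_malicious - cured
          di_b = new_beneficial + cured
          return [ds, di_m, di_b]

      t = np.linspace(0, 50, 500)
      trajectory = odeint(worm_interaction, [0.98, 0.01, 0.01], t, args=(0.5, 0.4, 0.3))
      print("final fraction still malicious:", trajectory[-1, 1])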

  14. Accelerator based techniques for aerosol analysis

    International Nuclear Information System (INIS)

    At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy) an external beam facility is fully dedicated to PIXE-PIGE measurements of the elemental composition of atmospheric aerosols. Examples regarding recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be shown how PIXE can provide unique information in aerosol studies or can play a complementary role to traditional chemical analysis. Finally, a short presentation of 14C analysis of the atmospheric aerosol by Accelerator Mass Spectrometry (AMS) for the evaluation of the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity) will be given. (author)

  15. Structural Analysis of Plate Based Tensegrity Structures

    DEFF Research Database (Denmark)

    Hald, Frederik; Kirkegaard, Poul Henning; Damkilde, Lars

    2013-01-01

    Plate tensegrity structures combine tension cables with a cross-laminated timber plate and can then form e.g. a roof structure. The topology of plate tensegrity structures is investigated through a parametric investigation, and a method for determination of the structures' pre-stresses is used. A parametric investigation is performed to determine a more optimized form of the plate-based tensegrity structure. Conclusions on the use of plate-based tensegrity in civil engineering and further research areas are discussed.

  16. Symbolic Analysis of OTRAs-Based Circuits

    Directory of Open Access Journals (Sweden)

    C. Sánchez-López

    2011-04-01

    Full Text Available A new nullor-based model to describe the behavior of Operational Transresistance Amplifiers (OTRAs) is introduced. The new model is composed of four nullors and three grounded resistors. As a consequence, standard nodal analysis can be applied to compute fully-symbolic small-signal characteristics of OTRA-based analog circuits, and the nullor-based OTRA model can be used in CAD tools. In this manner, the fully-symbolic transfer functions of several application circuits, such as filters and oscillators, can easily be approximated.

  17. Iris recognition based on subspace analysis

    OpenAIRE

    Pravin S.Patil

    2014-01-01

    Biometrics deals with the uniqueness of an individual arising from their physiological or behavioral characteristics for the purpose of personal identification. Among many biometric techniques, iris recognition is one of the most promising approaches. This paper presents traditional subspace analysis methods for iris recognition. Initially, the eye images are localized in circular form using Daugman’s grid method and the circular Hough transform. The algorithms for subspace analy...

  18. Texton-based analysis of paintings

    Science.gov (United States)

    van der Maaten, Laurens J. P.; Postma, Eric O.

    2010-08-01

    The visual examination of paintings is traditionally performed by skilled art historians using their eyes. Recent advances in intelligent systems may support art historians in determining the authenticity or date of creation of paintings. In this paper, we propose a technique for the examination of brushstroke structure that views the wildly overlapping brushstrokes as texture. The analysis of the painting texture is performed with the help of a texton codebook, i.e., a codebook of small prototypical textural patches. The texton codebook can be learned from a collection of paintings. Our textural analysis technique represents paintings in terms of histograms that measure the frequency with which the textons in the codebook occur in the painting (so-called texton histograms). We present experiments that show the validity and effectiveness of our technique for textural analysis on a collection of digitized high-resolution reproductions of paintings by Van Gogh and his contemporaries. As texton histograms cannot easily be interpreted by art experts, the paper proposes two approaches to visualize the results of the textural analysis. The first approach visualizes the similarities between the histogram representations of paintings by employing a recently proposed dimensionality reduction technique, called t-SNE. We show that t-SNE reveals a clear separation of paintings created by Van Gogh and those created by other painters. In addition, the period of creation is faithfully reflected in the t-SNE visualizations. The second approach visualizes the similarities and differences between paintings by highlighting regions in a painting in which the textural structure of the painting is unusual. We illustrate the validity of this approach by means of an experiment in which we highlight regions in a painting by Monet that are not very "Van Gogh-like". Taken together, we believe the tools developed in this study are well capable of assisting art historians in support of
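
    A minimal sketch of the texton idea (assumed patch size, codebook size and random stand-in images, not the authors' pipeline): learn a codebook of small textural patches with k-means, then describe each painting by a histogram of its nearest textons.

      import numpy as np
      from sklearn.cluster import KMeans

      def extract_patches(image, size=5, step=5):
          """Collect non-overlapping size x size patches as flat vectors."""
          h, w = image.shape
          patches = [image[r:r + size, c:c + size].ravel()
                     for r in range(0, h - size + 1, step)
                     for c in range(0, w - size + 1, step)]
          return np.array(patches)

      rng = np.random.default_rng(0)
      paintings = [rng.random((100, 100)) for _ in range(4)]  # stand-ins for digitized scans

      codebook = KMeans(n_clusters=32, n_init=10, random_state=0)
      codebook.fit(np.vstack([extract_patches(img) for img in paintings]))

      def texton_histogram(image):
          # Assign each patch to its nearest texton and count how often each texton occurs.
          labels = codebook.predict(extract_patches(image))
          hist = np.bincount(labels, minlength=codebook.n_clusters)
          return hist / hist.sum()

      print(texton_histogram(paintings[0])[:5])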

  19. CLUSTERING-BASED ANALYSIS OF TEXT SIMILARITY

    OpenAIRE

    Bovcon , Borja

    2013-01-01

    The focus of this thesis is the comparison of text-document similarity analysis using clustering algorithms. We begin by defining the main problem and then proceed to describe the two most used text-document representation techniques, where we present word filtering methods and their importance, Porter's algorithm and the tf-idf term weighting algorithm. We then proceed to apply all previously described algorithms to selected data sets, which vary in size and compactness. Following this, we ...
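
    A minimal sketch of the pipeline described above: tf-idf weighting (with stop-word filtering standing in for the word-filtering step) followed by k-means clustering of the documents. The toy corpus is an assumption.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.cluster import KMeans
      from sklearn.metrics.pairwise import cosine_similarity

      docs = [
          "stock markets fell sharply on interest rate fears",
          "the central bank raised interest rates again",
          "the team won the championship after extra time",
          "injury forces the striker to miss the final match",
      ]
      tfidf = TfidfVectorizer(stop_words="english")
      X = tfidf.fit_transform(docs)

      print("pairwise cosine similarity:\n", cosine_similarity(X).round(2))
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
      print("cluster labels:", labels)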

  20. Wavelet Based Fractal Analysis of Airborne Pollen

    OpenAIRE

    Degaudenzi, M. E.; Arizmendi, C. M.

    1998-01-01

    The most abundant biological particles in the atmosphere are pollen grains and spores. Self-protection against pollen allergy is possible through information on future pollen content in the air. In spite of the importance of airborne pollen concentration forecasting, it has not been possible to predict the pollen concentrations with great accuracy, and about 25% of the daily pollen forecasts have resulted in failures. Previous analysis of the dynamic characteristics of atmospheric pollen tim...

  1. Thanatophoric dysplasia: case-based bioethical analysis

    Directory of Open Access Journals (Sweden)

    Edgar Abarca López

    2014-04-01

    Full Text Available This paper presents a case report of thanatophoric dysplasia diagnosed in the prenatal period using ultrasound standards. The course of the pregnancy, the birth process, and the postnatal period is described. This report invites bioethical analysis using its principles, appealing to human dignity, diversity and otherness, particularly in the mother-child dyad and their family. An early diagnosis allows parental support as they face the course of this condition and its potentially fatal outcome.

  2. Thanatophoric dysplasia: case-based bioethical analysis

    OpenAIRE

    Edgar Abarca López; Alejandra Rodríguez Torres; Donovan Casas Patiño; Esteban Espíndola Benítez

    2014-01-01

    This paper presents a case report of thanatophoric dysplasia diagnosed in the prenatal period using ultrasound standards. The course of the pregnancy, the birth process, and the postnatal period is described. This report invites bioethical analysis using its principles, appealing to human dignity, diversity and otherness, particularly in the mother-child dyad and their family. An early diagnosis allows parental support as they face the course of this condition and its potentially fatal outcome.

  3. Movement Pattern Analysis Based on Sequence Signatures

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Chavoshi

    2015-09-01

    Full Text Available Increased affordability and deployment of advanced tracking technologies have led researchers from various domains to analyze the resulting spatio-temporal movement data sets for the purpose of knowledge discovery. Two different approaches can be considered in the analysis of moving objects: quantitative analysis and qualitative analysis. This research focuses on the latter and uses the qualitative trajectory calculus (QTC), a type of calculus that represents qualitative data on moving point objects (MPOs), and establishes a framework to analyze the relative movement of multiple MPOs. A visualization technique called sequence signature (SESI) is used, which enables mapping QTC patterns in a 2D indexed rasterized space in order to evaluate the similarity of relative movement patterns of multiple MPOs. The applicability of the proposed methodology is illustrated by means of two practical examples of interacting MPOs: cars on a highway and body parts of a samba dancer. The results show that the proposed method can be effectively used to analyze interactions of multiple MPOs in different domains.

  4. Remote sensing based on hyperspectral data analysis

    Science.gov (United States)

    Sharifahmadian, Ershad

    In remote sensing, accurate identification of far objects, especially concealed objects, is difficult. In this study, to improve object detection from a distance, hyperspectral imaging and wideband technology are employed, with the emphasis on wideband radar. As the wideband data include a broad range of frequencies, they can reveal information about both the surface of the object and its content. Two main contributions are made in this study: 1) Developing the concept of return loss for target detection: Unlike typical radar detection methods, which use the radar cross section to detect an object, it is possible to enhance the process of detection and identification of concealed targets using wideband radar based on the electromagnetic characteristics --conductivity, permeability, permittivity, and return loss-- of materials. During the identification process, the collected wideband data are evaluated against information from a wideband signature library which has already been built. In fact, several classes (e.g. metal, wood, etc.) and subclasses (e.g. metals with high conductivity) have been defined based on their electromagnetic characteristics. Materials in a scene are then classified based on these classes. As an example, materials with high electrical conductivity can be conveniently detected. In fact, increasing relative conductivity leads to a reduction in the return loss. Therefore, metals with high conductivity (e.g. copper) show stronger radar reflections compared with metals with low conductivity (e.g. stainless steel). Thus, it is possible to appropriately discriminate copper from stainless steel. 2) Target recognition techniques: To detect and identify targets, several techniques have been proposed, in particular the Multi-Spectral Wideband Radar Image (MSWRI), which is able to localize and identify concealed targets. The MSWRI is based on the theory of the robust Capon beamformer. During the identification process, information from the wideband signature library is utilized

  5. Google glass based immunochromatographic diagnostic test analysis

    Science.gov (United States)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.

  6. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework for describing phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.

  7. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels;

    2010-01-01

    directly for implementation into a computer aided reasoning tool for HAZOP studies to perform root cause and consequence analysis. Such a tool will facilitate finding causes far away from the site of the deviation. A Functional HAZOP Assistant is proposed and investigated in a HAZOP study of an industrial...... to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of...

  8. Communication Error Analysis Method based on CREAM

    International Nuclear Information System (INIS)

    Communication errors have been considered a primary cause of many incidents and accidents in the nuclear industry. In order to prevent these accidents, an analysis method for communication errors is proposed. This study presents a qualitative method to analyze communication errors. The qualitative method focuses on finding the root cause of a communication error and predicting the type of communication error which could happen in nuclear power plants. We develop context conditions and antecedent-consequent links of influential factors related to communication errors. A case study has been conducted to validate the applicability of the proposed methods

  9. Web Application Comprehension Based on Dependence Analysis

    Institute of Scientific and Technical Information of China (English)

    WU Jun-hua; XU Bao-wen; JIANG Ji-xiang

    2004-01-01

    Much research indicates that a lot of money and time are spent on maintaining and modifying delivered programs. So policies to support program comprehension are very important. Program comprehension is a crucial and difficult task. Insufficient design, illogical code structure, and sparse documentation increase the difficulty of comprehension. Developing a Web application is usually a process with quick implementation and delivery. In addition, a Web application is generally coded by combining markup language statements with embedded applets. Such a programming mode adversely affects the comprehension of Web applications. This paper proposes a method for improving the understanding of Web applications through dependence analysis and slicing technology.

  10. Analysis and Protection of SIP based Services

    OpenAIRE

    Ferdous, Raihana

    2014-01-01

    Multimedia communications over IP are booming as they offer higher flexibility and more features than traditional voice and video services. IP telephony, known as Voice over IP (VoIP), is one of the commercially most important emerging trends in multimedia communications over IP. Due to its flexibility and descriptive power, the Session Initiation Protocol (SIP) is becoming the root of many session-based applications such as VoIP and media streaming that are used by a growing number of use...

  11. Analysis of Hashrate-Based Double Spending

    OpenAIRE

    Rosenfeld, Meni

    2014-01-01

    Bitcoin is the world's first decentralized digital currency. Its main technical innovation is the use of a blockchain and hash-based proof of work to synchronize transactions and prevent double-spending the currency. While the qualitative nature of this system is well understood, there is widespread confusion about its quantitative aspects and how they relate to attack vectors and their countermeasures. In this paper we take a look at the stochastic processes underlying typical attacks and th...
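
    The kind of quantity the paper analyzes can be illustrated with the Poisson approximation of the attacker's catch-up probability after z confirmations from Nakamoto's whitepaper; Rosenfeld's paper gives a more exact treatment, so the sketch below and its hashrate values are illustrative assumptions only.

      import math

      def double_spend_probability(q, z):
          """Probability an attacker with hashrate fraction q eventually overtakes z confirmations."""
          p = 1.0 - q
          if q >= p:
              return 1.0
          lam = z * q / p  # expected attacker progress while z honest blocks are found
          prob = 1.0
          for k in range(z + 1):
              poisson = math.exp(-lam) * lam ** k / math.factorial(k)
              prob -= poisson * (1.0 - (q / p) ** (z - k))
          return prob

      for q in (0.1, 0.25, 0.4):
          print(f"q={q}: P(success after 6 confirmations) = {double_spend_probability(q, 6):.4f}")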

  12. Interest Based Financial Intermediation: Analysis and Solutions

    OpenAIRE

    Shaikh, Salman

    2012-01-01

    Interest is prohibited in all monotheist religions. Apart from religion, interest is also regarded as unjust price of money capital by pioneer secular philosophers as well as some renowned economists. However, it is argued by some economists that modern day, market driven interest rate in a competitive financial market is different from usury and that the interest based financial intermediation has served a useful purpose in allocation of resources as well as in allocation of risk, given the ...

  13. Value-Based Analysis of Mobile Tagging

    OpenAIRE

    Oguzhan Aygoren; Kaan Varnali

    2011-01-01

    Innovative use of the mobile medium in delivering customer value presents unprecedented opportunities for marketers. Various types of mobile applications have evolved to provide ubiquitous and instant customer service to capitalize on this opportunity. One application is mobile tagging, a mobile-based innovative tool for convergence marketing. The accumulated academic knowledge on mobile marketing lacks consumer-centric information about this phenomenon. This paper addresses this issue and co...

  14. Confidence-Based Learning in Investment Analysis

    Science.gov (United States)

    Serradell-Lopez, Enric; Lara-Navarra, Pablo; Castillo-Merino, David; González-González, Inés

    The aim of this study is to determine the effectiveness of using multiple-choice tests in subjects related to administration and business management. To this end we used a multiple-choice test with specific questions to verify the extent of knowledge gained and the confidence and trust in the answers. The tests were performed in a group of 200 students on the bachelor's degree in Business Administration and Management. The analysis was implemented in one subject within the scope of investment analysis and measured the level of knowledge gained and the degree of trust and confidence in the responses at two different times during the course. The measurements took into account different levels of difficulty in the questions asked and the time spent by students to complete the test. The results confirm that students are generally able to gain more knowledge along the way and show increases in the degree of trust and confidence in their answers. It is confirmed that the difficulty level of the questions, set a priori by the subject coordinators, is related to the levels of security and confidence in the answers. It is estimated that the improvement in the skills learned is viewed favourably by businesses and is especially important for the job placement of students.

  15. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    OpenAIRE

    Kiuru Aaro; Kormano Martti; Svedström Erkki; Liang Jianming; Järvi Timo

    2003-01-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion ana...

  16. Engineering Analysis Using a Web-based Protocol

    Science.gov (United States)

    Schoeffler, James D.; Claus, Russell W.

    2002-01-01

    This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.

  17. Transfer entropy--a model-free measure of effective connectivity for the neurosciences.

    Science.gov (United States)

    Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon

    2011-02-01

    Understanding causal relationships, or effective connectivity, between parts of the brain is of utmost importance because a large part of the brain's activity is thought to be internally generated and, hence, quantifying stimulus-response relationships alone does not fully describe brain dynamics. Past efforts to determine effective connectivity mostly relied on model-based approaches such as Granger causality or dynamic causal modeling. Transfer entropy (TE) is an alternative measure of effective connectivity based on information theory. TE does not require a model of the interaction and is inherently non-linear. We investigated the applicability of TE as a metric in a test for effective connectivity to electrophysiological data based on simulations and magnetoencephalography (MEG) recordings in a simple motor task. In particular, we demonstrate that TE improved the detectability of effective connectivity for non-linear interactions, and for sensor-level MEG signals where linear methods are hampered by signal cross-talk due to volume conduction. PMID:20706781
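
    A minimal sketch of a binned (histogram) transfer entropy estimator for two scalar time series, TE(X -> Y) at lag 1. Real MEG analyses use state-space embedding, bias correction and significance testing; this only illustrates the formula TE = Σ p(y_{t+1}, y_t, x_t) log[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ] on synthetic data.

      import numpy as np

      def transfer_entropy(x, y, bins=8):
          # Discretize both series into histogram bins.
          x_b = np.digitize(x, np.histogram_bin_edges(x, bins))
          y_b = np.digitize(y, np.histogram_bin_edges(y, bins))
          y_next, y_now, x_now = y_b[1:], y_b[:-1], x_b[:-1]

          def joint_prob(*cols):
              counts = {}
              for key in zip(*cols):
                  counts[key] = counts.get(key, 0) + 1
              total = len(cols[0])
              return {k: v / total for k, v in counts.items()}

          p_xyz = joint_prob(y_next, y_now, x_now)
          p_yz = joint_prob(y_now, x_now)
          p_yy = joint_prob(y_next, y_now)
          p_y = joint_prob(y_now)

          te = 0.0
          for (yn, yc, xc), p in p_xyz.items():
              # p(y_next | y_now, x_now) / p(y_next | y_now)
              te += p * np.log2((p / p_yz[(yc, xc)]) / (p_yy[(yn, yc)] / p_y[(yc,)]))
          return te

      rng = np.random.default_rng(0)
      x = rng.normal(size=2000)
      y = np.roll(x, 1) + 0.5 * rng.normal(size=2000)  # y is driven by past x
      print("TE(x -> y) =", transfer_entropy(x, y), "bits")
      print("TE(y -> x) =", transfer_entropy(y, x), "bits")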

  18. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.

  19. Network Anomaly Detection Based on Wavelet Analysis

    Science.gov (United States)

    Lu, Wei; Ghorbani, Ali A.

    2008-12-01

    Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
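
    A minimal sketch of the general wavelet-approximation idea (using PyWavelets, which the paper does not necessarily use): approximate a traffic feature with a coarse wavelet reconstruction and flag time points whose residual is unusually large. The signal, wavelet choice and threshold are assumptions.

      import numpy as np
      import pywt

      rng = np.random.default_rng(0)
      traffic = np.sin(np.linspace(0, 20, 1024)) + 0.1 * rng.normal(size=1024)
      traffic[700:710] += 3.0  # injected anomaly

      coeffs = pywt.wavedec(traffic, "db4", level=5)
      # Keep only the approximation coefficients to model the "normal" baseline.
      coeffs_approx = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
      baseline = pywt.waverec(coeffs_approx, "db4")[: len(traffic)]

      residual = np.abs(traffic - baseline)
      threshold = residual.mean() + 3 * residual.std()
      print("anomalous indices:", np.where(residual > threshold)[0])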

  20. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  1. Mental EEG Analysis Based on Infomax Algorithm

    Institute of Scientific and Technical Information of China (English)

    WU Xiao-pei; GUO Xiao-jing; ZANG Dao-xin; SHEN Qian

    2004-01-01

    The patterns of EEG will change with the mental tasks performed by the subject. In the field of EEG signal analysis and application, the study of obtaining mental EEG patterns and then using them to classify mental tasks has significant scientific meaning and great application value. However, because of the different artifacts existing in EEG, the pattern detection of EEG under normal mental states is a very difficult problem. In this paper, Independent Component Analysis is applied to EEG signals collected while performing different mental tasks. The experimental results show that when one subject performs a single mental task in different trials, the independent components of the EEG are very similar. This means that the independent components can be used as mental EEG patterns to classify the different mental tasks.
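
    A minimal sketch of an ICA decomposition of multichannel EEG-like signals. FastICA is used here as a readily available stand-in for the Infomax algorithm named in the title, and the mixed sources are synthetic assumptions.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0, 10, 2000)
      sources = np.c_[np.sin(7 * t),              # rhythmic "task" component
                      np.sign(np.sin(3 * t)),     # square-wave artifact
                      rng.normal(size=t.size)]    # background noise
      mixing = rng.normal(size=(8, 3))            # 8 "electrodes", 3 sources
      eeg = sources @ mixing.T

      ica = FastICA(n_components=3, random_state=0)
      estimated_sources = ica.fit_transform(eeg)  # shape (n_samples, 3)
      print("estimated source shape:", estimated_sources.shape)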

  2. Wavelet Based Fractal Analysis of Airborne Pollen

    CERN Document Server

    Degaudenzi, M E

    1999-01-01

    The most abundant biological particles in the atmosphere are pollen grains and spores. Self-protection against pollen allergy is possible through information on future pollen content in the air. In spite of the importance of airborne pollen concentration forecasting, it has not been possible to predict the pollen concentrations with great accuracy, and about 25% of the daily pollen forecasts have resulted in failures. Previous analysis of the dynamic characteristics of atmospheric pollen time series indicates that the system can be described by a low-dimensional chaotic map. We apply the wavelet transform to study the multifractal characteristics of an airborne pollen time series. We find the persistence behaviour associated with low pollen concentration values and with the rarest events of highest pollen concentration values. The information and the correlation dimensions correspond to a chaotic system showing loss of information with time evolution.

  3. Computational based functional analysis of Bacillus phytases.

    Science.gov (United States)

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate; it digests the otherwise indigestible phytate present in seeds and grains and therefore provides digestible phosphorus, calcium and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. The Bacillus phytase is very suitable for use in animal feed because of its optimum pH and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens to explore their physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917

  4. Face Recognition Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Ali Javed

    2013-02-01

    Full Text Available The purpose of the proposed research work is to develop a computer system that can recognize a person by comparing the characteristics of the face to those of known individuals. The main focus is on frontal two-dimensional images taken in a controlled environment, i.e., the illumination and the background are constant. All other methods of person identification and verification, such as iris scan or fingerprint scan, require high-quality and costly equipment, but for face recognition we only require a normal camera giving us a 2-D frontal image of the person that will be used for the recognition process. The Principal Component Analysis technique has been used in the proposed face recognition system. The purpose is to compare the results of the technique under different conditions and to find the most efficient approach for developing a facial recognition system

  5. Building Extraction from LIDAR Based Semantic Analysis

    Institute of Scientific and Technical Information of China (English)

    YU Jie; YANG Haiquan; TAN Ming; ZHANG Guoning

    2006-01-01

    Extraction of buildings from LIDAR data has been an active research field in recent years. A scheme for building detection and reconstruction from LIDAR data is presented with an object-oriented method which is based on the buildings' semantic rules. Two key steps are discussed: how to group the discrete LIDAR points into single objects and how to establish the buildings' semantic rules. In the end, the buildings are reconstructed in 3D form and three common parametric building models (flat, gabled, hipped) are implemented.

  6. Trajectory Based Behavior Analysis for User Verification

    Science.gov (United States)

    Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah

    Many of our activities on computers need a verification step for authorized access. The goal of verification is to tell apart the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for catching the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.

  7. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Field (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on a K-means clustering technique and the minimum distance. Then the region process is modeled by MRF to obtain an image that contains different intensity regions. The gradient values are calculated and then the watershed technique is used. A DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, and the DIS map is obtained. This serves as prior knowledge about possible region boundaries for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
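
    A minimal sketch of the edge-then-watershed portion of such a pipeline using scikit-image (the MRF region model is omitted). The synthetic image and the marker thresholds are assumptions for illustration.

      import numpy as np
      from skimage.filters import sobel
      from skimage.segmentation import watershed

      # Synthetic image: two bright blobs on a dark background.
      image = np.zeros((128, 128))
      image[30:60, 30:60] = 1.0
      image[70:110, 70:110] = 0.6
      image += 0.05 * np.random.default_rng(0).normal(size=image.shape)

      gradient = sobel(image)                    # edge strength (DIS-like map)
      markers = np.zeros_like(image, dtype=int)  # seeds for the watershed
      markers[image < 0.2] = 1                   # background
      markers[image > 0.5] = 2                   # foreground regions
      labels = watershed(gradient, markers)

      print("segment sizes:", np.bincount(labels.ravel())[1:])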

  8. The Route Analysis Based On Flight Plan

    Science.gov (United States)

    Feriyanto, Nur; Saleh, Chairul; Fauzi, Achmad; Rachman Dzakiyullah, Nur; Riza Iwaputra, Kahfi

    2016-02-01

    Economic development affects the use of air transportation, since business processes in every aspect have increased. Many people these days prefer using airplanes because they can save time and money. This situation also affects flight routes; many airlines offer new routes to deal with competition. Managing flight routes is one of the problems that must be faced in order to find efficient and effective routes. This paper investigates the best routes based on flight performance by determining the amount of block fuel for the Jakarta-Denpasar flight route. Moreover, this work compares two kinds of aircraft and tracks by calculating flight distance, flight time and block fuel. The result shows that the Jakarta-Denpasar route on Track II has effective and efficient block fuel when flown by the Airbus 320-200 aircraft. This study can contribute to practice in making effective decisions, especially helping the executive management of a company in selecting the appropriate aircraft and track in the flight plan based on block fuel consumption for business operations.

  9. Network reliability analysis based on percolation theory

    International Nuclear Information System (INIS)

    In this paper, we propose a new way of looking at the reliability of a network using percolation theory. In this new view, a network failure can be regarded as a percolation process and the critical threshold of percolation can be used as a network failure criterion linked to the operational settings under control. To demonstrate our approach, we consider both random network models and real networks with different node and/or edge lifetime distributions. We study the network reliability numerically and theoretically and find that it can be solved as a voting system with a threshold given by percolation theory. We then find that the average lifetime of a random network increases linearly with the average lifetime of its nodes under uniform lifetime distributions. Furthermore, the average lifetime of the network becomes saturated when the system size is increased. Finally, we demonstrate our method on the IEEE 14-bus transmission network system. - Highlights: • Based on percolation theory, we address questions of practical interest such as “how many failed nodes/edges will break down the whole network?” • The percolation threshold naturally gives a network failure criterion. • The approach based on percolation theory is suited for calculations of large-scale networks
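
    A minimal sketch of the percolation view of reliability (a random graph stands in for a real network; the size, edge probability and failure fractions are assumptions): remove a growing fraction of nodes at random and watch the relative size of the largest connected component.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      g = nx.erdos_renyi_graph(n=500, p=0.01, seed=0)

      for fraction_failed in (0.1, 0.3, 0.5, 0.7, 0.9):
          failed = rng.choice(list(g.nodes), size=int(fraction_failed * g.number_of_nodes()),
                              replace=False)
          h = g.copy()
          h.remove_nodes_from(failed)
          giant = max((len(c) for c in nx.connected_components(h)), default=0)
          print(f"failed {fraction_failed:.0%}: giant component holds "
                f"{giant / g.number_of_nodes():.1%} of the original nodes")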

  10. Design Intelligent Model-free Hybrid Guidance Controller for Three Dimension Motor

    Directory of Open Access Journals (Sweden)

    Abdol Majid Mirshekaran

    2014-10-01

    Full Text Available The minimum-rule-base Proportional Integral Derivative (PID) fuzzy hybrid guidance controller for a three-dimensional spherical motor is presented in this research. A three-dimensional spherical motor is well equipped with conventional control techniques and, in particular, various PID controllers which demonstrate good performance and successfully solve different guidance problems. Guidance control in a three-dimensional spherical motor is performed by the PID controllers producing the control signals which are applied to the system's torque. The necessary reference inputs for a PID controller are usually supplied by the system's sensors based on different data. The popularity of the PID fuzzy hybrid guidance controller can be attributed to its robust performance in a wide range of operating conditions and partly to its functional simplicity. The PID methodology has three inputs; if each input is described with seven linguistic values and each rule has three conditions, we will need 343 rules. It is too much work to write 343 rules. In this research the PID-like fuzzy controller is constructed as a parallel structure of a PD-like fuzzy controller and a conventional PI controller in order to have the minimum rule base. A linear-type PID controller is used to modify the PID fuzzy logic theory and design the hybrid guidance methodology. This research aims to reduce or eliminate the problems of fuzzy and conventional PID controllers based on the minimum-rule-base fuzzy logic theory, modified by the PID method, to control the spherical motor system and to test the quality of process control in the MATLAB/SIMULINK simulation environment.

  11. SENTIMENT ANALYSIS OF DOCUMENT BASED ON ANNOTATION

    Directory of Open Access Journals (Sweden)

    Archana Shukla

    2011-11-01

    Full Text Available I present a tool which tells the quality of a document or its usefulness based on annotations. Annotations may include comments, notes, observations, highlights, underlines, explanations, questions or help, etc. Comments are used for evaluative purposes while others are used for summarization or for expansion. Further, these comments may be on another annotation. Such annotations are referred to as meta-annotations. All annotations may not get equal weightage. My tool considers highlights and underlines as well as comments to infer the collective sentiment of annotators. Collective sentiments of annotators are classified as positive, negative, or objective. My tool computes the collective sentiment of annotations in two manners. It counts all the annotations present on the document, and it also computes sentiment scores of all annotations, including comments, to obtain the collective sentiment about the document or to judge the quality of the document. I demonstrate the use of the tool on a research paper.

  12. Analysis of Vehicle-Based Security Operations

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Jason M [ORNL; Paul, Nate R [ORNL

    2015-01-01

    Vehicle-to-vehicle (V2V) communications promises to increase roadway safety by providing each vehicle with 360-degree situational awareness of other vehicles in proximity, and by complementing onboard sensors such as radar or camera in detecting imminent crash scenarios. In the United States, approximately three hundred million automobiles could participate in a fully deployed V2V system if Dedicated Short-Range Communication (DSRC) device use becomes mandatory. The system's reliance on continuous communication, however, provides a potential means for unscrupulous persons to transmit false data in an attempt to cause crashes, create traffic congestion, or simply render the system useless. V2V communications must be highly scalable while retaining robust security and privacy-preserving features to meet the intra-vehicle and vehicle-to-infrastructure communication requirements for a growing vehicle population. Oak Ridge National Laboratory is investigating a Vehicle-Based Security System (VBSS) to provide security and privacy for a fully deployed V2V and V2I system. In the VBSS an On-board Unit (OBU) generates short-term certificates and signs Basic Safety Messages (BSM) to preserve privacy and enhance security. This work outlines a potential VBSS structure and its operational concepts; it examines how a vehicle-based system might feasibly provide security and privacy, highlights remaining challenges, and explores potential mitigations to address those challenges. Certificate management alternatives that attempt to meet V2V security and privacy requirements have been examined previously by the research community, including privacy-preserving group certificates, shared certificates, and functional encryption. Due to real-world operational constraints, adopting one of these approaches for VBSS V2V communication is difficult. Timely misbehavior detection and revocation are still open problems for any V2V system. We explore the alternative approaches that may be

  13. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  14. Structural Analysis Using Computer Based Methods

    Science.gov (United States)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allow for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  15. Mammogram-based discriminant fusion analysis for breast cancer diagnosis.

    Science.gov (United States)

    Li, Jun-Bao; Wang, Yun-Heng; Tang, Lin-Lin

    2012-01-01

    Mammogram-based classification is an important and effective way of performing computer-aided diagnosis (CAD) of breast cancer. In this paper, we present a novel discriminant fusion analysis (DFA)-based mammogram classification method for CAD-based breast cancer diagnosis. The discriminative breast tissue features are extracted and fused by DFA, and DFA achieves the optimal fusion coefficients. The largest class discriminant in the fused feature space is achieved by DFA for classification. Besides the detailed theoretical derivation, many experimental evaluations are carried out on the Mammography Image Analysis Society mammogram database for breast cancer diagnosis. PMID:23153999

  16. Bayesian analysis for EMP damaged function based on Weibull distribution

    International Nuclear Information System (INIS)

    The Weibull distribution is one of the most commonly used statistical distributions in EMP vulnerability analysis. In this paper, the EMP damage function of solid-state relays, based on the Weibull distribution, was solved by Bayesian computation using the Gibbs sampling algorithm. (authors)

  17. Indoor air quality analysis based on Hadoop

    International Nuclear Information System (INIS)

    The air of the office environment is our research object. The data on temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system, and all the data are stored in the HBase database of the Hadoop platform. With the help of the HBase features of column-oriented storage and versioning (the time column is added automatically), the time-series data sets are built based on the primary key (row key) and timestamp. The parallel computing programming model MapReduce is used to process the millions of data records collected by the sensors. By analysing the changing trend of the parameters' values at different times of the same day and at the same time on various dates, the impact of the human factor and other factors on the room microenvironment is assessed according to the movement of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper

  18. Indoor air quality analysis based on Hadoop

    Science.gov (United States)

    Tuo, Wang; Yunhua, Sun; Song, Tian; Liang, Yu; Weihong, Cui

    2014-03-01

    The air of the office environment is our research object. The data on temperature, humidity, and concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system, and all the data are stored in the HBase database of the Hadoop platform. With the help of the HBase features of column-oriented storage and versioning (the time column is added automatically), the time-series data sets are built based on the primary key (row key) and timestamp. The parallel computing programming model MapReduce is used to process the millions of data records collected by the sensors. By analysing the changing trend of the parameters' values at different times of the same day and at the same time on various dates, the impact of the human factor and other factors on the room microenvironment is assessed according to the movement of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper.
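
    A minimal sketch of the MapReduce pattern the paper applies, in pure Python rather than on a Hadoop cluster: map each sensor reading to a (parameter, hour-of-day) key, then reduce by averaging, so trends at the same time of day can be compared. The row layout is an assumption standing in for rows scanned from the HBase table.

      from collections import defaultdict

      readings = [
          {"param": "co2",  "timestamp": "2014-03-01T09:15", "value": 620},
          {"param": "co2",  "timestamp": "2014-03-02T09:40", "value": 580},
          {"param": "temp", "timestamp": "2014-03-01T09:05", "value": 21.5},
          {"param": "temp", "timestamp": "2014-03-02T09:55", "value": 22.0},
      ]

      def map_phase(row):
          hour = row["timestamp"][11:13]          # key on hour of day
          yield (row["param"], hour), row["value"]

      def reduce_phase(key, values):
          return key, sum(values) / len(values)   # average reading per (parameter, hour)

      grouped = defaultdict(list)
      for row in readings:
          for key, value in map_phase(row):
              grouped[key].append(value)

      for key, values in grouped.items():
          print(reduce_phase(key, values))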

  19. Surveillance data bases, analysis, and standardization program

    Energy Technology Data Exchange (ETDEWEB)

    Kam, F.B.K.

    1990-09-26

    The traveler presented a paper at the Seventh ASTM-EURATOM Symposium on Reactor Dosimetry and co-chaired an oral session on Computer Codes and Methods. Papers of considerable interest to the NRC Surveillance Dosimetry Program involved statistically based adjustment procedures and uncertainties. The information exchange meetings with Czechoslovakia and Hungary were very enlightening. Lack of large computers has hindered their surveillance programs. They depended very heavily on information from their measurement programs, which were somewhat limited because of the lack of sophisticated electronics. The Nuclear Research Institute at Rez had to rely on expensive mockups of power reactor configurations to test their fluence exposures. Computers, computer codes, and updated nuclear data would advance their technology rapidly, and they were not hesitant to admit this fact. Both eastern-bloc countries said that IBM is providing an IBM 3090 for educational purposes but research and development studies would have very limited access. They were very apologetic that their currencies were not convertible; any exchange means that they could provide services or pay for US scientists in their respective countries, but funding for their scientists in the United States, or expenses that involve payment in dollars, must come from us.

  20. Activation analysis based on secondary nuclear reactions

    International Nuclear Information System (INIS)

    Various types of analytical techniques founded on the achievements of nuclear physics are used. There are two directions in the use of the main sources of nuclear projectiles in the development of nuclear methods. In the first, the particles from the source are used directly for the excitation of nuclear reactions. In the second, the particles from the source are used to generate intermediate particles of other types, which are used in turn for the excitation of secondary nuclear reactions. In our research, neutrons are used to generate secondary charged particles, which serve for the excitation of nuclear reactions on elements with small atomic numbers. There are two variants, in which both thermal and fast neutrons are used: 1) a triton flow is produced by a thermal neutron flux, which excites the nuclear reaction 6Li(n, α)T on lithium; 2) recoil protons are produced as the result of (n, p) elastic or inelastic scattering interactions of fast neutrons with the nuclei of light elements, for example hydrogen. In this work the theoretical basis of the application of secondary nuclear reactions excited by recoil protons was investigated

  1. On spectral methods for variance based sensitivity analysis

    OpenAIRE

    Alexanderian, Alen

    2013-01-01

    Consider a mathematical model with a finite number of random parameters. Variance based sensitivity analysis provides a framework to characterize the contribution of the individual parameters to the total variance of the model response. We consider the spectral methods for variance based sensitivity analysis which utilize representations of square integrable random variables in a generalized polynomial chaos basis. Taking a measure theoretic point of view, we provide a rigorous and at the sam...
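
    The abstract discusses spectral (polynomial chaos) methods; the sketch below estimates the same first-order Sobol' indices with a plain pick-freeze Monte Carlo estimator instead, on an arbitrary toy model, so the response function and sample sizes are assumptions.

      import numpy as np

      def model(x):
          # toy response: strongly driven by x1, weakly by x2, not at all by x3
          return 3.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.0 * x[:, 2]

      rng = np.random.default_rng(0)
      n, d = 100_000, 3
      a = rng.uniform(-1, 1, size=(n, d))
      b = rng.uniform(-1, 1, size=(n, d))
      f_a, f_b = model(a), model(b)
      var_total = f_a.var()

      for i in range(d):
          ab_i = b.copy()
          ab_i[:, i] = a[:, i]            # "freeze" coordinate i from sample A
          f_ab = model(ab_i)
          s_i = np.mean(f_a * (f_ab - f_b)) / var_total   # first-order Sobol' estimator
          print(f"S_{i + 1} ≈ {s_i:.3f}")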

  2. Architecture Level Dependency Analysis of SOA Based System through π-ADL

    OpenAIRE

    Pawan Kumar; Ratneshwer

    2016-01-01

    A formal Architecture Description Language (ADL) provides an effective way to perform dependency analysis at an early stage of development. π-ADL is an ADL that represents the static and dynamic features of software services. In this paper, we describe an approach for dependency analysis of SOA (Service Oriented Architecture) based systems, at the architecture level, through π-ADL. A set of algorithms is also proposed for the identification of dependency relationships in a SOA based system. The proposed algori...

  3. Toward farm-based policy analysis: concepts applied in Haiti

    OpenAIRE

    Martinez, Juan Carlos; Sain, Gustavo; Yates, Michael

    1991-01-01

    Many policies - on the delivery of inputs or on marketing systems, credit, or extension - influence the potential utilization of new technologies. Through 'farm-based policy analysis' it is possible to use data generated in on-farm research (OFR) to identify policy constraints to the use of new technologies, and to effectively communicate that information to policy makers. This paper describes a tentative framework for farm-based policy analysis and suggests a sequence of five steps for the a...

  4. Multilevel Solvers with Aggregations for Voxel Based Analysis of Geomaterials

    OpenAIRE

    Blaheta, R. (Radim); V. Sokol

    2012-01-01

    Our motivation for voxel based analysis comes from the investigation of geomaterials (geocomposites) arising from rock grouting or sealing. We use finite element analysis based on voxel data from tomography. The arising finite element systems are large scale, which motivates the use of multilevel iterative solvers or preconditioners. Among others we concentrate on multilevel Schwarz preconditioners with aggregations. The aggregations are efficient even in the case of problems with hete...

  5. Product Profitability Analysis Based on EVA and ABC

    OpenAIRE

    Chen Lin; Shuangyuan Wang; Zhilin Qiao

    2013-01-01

    For the purpose of maximizing shareholders’ value, profitability analysis established on the basis of traditional accounting earnings cannot meet the demand for accurate decision-making information for enterprises. Therefore, this paper implements Activity Based Costing (ABC) and Economic Value Added (EVA) into the traditional profitability analysis system, sets up an improved EVA-ABC based profitability analysis system as well as its relative indexes, and applies it to the s...

  6. Empirical validation and comparison of models for customer base analysis

    OpenAIRE

    Persentili Batislam, Emine; Denizel, Meltem; Filiztekin, Alpay

    2007-01-01

    The benefits of retaining customers lead companies to search for means to profile their customers individually and track their retention and defection behaviors. To this end, the main issues addressed in customer base analysis are identification of customer active/inactive status and prediction of future purchase levels. We compare the predictive performance of Pareto/NBD and BG/NBD models from the customer base analysis literature — in terms of repeat purchase levels and active status — usi...

  7. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  8. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    The presented paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
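
    As a hedged sketch of the two recurrence quantification measures named above, the following toy computes the recurrence rate (RR) and determinism (DET) of a one-dimensional series. The threshold, the minimum diagonal length and the test signals are illustrative assumptions rather than the authors' settings, and no windowing or trace-log parsing is shown.

```python
import numpy as np

def recurrence_measures(x, eps, l_min=2):
    """Recurrence rate (RR) and determinism (DET) of a 1-D series.

    Recurrence: |x_i - x_j| < eps. DET is the fraction of recurrent points
    that lie on diagonal lines of length >= l_min.
    """
    n = len(x)
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(int)
    rr = R.sum() / (n * n)

    diag_points = 0
    for k in range(-(n - 1), n):            # every diagonal of the recurrence matrix
        d = np.diagonal(R, offset=k)
        run = 0
        for v in np.append(d, 0):           # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= l_min:
                    diag_points += run
                run = 0
    det = diag_points / max(R.sum(), 1)
    return rr, det

t = np.linspace(0, 20 * np.pi, 400)
print(recurrence_measures(np.sin(t), eps=0.1))                                   # regular signal: high DET
print(recurrence_measures(np.random.default_rng(1).normal(size=400), eps=0.1))   # noise: low DET
```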

  9. PYTHON-based Physics Analysis Environment for LHCb

    CERN Document Server

    Belyaev, I; Mato, P; Barrand, G; Tsaregorodtsev, A; de Oliveira, E

    2004-01-01

    BENDER is the PYTHON based physics analysis application for LHCb. It combines the best features of the underlying GAUDI software architecture with the flexibility of the PYTHON scripting language and provides end-users with a friendly physics analysis oriented environment.

  10. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  11. An IBM-PC based reactor neutronics analysis package

    International Nuclear Information System (INIS)

    The development of a comprehensive system of microcomputer-based codes suitable for neutronics and shielding analysis of nuclear reactors has been undertaken by EG&G Idaho, Inc., at the Idaho National Engineering Laboratory (INEL). This system has been designed for cross section generation, one-dimensional discrete-ordinates analysis, one-, two-, and three-dimensional diffusion theory analysis, and various other radiation transport applications of interest.

  12. Benefits of Computer Based Content Analysis to Foresight

    OpenAIRE

    Kováříková, Ludmila; Grosová, Stanislava

    2014-01-01

    Purpose of the article: The present manuscript summarizes benefits of the use of computer-based content analysis in a generation phase of foresight initiatives. Possible advantages, disadvantages and limitations of the content analysis for the foresight projects are discussed as well. Methodology/methods: In order to specify the benefits and identify the limitations of the content analysis within the foresight, results of the generation phase of a particular foresight project perf...

  13. Optimisation of NMR dynamic models II. A new methodology for the dual optimisation of the model-free parameters and the Brownian rotational diffusion tensor

    International Nuclear Information System (INIS)

    Finding the dynamics of an entire macromolecule is a complex problem as the model-free parameter values are intricately linked to the Brownian rotational diffusion of the molecule, mathematically through the autocorrelation function of the motion and statistically through model selection. The solution to this problem was formulated using set theory as an element of the universal set U, the union of all model-free spaces (d'Auvergne EJ and Gooley PR (2007) Mol BioSyst 3(7), 483-494). The current procedure commonly used to find the universal solution is to initially estimate the diffusion tensor parameters, to optimise the model-free parameters of numerous models, and then to choose the best model via model selection. The global model is then optimised and the procedure repeated until convergence. In this paper a new methodology is presented which takes a different approach to this diffusion seeded model-free paradigm. Rather than starting with the diffusion tensor this iterative protocol begins by optimising the model-free parameters in the absence of any global model parameters, selecting between all the model-free models, and finally optimising the diffusion tensor. The new model-free optimisation protocol will be validated using synthetic data from Schurr JM et al. (1994) J Magn Reson B 105(3), 211-224 and the relaxation data of the bacteriorhodopsin (1-36)BR fragment from Orekhov VY (1999) J Biomol NMR 14(4), 345-356. To demonstrate the importance of this new procedure the NMR relaxation data of the Olfactory Marker Protein (OMP) of Gitti R et al. (2005) Biochem 44(28), 9673-9679 is reanalysed. The result is that the dynamics of certain secondary structural elements are very different from those originally reported.
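
    For readers unfamiliar with the model-free parameters mentioned here, the sketch below fits the original Lipari-Szabo spectral density (order parameter S² and internal correlation time τe, with an assumed isotropic global correlation time) to synthetic spectral density values by χ² minimisation. It is a minimal illustration of a single model-free model only, not the dual optimisation protocol described above, and the field frequencies, noise level and global correlation time are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

TM = 8e-9      # assumed isotropic global (Brownian) correlation time in seconds

def j_model_free(omega, s2, te_ps):
    """Original Lipari-Szabo model-free spectral density, returned in ns."""
    te = te_ps * 1e-12                              # internal correlation time (s)
    tau = TM * te / (TM + te)                       # effective correlation time (s)
    j = 0.4 * (s2 * TM / (1 + (omega * TM) ** 2)
               + (1 - s2) * tau / (1 + (omega * tau) ** 2))
    return j * 1e9

# Synthetic "measured" spectral density values at a few field-dependent frequencies.
omega = 2 * np.pi * np.array([0.0, 60e6, 500e6, 560e6, 600e6])
truth = j_model_free(omega, 0.85, 50.0)
noisy = truth * (1 + 0.02 * np.random.default_rng(0).normal(size=truth.size))

# chi-squared minimisation of S^2 and tau_e against the synthetic data.
(s2_fit, te_fit), _ = curve_fit(j_model_free, omega, noisy, p0=[0.5, 100.0],
                                bounds=([0.0, 0.1], [1.0, 1000.0]))
print(f"S^2 = {s2_fit:.3f}, tau_e = {te_fit:.0f} ps")
```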

  14. Geometric-model-free tracking of extended targets using 3D lidar measurements

    Science.gov (United States)

    Steinemann, Philipp; Klappstein, Jens; Dickmann, Juergen; von Hundelshausen, Felix; Wünsche, Hans-Joachim

    2012-06-01

    Tracking of extended targets in high definition, 360-degree 3D-LIDAR (Light Detection and Ranging) measurements is a challenging task and a current research topic. It is a key component in robotic applications, and is relevant to path planning and collision avoidance. This paper proposes a new method without a geometric model to simultaneously track and accumulate 3D-LIDAR measurements of an object. The method itself is based on a particle filter and uses an object-related local 3D grid for each object. No geometric object hypothesis is needed. Accumulation allows coping with occlusions. The prediction step of the particle filter is governed by a motion model consisting of a deterministic and a probabilistic part. Since this paper is focused on tracking ground vehicles, a bicycle model is used for the deterministic part. The probabilistic part depends on the current state of each particle. A function for calculating the current probability density function for state transition is developed. It is derived in detail and based on a database consisting of vehicle dynamics measurements over several hundreds of kilometers. The adaptive probability density function narrows down the gating area for measurement data association. The second part of the proposed method addresses weighting the particles with a cost function. Different 3D-grid-dependent cost functions are presented and evaluated. Evaluations with real 3D-LIDAR measurements show the performance of the proposed method. The results are also compared to ground truth data.
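
    A heavily simplified, hedged sketch of the particle-filter core (predict, weight, resample) is given below for a 2D point target. The constant-velocity prediction and Gaussian weighting stand in for the bicycle motion model and the 3D-grid cost functions of the paper, and all noise levels and the synthetic measurements are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_particles, dt, sigma_meas = 500, 0.1, 0.5

# Particle state: [x, y, vx, vy]; initial spread around the origin.
particles = rng.normal(0.0, 1.0, size=(n_particles, 4))
weights = np.full(n_particles, 1.0 / n_particles)

def step(particles, weights, z):
    # Predict: constant-velocity motion plus process noise (stand-in for the bicycle model).
    particles[:, :2] += particles[:, 2:] * dt
    particles += rng.normal(0.0, 0.05, size=particles.shape)

    # Weight: Gaussian likelihood of the position measurement z (stand-in for the grid cost function).
    d2 = np.sum((particles[:, :2] - z) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / sigma_meas ** 2) + 1e-300   # guard against underflow
    weights /= weights.sum()

    # Systematic resampling when the effective sample size collapses.
    if 1.0 / np.sum(weights ** 2) < n_particles / 2:
        idx = rng.choice(n_particles, size=n_particles, p=weights)
        particles, weights = particles[idx], np.full(n_particles, 1.0 / n_particles)
    return particles, weights

for t in range(50):
    z = np.array([t * dt, 0.5 * t * dt]) + rng.normal(0, sigma_meas, 2)   # synthetic object centroid
    particles, weights = step(particles, weights, z)

print("state estimate:", np.average(particles, axis=0, weights=weights))
```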

  15. Some Linguistic-based and temporal analysis on Wikipedia

    International Nuclear Information System (INIS)

    Wikipedia, as a web-based, collaborative, multilingual encyclopaedia project, is a very suitable field in which to carry out research on social dynamics and to investigate the complex concepts of conflict, collaboration, competition, dispute, etc. in a large community (∼26 million) of Wikipedia users. The other face of Wikipedia, as a productive society, is its output, consisting of ∼17 million articles written, unsupervised, by non-professional editors in more than 270 different languages. In this talk we report some analyses performed on Wikipedia using two different approaches: temporal analysis to characterize disputes and controversies among users, and linguistic-based analysis to characterize linguistic features of English texts in Wikipedia. (author)

  16. Alignment analysis of urban railways based on passenger travel demand

    DEFF Research Database (Denmark)

    Andersen, Jonas Lohmann Elkjær; Landex, Alex

    2010-01-01

    This article presents a computerised GIS based methodology that can be used as decision support for selecting the best alignment. The methodology calculates travel potential within defined buffers surrounding the alignment. The methodology has three different approaches depending on the desired level of detail: the simple but straightforward-to-implement line potential approach that performs corridor analysis, the detailed catchment area analysis based on stops on the alignment, and the refined service area analysis that uses search distances in street networks. All three approaches produce trustworthy results...

  17. Chemical Cytometry: Fluorescence-Based Single-Cell Analysis

    Science.gov (United States)

    Cohen, Daniella; Dickerson, Jane A.; Whitmore, Colin D.; Turner, Emily H.; Palcic, Monica M.; Hindsgaul, Ole; Dovichi, Norman J.

    2008-07-01

    Cytometry deals with the analysis of the composition of single cells. Flow and image cytometry employ antibody-based stains to characterize a handful of components in single cells. Chemical cytometry, in contrast, employs a suite of powerful analytical tools to characterize a large number of components. Tools have been developed to characterize nucleic acids, proteins, and metabolites in single cells. Whereas nucleic acid analysis employs powerful polymerase chain reaction-based amplification techniques, protein and metabolite analysis tends to employ capillary electrophoresis separation and ultrasensitive laser-induced fluorescence detection. It is now possible to detect yoctomole amounts of many analytes in single cells.

  18. Model-free information-theoretic approach to infer leadership in pairs of zebrafish

    Science.gov (United States)

    Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio

    2016-04-01

    Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
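
    The following is a minimal, hedged sketch of a histogram-based transfer entropy estimate on a synthetic coupled pair of time series, standing in for the calibrated zebrafish model described above. The lag of one step, the bin count and the toy autoregressive coupling are illustrative assumptions; practical analyses would use more careful estimators.

```python
import numpy as np

def transfer_entropy(x, y, bins=8):
    """Estimate T(X->Y): how much the past of x improves prediction of y,
    beyond y's own past, using simple histogram probabilities (lag 1)."""
    yf, yp, xp = y[1:], y[:-1], x[:-1]              # future of y, past of y, past of x
    joint3, _ = np.histogramdd(np.column_stack([yf, yp, xp]), bins=bins)
    p3 = joint3 / joint3.sum()
    p_yp_xp = p3.sum(axis=0)                        # P(y_past, x_past)
    p_yf_yp = p3.sum(axis=2)                        # P(y_future, y_past)
    p_yp = p3.sum(axis=(0, 2))                      # P(y_past)

    te = 0.0
    for i, j, k in zip(*np.nonzero(p3)):
        te += p3[i, j, k] * np.log2(
            p3[i, j, k] * p_yp[j] / (p_yp_xp[j, k] * p_yf_yp[i, j]))
    return te

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 1] + 0.3 * rng.normal()   # x drives y

print("T(X->Y) =", round(transfer_entropy(x, y), 3))   # clearly positive
print("T(Y->X) =", round(transfer_entropy(y, x), 3))   # close to zero, up to estimation bias
```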

  19. Fatigue analysis of steam generator cassette parts based on CAE

    International Nuclear Information System (INIS)

    Fatigue analysis has been performed for steam generator nozzle header and tube based on CAE. Three dimensional model was produced using the commercial CAD program, IDEAS and the geometry and boundary condition information have been transformed into input format of ABAQUS for thermal analysis, stress analysis, and fatigue analysis. Cassette nozzle, which has a complex geometry, has been analysed by using the three dimensional model. But steam generator tube has been analysed according to ASME procedure since it can be modelled as a two dimensional finite element model. S-N curve for the titanium alloy of the steam generator tube material was obtained from the material tests. From the analysis, it has been confirmed that these parts of the steam generator cassette satisfy the lifetime of the steam generator cassette. Three dimensional modelling strategy from the thermal analysis to fatigue analysis should be implemented into the design of reactor major components to enhance the efficiency of design procedure

  20. Statistical analysis of MRI-only based dose planning

    DEFF Research Database (Denmark)

    Korsholm, M. E.; Waring, L. W.; Paulsen, Rasmus Reinhold; Edmund, J. M.

    2 %. Conclusions: The investigated DVH points show that MRI-only based RT seems to be a feasible alternative to CT based RT. However, the analysis only describes similarities in DVH points and not in the shape of the DVH. Even though the mean differences are nonsignificant there might be...

  1. Agent-based analysis of organizations : formalization and simulation

    OpenAIRE

    Dignum, M.V.; Tick, C.

    2008-01-01

    Organizational effectiveness depends on many factors, including individual excellence, efficient structures, effective planning and capability to understand and match context requirements. We propose a way to model organizational performance based on a combination of formal models and agent-based simulation that supports the analysis of the congruence of different organizational structures to changing environments

  2. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard;

    2011-01-01

    complex timber structure with a large number of failure modes is modelled with only a few dominant failure modes. First, a component based robustness analysis is performed based on the reliability indices of the remaining elements after the removal of selected critical elements. The robustness is...

  3. An Analysis of an Improved Bus-Based Multiprocessor Architecture

    Science.gov (United States)

    Ricks, Kenneth G.; Wells, B. Earl

    1998-01-01

    This paper analyses the effectiveness of a hybrid multiprocessing/multicomputing architecture that is based upon a single-board-computer multiprocessor (SBCM) architecture. Based upon empirical analysis using discrete event simulations and Monte Carlo techniques, this hybrid architecture, called the enhanced single-board-computer multiprocessor (ESBCM), is shown to have improved performance and scalability characteristics over current SBCM designs.

  4. Differential Regulatory Analysis Based on Coexpression Network in Cancer Research.

    Science.gov (United States)

    Li, Junyi; Li, Yi-Xue; Li, Yuan-Yuan

    2016-01-01

    With the rapid development of high-throughput techniques and the accumulation of big transcriptomic data, plenty of computational methods and algorithms such as differential analysis and network analysis have been proposed to explore genome-wide gene expression characteristics. These efforts aim to transform the underlying genomic information into valuable knowledge in biological and medical research fields. Recently, numerous integrative research methods have been dedicated to interpreting the development and progress of neoplastic diseases, whereas differential regulatory analysis (DRA) based on gene coexpression networks (GCN) increasingly serves as a robust complement to regular differential expression analysis in revealing the regulatory functions of cancer-related genes such as evading growth suppressors and resisting cell death. Differential regulatory analysis based on GCN is promising and plays an essential role in discovering the system properties of carcinogenesis features. Here we briefly review the paradigm of differential regulatory analysis based on GCN. We also focus on the applications of differential regulatory analysis based on GCN in cancer research and point out that DRA is necessary and uniquely valuable for revealing the underlying molecular mechanisms in large-scale carcinogenesis studies. PMID:27597964

  5. Differential Regulatory Analysis Based on Coexpression Network in Cancer Research

    Directory of Open Access Journals (Sweden)

    Junyi Li

    2016-01-01

    With the rapid development of high-throughput techniques and the accumulation of big transcriptomic data, plenty of computational methods and algorithms such as differential analysis and network analysis have been proposed to explore genome-wide gene expression characteristics. These efforts aim to transform the underlying genomic information into valuable knowledge in biological and medical research fields. Recently, numerous integrative research methods have been dedicated to interpreting the development and progress of neoplastic diseases, whereas differential regulatory analysis (DRA) based on gene coexpression networks (GCN) increasingly serves as a robust complement to regular differential expression analysis in revealing the regulatory functions of cancer-related genes such as evading growth suppressors and resisting cell death. Differential regulatory analysis based on GCN is promising and plays an essential role in discovering the system properties of carcinogenesis features. Here we briefly review the paradigm of differential regulatory analysis based on GCN. We also focus on the applications of differential regulatory analysis based on GCN in cancer research and point out that DRA is necessary and uniquely valuable for revealing the underlying molecular mechanisms in large-scale carcinogenesis studies.
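
    As a hedged illustration of one common ingredient of DRA, the sketch below compares gene-pair Pearson correlations between two conditions and ranks the pairs whose coexpression changes the most. The expression matrices are synthetic and the scoring is a simplified stand-in for the network-based methods reviewed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_samples = 50, 40

# Synthetic expression matrices (genes x samples) for two conditions.
normal = rng.normal(size=(n_genes, n_samples))
tumour = rng.normal(size=(n_genes, n_samples))
tumour[1] = 0.9 * tumour[0] + 0.1 * rng.normal(size=n_samples)   # genes 0-1 coexpressed only in tumour

c_normal = np.corrcoef(normal)
c_tumour = np.corrcoef(tumour)

# Differential coexpression score: absolute change in correlation per gene pair.
diff = np.abs(c_tumour - c_normal)
iu = np.triu_indices(n_genes, k=1)
order = np.argsort(diff[iu])[::-1]

for rank in range(3):
    i, j = iu[0][order[rank]], iu[1][order[rank]]
    print(f"gene {i} - gene {j}: r_normal={c_normal[i, j]:+.2f}, r_tumour={c_tumour[i, j]:+.2f}")
```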

  6. Kernel-Based Nonlinear Discriminant Analysis for Face Recognition

    Institute of Scientific and Technical Information of China (English)

    LIU QingShan (刘青山); HUANG Rui (黄锐); LU HanQing (卢汉清); MA SongDe (马颂德)

    2003-01-01

    Linear subspace analysis methods have been successfully applied to extract features for face recognition. But they are inadequate to represent the complex and nonlinear variations of real face images, such as illumination, facial expression and pose variations, because of their linear properties. In this paper, a nonlinear subspace analysis method, Kernel-based Nonlinear Discriminant Analysis (KNDA), is presented for face recognition, which combines the nonlinear kernel trick with the linear subspace analysis method, Fisher Linear Discriminant Analysis (FLDA). First, the kernel trick is used to project the input data into an implicit feature space, then FLDA is performed in this feature space. Thus nonlinear discriminant features of the input data are yielded. In addition, in order to reduce the computational complexity, a geometry-based feature vectors selection scheme is adopted. Another similar nonlinear subspace analysis is Kernel-based Principal Component Analysis (KPCA), which combines the kernel trick with linear Principal Component Analysis (PCA). Experiments are performed with the polynomial kernel, and KNDA is compared with KPCA and FLDA. Extensive experimental results show that KNDA can give a higher recognition rate than KPCA and FLDA.
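
    The paper implements kernel discriminant analysis directly; as a rough, hedged approximation with off-the-shelf components, one can emulate the "kernel trick followed by FLDA" idea by projecting data with a polynomial-kernel KPCA and then running FLDA in that space. The toy two-moons data and the kernel settings below are illustrative assumptions, not the authors' face-recognition setup.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split

# Nonlinearly separable toy data standing in for face features.
X, y = make_moons(n_samples=400, noise=0.2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Kernel trick: map inputs into an implicit feature space with a polynomial kernel.
kpca = KernelPCA(n_components=10, kernel="poly", degree=3)
Z_tr = kpca.fit_transform(X_tr)
Z_te = kpca.transform(X_te)

# Fisher Linear Discriminant Analysis in the kernel feature space.
flda = LinearDiscriminantAnalysis().fit(Z_tr, y_tr)
print("linear FLDA accuracy:", LinearDiscriminantAnalysis().fit(X_tr, y_tr).score(X_te, y_te))
print("kernel + FLDA accuracy:", flda.score(Z_te, y_te))
```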

  7. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes

    Science.gov (United States)

    Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G.

    2016-01-01

    Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI. PMID:27441367

  8. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes.

    Science.gov (United States)

    Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G

    2016-01-01

    Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI. PMID:27441367

  9. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes.

    Directory of Open Access Journals (Sweden)

    Elena Daskalaki

    Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI.
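
    A minimal, hedged sketch of a one-step Actor-Critic update on a generic scalar regulation task is given below. The toy dynamics, quadratic value function, Gaussian policy and learning rates are illustrative assumptions and bear no relation to the insulin simulator or the information-transfer tuning described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def env_step(state, action):
    # Toy regulation task: the action should drive the state (a deviation from a
    # target) back towards zero; the reward penalises the remaining deviation.
    next_state = 0.9 * state + 0.5 * action + 0.1 * rng.normal()
    return next_state, -next_state ** 2

theta = 0.0                 # actor parameter: mean action = theta * state
w = 0.0                     # critic parameter: value estimate V(s) = w * s^2
alpha_actor, alpha_critic, gamma, sigma = 0.002, 0.01, 0.95, 0.2

state = 2.0
for _ in range(20000):
    action = theta * state + sigma * rng.normal()          # stochastic (Gaussian) policy
    next_state, reward = env_step(state, action)

    # Critic: temporal-difference error and value-function update.
    td_error = reward + gamma * w * next_state ** 2 - w * state ** 2
    w += alpha_critic * td_error * state ** 2

    # Actor: policy-gradient step using the log-likelihood gradient of the Gaussian policy.
    grad_log_pi = (action - theta * state) * state / sigma ** 2
    theta += alpha_actor * td_error * grad_log_pi

    state = float(np.clip(next_state, -10.0, 10.0))        # keep the toy environment bounded

print(f"learned feedback gain theta = {theta:.2f}")        # a negative gain counteracts deviations
```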

  10. A model-free definition of coupling strength for assessing the influence between climatic processes

    Science.gov (United States)

    Runge, J.; Kurths, J.

    2012-04-01

    Assessing the strength of influence between climatic processes from observational data is an important problem on the way to constructing conceptual models or making predictions. An example is the influence of ENSO on the Indian Monsoon compared to the influence of other climatic processes. It is an especially difficult task if the interactions are nonlinear, where linear measures like the Pearson correlation coefficient fail. Apart from nonlinearity, auto-dependencies in the processes can lead to misleadingly high values of coupling strength. There exist statistical methods that address these issues, but most of them assume some model, e.g., a linear model in the case of the partial correlation. We propose a measure based on conditional mutual information that makes no assumptions on the underlying model and is able to exclude auto-dependencies and even influences of external processes. We investigate how the measured strength relates to model systems where the coupling strength is known and discuss its limitations. The measure is applied to time series of different climate indices and gridded data sets to gain insights into the coupling strength between climatic teleconnections. Applied to more than two time series it is also able to shed light on mechanisms of interactions between multiple processes.

  11. Scatter to volume registration for model-free respiratory motion estimation from dynamic MRIs.

    Science.gov (United States)

    Miao, S; Wang, Z J; Pan, L; Butler, J; Moran, G; Liao, R

    2016-09-01

    Respiratory motion is one major complicating factor in many image acquisition applications and image-guided interventions. Existing respiratory motion estimation and compensation methods typically rely on breathing motion models learned from certain training data, and therefore may not be able to effectively handle intra-subject and/or inter-subject variations of respiratory motion. In this paper, we propose a respiratory motion compensation framework that directly recovers motion fields from sparsely spaced and efficiently acquired dynamic 2-D MRIs without using a learned respiratory motion model. We present a scatter-to-volume deformable registration algorithm to register dynamic 2-D MRIs with a static 3-D MRI to recover dense deformation fields. Practical considerations and approximations are provided to solve the scatter-to-volume registration problem efficiently. The performance of the proposed method was investigated on both synthetic and real MRI datasets, and the results showed significant improvements over the state-of-the-art respiratory motion modeling methods. We also demonstrated a potential application of the proposed method on MRI-based motion corrected PET imaging using hybrid PET/MRI. PMID:27180910

  12. Affine invariant texture analysis based on structural properties

    OpenAIRE

    Zhang, Jianguo; Tan, Tieniu

    2002-01-01

    This paper presents a new texture analysis method based on structural properties. The texture features extracted using this algorithm are invariant to affine transform (including rotation, translation, scaling, and skewing). Affine invariant structural properties are derived based on texel areas. An area-ratio map utilizing these properties is introduced to characterize texture images. Histogram based on this map is constructed for texture classification. Efficiency of this algorithm for affi...

  13. Computer-based modelling and analysis in engineering geology

    OpenAIRE

    Giles, David

    2014-01-01

    This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...

  14. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques with special emphasis and a brief on other techniques developed world over for mitigating earthquake forces on the structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  15. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Because traffic flow data are non-stationary, detecting abnormal data points is difficult. This paper proposes a method for abnormal traffic flow data detection based on wavelet analysis and the least squares method. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then the least squares method is applied to find abnormal points in the reconstructed signal. Simulation results show that using wavelet analysis for abnormal traffic flow data detection effectively reduces both the misjudgment rate and the false negative rate.
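
    A hedged sketch of the general idea follows, using PyWavelets for the decomposition and a least-squares polynomial fit of the low-frequency reconstruction as the "normal" profile. The wavelet family, decomposition level, 3σ threshold and synthetic traffic data are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np
import pywt

rng = np.random.default_rng(0)

# Synthetic traffic flow: daily trend + noise + a few injected abnormal points.
t = np.arange(288)                                   # e.g. 5-minute counts over one day
flow = 300 + 150 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 10, t.size)
flow[[60, 150, 230]] += [180, -160, 200]             # injected anomalies

# Wavelet decomposition: keep only the approximation (low-frequency) part.
coeffs = pywt.wavedec(flow, "db4", level=4)
low_freq = pywt.waverec([coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]], "db4")[: t.size]

# Least-squares polynomial fit of the low-frequency component as the "normal" profile.
tn = t / t.size                                       # normalised time for a well-conditioned fit
trend = np.polyval(np.polyfit(tn, low_freq, deg=6), tn)

# Flag points whose residual exceeds 3 standard deviations.
residual = flow - trend
outliers = np.where(np.abs(residual) > 3 * residual.std())[0]
print("detected abnormal indices:", outliers)
```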

  16. Adaptive Fourier Decomposition Based Time-Frequency Analysis

    Institute of Scientific and Technical Information of China (English)

    Li-Ming Zhang

    2014-01-01

    The attempt to represent a signal simultaneously in time and frequency domains is full of challenges. The recently proposed adaptive Fourier decomposition (AFD) offers a practical approach to solve this problem. This paper presents the principles of the AFD based time-frequency analysis in three aspects: instantaneous frequency analysis, frequency spectrum analysis, and the spectrogram analysis. An experiment is conducted and compared with the Fourier transform in convergence rate and short-time Fourier transform in time-frequency distribution. The proposed approach performs better than both the Fourier transform and short-time Fourier transform.

  17. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required for safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest as well as advanced physical modelling techniques to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA) which relies on the projections of Gutenberg-Richter (G-R) equation. The problems in the validity of G-R projections, because of incomplete to total absence of data for making the projections, are still unresolved. Consequently, the information from G-R must not be used in decisions for design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data and complemented by physical modelling techniques, which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited amount of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  18. Analysis of security protocols based on challenge-response

    Institute of Scientific and Technical Information of China (English)

    LUO JunZhou; YANG Ming

    2007-01-01

    Security protocol is specified as the procedure of challenge-response, which uses applied cryptography to confirm the existence of other principals and fulfill some data negotiation such as session keys. Most of the existing analysis methods, which either adopt theorem proving techniques such as state exploration or logic reasoning techniques such as authentication logic, face the conflicts between analysis power and operability. To solve the problem, a new efficient method is proposed that provides an SSM semantics-based definition of secrecy and authentication goals and applies authentication logic as the fundamental analysis technique, in which secrecy analysis is split into two parts: Explicit-Information-Leakage and Implicit-Information-Leakage, and correspondence analysis is concluded as the analysis of the existence relationship of Strands and the agreement of Strand parameters. This new method combines the power of the Strand Space Model with the concision of authentication logic.

  19. Virtual stress amplitude-based low cycle fatigue reliability analysis

    International Nuclear Information System (INIS)

    A method for virtual stress amplitude-based low cycle fatigue reliability analysis is developed. Unlike existing methods, probability-based modified Ramberg-Osgood stress-strain relations (P-ε-σ curves) are newly introduced to take into account the scatter of stress-strain responses where the metallurgical quality of the material (e.g. weld metal) is not good enough to show the same stress-strain response for different specimens under the same loading level. In addition, a virtual stress amplitude-based analysis is used to be in agreement with the existing codes for nuclear components, i.e. ASME Section III. The analysis is performed on the principle of treating the stochastic analysis system at the same safety level concurrently. Combining the probability-based modified Ramberg-Osgood stress-strain relations, the probability-based Langer S-N curves (P-S-N curves) and Neuber's local stress-strain rule, the method can be applied to predict the fatigue life at a specified reliability and loading history, and to estimate the reliability at a specified loading history and expected fatigue life. The applicability of the method has been demonstrated by a test analysis of 1Cr18Ni9Ti steel weld metal, which was used for machining the pipes of some nuclear reactors, under low cycle fatigue.
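
    For reference, the deterministic backbone that such an analysis typically randomises can be written in a commonly used generic form (symbols as usually defined); this is background material, not the paper's P-ε-σ or P-S-N formulation itself.

```latex
% Ramberg-Osgood cyclic stress-strain relation (K' = cyclic strength coefficient,
% n' = cyclic strain-hardening exponent):
\varepsilon_a \;=\; \frac{\sigma_a}{E} \;+\; \left(\frac{\sigma_a}{K'}\right)^{1/n'}

% Neuber's rule, linking the local notch stress-strain product to the nominal
% elastic solution (K_t = elastic stress concentration factor, S = nominal stress):
\sigma\,\varepsilon \;=\; \frac{\left(K_t\,S\right)^{2}}{E}
```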

  20. Bayesian-network-based safety risk analysis in construction projects

    International Nuclear Information System (INIS)

    This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. Fuzzy Bayesian Networks (FBN) is used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations on the current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which has a capacity of implementing deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis progress is extended to the entire life cycle of risk-prone events, including the pre-accident, during-construction continuous and post-accident control. A typical hazard concerning the tunnel leakage in the construction of Wuhan Yangtze Metro Tunnel in China is presented as a case study, in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of advantages and disadvantages between FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • Safety risk analysis progress is extended to entire life cycle of risk-prone events. • A typical

  1. A Framework for Web-Based Mechanical Design and Analysis

    Institute of Scientific and Technical Information of China (English)

    Chiaming Yen; Wujeng Li

    2002-01-01

    In this paper, a Web-based Mechanical Design and Analysis Framework (WMDAF) is proposed. This WMDAF allows designers to develop web-based computer aided programs in a systematic way during the collaborative mechanical system design and analysis process. This system is based on an emerging web-based Content Management System (CMS) called eXtended Object Oriented Portal System (XOOPS). Due to the Open Source Status of the XOOPS CMS, programs developed with this framework can be further customized to ...

  2. Tikhonov regularization-based operational transfer path analysis

    Science.gov (United States)

    Cheng, Wei; Lu, Yingying; Zhang, Zhousuo

    2016-06-01

    To overcome ill-posed problems in operational transfer path analysis (OTPA), and improve the stability of solutions, this paper proposes a novel OTPA based on Tikhonov regularization, which considers both fitting degrees and stability of solutions. Firstly, fundamental theory of Tikhonov regularization-based OTPA is presented, and comparative studies are provided to validate the effectiveness on ill-posed problems. Secondly, transfer path analysis and source contribution evaluations for numerical cases studies on spherical radiating acoustical sources are comparatively studied. Finally, transfer path analysis and source contribution evaluations for experimental case studies on a test bed with thin shell structures are provided. This study provides more accurate transfer path analysis for mechanical systems, which can benefit for vibration reduction by structural path optimization. Furthermore, with accurate evaluation of source contributions, vibration monitoring and control by active controlling vibration sources can be effectively carried out.
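
    The core numerical step can be sketched as a ridge-type regularised least-squares solve. The synthetic transmissibility matrix, the nearly collinear reference channels (which create the ill-posedness) and the fixed regularisation parameter below are illustrative assumptions, and the paper's method for choosing that parameter is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic OTPA data: responses at receivers = transmissibility matrix T applied
# to reference (source) signals X, plus measurement noise.
n_blocks, n_refs, n_recv = 40, 6, 2
T_true = rng.normal(size=(n_refs, n_recv))
X = rng.normal(size=(n_blocks, n_refs))
X[:, 5] = X[:, 4] + 0.01 * rng.normal(size=n_blocks)      # nearly collinear references -> ill-posed
Y = X @ T_true + 0.05 * rng.normal(size=(n_blocks, n_recv))

def tikhonov(X, Y, lam):
    """Solve min ||X T - Y||^2 + lam ||T||^2 (Tikhonov / ridge regularisation)."""
    n = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(n), X.T @ Y)

T_plain = tikhonov(X, Y, lam=0.0)      # ordinary least squares: unstable for collinear references
T_reg = tikhonov(X, Y, lam=1.0)        # regularised estimate

for name, T in [("plain", T_plain), ("tikhonov", T_reg)]:
    print(name, "relative error:", np.linalg.norm(T - T_true) / np.linalg.norm(T_true))
```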

  3. NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In the industrial process setting, principal component analysis (PCA) is a general method for data reconciliation. However, PCA is sometimes infeasible for nonlinear feature analysis and limited in its application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA and can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that the original data are first mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in that feature space. Nonlinear feature analysis is then performed and the data are reconstructed using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process, and the reconciled data can represent the true information of the nonlinear process.

  4. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    The human motion contains valuable information in many situations, and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has received great interest from both industry and research communities. The focus of this thesis is on video-based analysis of human motion, and the thesis presents work within three overall topics, namely foreground segmentation, action recognition, and human pose estimation. A multi-view approach to pose estimation is also presented that integrates low-level information from different cameras to generate better pose estimates during heavy occlusions. The works presented in this thesis contribute in these different areas of video-based analysis of human motion.

  5. Data Warehouse Requirements Analysis Framework: Business-Object Based Approach

    Directory of Open Access Journals (Sweden)

    Anirban Sarkar

    2012-01-01

    Detailed requirements analysis plays a key role in the design of a successful Data Warehouse (DW) system. The requirements analysis specifications are used as the prime input for the construction of a conceptual level multidimensional data model. This paper proposes a Business Object based requirements analysis framework for DW systems which is supported with an abstraction mechanism and reuse capability. It also facilitates the stepwise mapping of requirements descriptions into high level design components of a graph semantic based, conceptual level, object oriented multidimensional data model. The proposed framework starts with the identification of the analytical requirements using a business process driven approach and finally refines the requirements in further detail to map them into the conceptual level DW design model, using either a demand-driven or a mixed-driven approach for DW requirements analysis.

  6. Similar words analysis based on POS-CBOW language model

    Directory of Open Access Journals (Sweden)

    Dongru RUAN

    2015-10-01

    Similar-word analysis is one of the important aspects of natural language processing, and it has important research and application value in text classification, machine translation and information recommendation. Focusing on the features of Sina Weibo's short texts, this paper presents a language model named POS-CBOW, which is a continuous bag-of-words language model with a filtering layer and a part-of-speech tagging layer. The proposed approach can adjust word-vector similarity according to the cosine similarity and the word vectors' part-of-speech metrics. It can also filter the set of similar words on the basis of the statistical analysis model. The experimental results show that the similar-word analysis algorithm based on the proposed POS-CBOW language model is better than that based on the traditional CBOW language model.

  7. Applying measurement-based probabilistic timing analysis to buffer resources

    OpenAIRE

    Kosmidis L.; Vardanega T.; Abella J.; Quinones E.; Cazorla F.J.

    2013-01-01

    The use of complex hardware makes it difficult for current timing analysis techniques to compute trustworthy and tight worst-case execution time (WCET) bounds. Those techniques require detailed knowledge of the internal operation and state of the platform, at both the software and hardware level. Obtaining that information for modern hardware platforms is increasingly difficult. Measurement-Based Probabilistic Timing Analysis (MBPTA) reduces the cost of acquiring the knowledge needed for comp...

  8. UML based risk analysis - Application to a medical robot

    OpenAIRE

    Guiochet, Jérémie; Baron, Claude

    2004-01-01

    Medical robots perform complex tasks and share their working area with humans. Therefore, they belong to safety critical systems. In today's development process, safety is often managed by way of dependability techniques. We propose a new global approach, based on the risk concept, in order to guide designers through the safety analysis of such complex systems. Safety depends on the risk management activity, whose core is risk analysis. This consists of three steps: system definition, haz...

  9. Study of engine noise based on independent component analysis

    Institute of Scientific and Technical Information of China (English)

    HAO Zhi-yong; JIN Yan; YANG Chen

    2007-01-01

    Independent component analysis was applied to analyze the acoustic signals from a diesel engine. First the basic principle of independent component analysis (ICA) was reviewed. The diesel engine acoustic signal was decomposed into several independent components (ICs); Fourier transform and continuous wavelet transform (CWT) were applied to analyze the independent components. Different noise sources of the diesel engine were separated, based on the characteristics of the different components in the time-frequency domain.
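
    A hedged sketch of the separation step follows, using scikit-learn's FastICA on synthetic three-channel mixtures that stand in for multi-microphone engine recordings. The source signals and mixing matrix are synthetic, and the subsequent Fourier/CWT characterisation of each independent component is not shown.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 8000)

# Synthetic "engine" sources: an impulsive firing component, a tonal whine, and broadband noise.
s1 = np.sign(np.sin(2 * np.pi * 25 * t))           # combustion-like impulsive component
s2 = np.sin(2 * np.pi * 440 * t)                   # tonal component
s3 = 0.5 * rng.normal(size=t.size)                 # flow noise
S = np.column_stack([s1, s2, s3])

# Observed microphone signals: unknown linear mixtures of the sources.
A = np.array([[1.0, 0.6, 0.3],
              [0.4, 1.0, 0.5],
              [0.7, 0.2, 1.0]])
X = S @ A.T

# Blind separation into independent components (ICs).
ica = FastICA(n_components=3, random_state=0)
S_est = ica.fit_transform(X)
print("recovered component array shape:", S_est.shape)    # each column is one separated source
```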

  10. FDTD method based electromagnetic solver for printed-circuit analysis

    OpenAIRE

    Gnilenko, Alexey B.; Paliy, Oleg V.

    2003-01-01

    An electromagnetic solver for printed-circuit analysis is presented. The electromagnetic simulator is based on the finite-difference time-domain method with first-order Mur's absorbing boundary conditions. The solver environment comprises a layout graphic editor for circuit topology preparation and a data postprocessor for presenting the calculation results. The solver has been applied to the analysis of printed-circuit components such as printed antenna, microstrip discontinuities, etc.

  11. A Contrastive Analysis on Web-based Intercultural Peer Feedback

    OpenAIRE

    Qin Wang

    2013-01-01

    This paper made a contrastive analysis on peer feedback generated by Chinese EFL learners and Native Speakers of English, who participated in a web-based Cross-Pacific Writing Exchange program. The analysis mainly focused on differences in terms of commenting size, nature and function; the pragmatic differences between two groups of learners were investigated as well. The present study afforded us lessons for peer review training program and provided pedagogical implications for L2 writing.Ke...

  12. Reliable analysis for pressure vessel based on ANSYS

    International Nuclear Information System (INIS)

    With the PDS procedure of ANSYS, the randomness of the actual structural design parameters is simulated, taking the wall thickness, pressure load and elastic modulus as input random variables. Based on the reliability analysis of the pressure vessel by the Monte-Carlo procedure, the stress probability distribution of this finite element analysis model and the sensitivity of design parameters such as the pressure load and wall thickness to the stress distribution are obtained. (authors)

  13. iBarcode.org: web-based molecular biodiversity analysis

    OpenAIRE

    Hajibabaei Mehrdad; Singer Gregory AC

    2009-01-01

    Abstract Background DNA sequences have become a primary source of information in biodiversity analysis. For example, short standardized species-specific genomic regions, DNA barcodes, are being used as a global standard for species identification and biodiversity studies. Most DNA barcodes are being generated by laboratories that have an expertise in DNA sequencing but not in bioinformatics data analysis. Therefore, we have developed a web-based suite of tools to help the DNA barcode research...

  14. European Climate - Energy Security Nexus. A model based scenario analysis

    International Nuclear Information System (INIS)

    In this research, we have provided an overview of the climate-energy security nexus in the European sector through a model based scenario analysis with the POLES model. The analysis underlines that under stringent climate policies, Europe takes advantage of a double dividend: its capacity to develop a new, cleaner energy model and a lower vulnerability to potential shocks on the international energy markets. (authors)

  15. Using the DOE Knowledge Base for Special Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  16. Model-free CPPI

    OpenAIRE

    Alexander Schied

    2013-01-01

    We consider Constant Proportion Portfolio Insurance (CPPI) and its dynamic extension, which may be called Dynamic Proportion Portfolio Insurance (DPPI). It is shown that these investment strategies work within the setting of Föllmer's pathwise Itô calculus, which makes no probabilistic assumptions whatsoever. This shows, on the one hand, that CPPI and DPPI are completely independent of any choice of a particular model for the dynamics of asset prices. They even make sense beyond the class...
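
    The basic CPPI rule referred to here allocates a multiple of the cushion above the floor to the risky asset at each step; a minimal sketch on one synthetic price path follows. The multiplier, floor and return distribution are illustrative assumptions, and the strategy itself does not rely on any of them being correct.

```python
import numpy as np

rng = np.random.default_rng(0)

m, floor, v0, n_steps = 4.0, 80.0, 100.0, 250
returns = rng.normal(0.0002, 0.01, n_steps)         # one synthetic daily return path

value = v0
for r in returns:
    cushion = max(value - floor, 0.0)
    exposure = min(m * cushion, value)               # risky allocation = multiplier x cushion
    value = exposure * (1 + r) + (value - exposure)  # risk-free part assumed to earn 0 here

print(f"terminal portfolio value: {value:.2f} (floor {floor})")
```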

  17. AN HMM BASED ANALYSIS FRAMEWORK FOR SEMANTIC VIDEO EVENTS

    Institute of Scientific and Technical Information of China (English)

    You Junyong; Liu Guizhong; Zhang Yaxin

    2007-01-01

    Semantic video analysis plays an important role in the field of machine intelligence and pattern recognition. In this paper, based on the Hidden Markov Model (HMM), a semantic recognition framework on compressed videos is proposed to analyze the video events according to six low-level features. After a detailed analysis of video events, the pattern of global motion and five features of the foreground (the principal parts of videos) are employed as the observations of the Hidden Markov Model to classify events in videos. The application of the proposed framework to several video event detection tasks demonstrates its promising success in semantic video analysis.

  18. RULE-BASED SENTIMENT ANALYSIS OF UKRAINIAN REVIEWS

    Directory of Open Access Journals (Sweden)

    Mariana Romanyshyn

    2013-07-01

    The last decade witnessed a lot of research in the field of sentiment analysis. Understanding the attitude and the emotions that people express in written text proved to be really important and helpful in sociology, political science, psychology, market research, and, of course, artificial intelligence. This paper demonstrates a rule-based approach to clause-level sentiment analysis of reviews in Ukrainian. The general architecture of the implemented sentiment analysis system is presented, the current stage of research is described and further work is explained. The main emphasis is on the design of rules for computing sentiments.

  19. Open access for ALICE analysis based on virtualization technology

    CERN Document Server

    Buncic, P; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modi...

  20. A New Customer Segmentation Framework Based on Biclustering Analysis

    OpenAIRE

    Xiaohui Hu; Haolan Zhang; Xiaosheng Wu; Jianlin Chen; Yu Xiao; Yun Xue; Tiechen Li; Hongya Zhao

    2014-01-01

    The paper presents a novel approach to customer segmentation, which is the basic issue for effective CRM (Customer Relationship Management). Firstly, chi-square statistical analysis is applied to choose a set of attributes, and the K-means algorithm is employed to quantize the value of each attribute. Then the density-based DBSCAN algorithm is introduced to classify the customers into three groups (the first, the second and the third class). Finally biclustering based on improved Apriori alg...
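
    The pipeline described above can be sketched roughly as follows, assuming scikit-learn is available; the synthetic data, the auxiliary label used for chi-square scoring, and all thresholds are placeholders rather than the paper's settings, and the final biclustering stage is omitted.

```python
# Chi-square attribute selection -> per-attribute K-means quantization -> DBSCAN grouping.
import numpy as np
from sklearn.feature_selection import chi2
from sklearn.cluster import KMeans, DBSCAN

rng = np.random.default_rng(1)
X = rng.random((200, 6))                      # 200 customers, 6 non-negative attributes
y = rng.integers(0, 2, size=200)              # auxiliary label used only for chi-square scoring

scores, _ = chi2(X, y)
keep = np.argsort(scores)[-4:]                # keep the 4 highest-scoring attributes
X_sel = X[:, keep]

# Quantize each retained attribute into 3 levels with one-dimensional K-means
X_quant = np.column_stack([
    KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(col.reshape(-1, 1))
    for col in X_sel.T
]).astype(float)

labels = DBSCAN(eps=1.0, min_samples=5).fit_predict(X_quant)
print(np.unique(labels, return_counts=True))  # cluster ids (-1 = noise) and their sizes
```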

  1. Adaptive Human aware Navigation based on Motion Pattern Analysis

    DEFF Research Database (Denmark)

    Tranberg, Søren; Svenstrup, Mikael; Andersen, Hans Jørgen;

    2009-01-01

    are based on run-time motion pattern analysis compared to stored experience in a database. Using a potential field centered around the person, the robot positions itself at the most appropriate place relative to the person and the interaction status. The system is validated through qualitative tests...... in a real world setting. The results demonstrate that the system is able to learn to navigate based on past interaction experiences, and to adapt to different behaviors over time....

  2. AUTHENTICITY IN TASK-BASED INTERACTION: A CONVERSATION ANALYSIS PERSPECTIVE

    OpenAIRE

    HANAN WAER

    2009-01-01

    In recent years, there has been increasing interest in task-based learning. Authenticity has been characterized as a main aspect in defining a task (Long 1985; Skehan 1996; Ellis 2003). However, far too little attention has been paid to investigating authenticity in task-based interaction (TBI). To the best of the researcher's knowledge, no research has used conversation analysis (CA) to investigate authenticity in TBI. Therefore, the present paper focuses on the issue of authent...

  3. Performance monitoring of MPC based on dynamic principal component analysis

    OpenAIRE

    Tian, Xuemin; Chen, Gongquan; Cao, YuPing; Chen, Sheng

    2011-01-01

    A unified framework based on the dynamic principal component analysis (PCA) is proposed for performance monitoring of constrained multi-variable model predictive control (MPC) systems. In the proposed performance monitoring framework, the dynamic PCA based performance benchmark is adopted for performance assessment, while performance diagnosis is carried out using a unified weighted dynamic PCA similarity measure. Simulation results obtained from the case study of the Shell process demonstrat...

  4. Sentiment analysis framework organization based on twitter corpus data

    OpenAIRE

    Adela Beres

    2012-01-01

    Since its inception in 2006, Twitter has gathered millions of users. They post daily tweets about news, events or conversations. These tweets express their opinion about the topic they are discussing. Twitter is a large database of content that can be semantically exploited to extract opinions and based on these opinions to classify the users. This paper presents the organization of a sentiment analysis framework based on Twitter corpus data, including crawling tweets and opinion mining of th...

  5. Semiparametric theory based MIMO model and performance analysis

    Institute of Scientific and Technical Information of China (English)

    XU Fang-min; XU Xiao-dong; ZHANG Ping

    2007-01-01

    In this article, a new approach for modeling multi-input multi-output (MIMO) systems with unknown nonlinear interference is introduced. The semiparametric theory based MIMO model is established, and kernel estimation is applied to combat the nonlinear interference. Furthermore, we derive the MIMO capacity for these systems and explore the asymptotic properties of the new channel matrix via theoretical analysis. The simulation results show that semiparametric theory based modeling and kernel estimation are effective against this kind of interference.

  6. Dependence Analysis of Component Based Software through Assumptions

    OpenAIRE

    Ratneshwer; Tripathi, A. K.

    2011-01-01

    This study presents a quantitative approach for dependency analysis of Component Based Software (CBS) systems. Various types of dependency, in a CBS, have been observed through 'assumptions' and based on these observations some derived dependency relationships are proposed. The proposed dependency relationships are validated theoretically and an example illustration has been shown to demonstrate the proposal. The result of the study suggests that these dependency relationships may prove helpf...

  7. Nucleonica: Web-based Software Tools for Simulations and Analysis

    OpenAIRE

    Magill, Joseph; DREHER Raymond; SOTI Zsolt; LASCHE George

    2012-01-01

    The authors present a description of a new web-based software portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data res...

  8. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  9. Crawl-Based Analysis of Web Applications: Prospects and Challenges

    OpenAIRE

    Van Deursen, A.; Mesbah, A.; Nederlof, A.

    2014-01-01

    In this paper we review five years of research in the field of automated crawling and testing of web applications. We describe the open source Crawljax tool, and the various extensions that have been proposed in order to address such issues as cross-browser compatibility testing, web application regression testing, and style sheet usage analysis. Based on that we identify the main challenges and future directions of crawl-based testing of web applications. In particular, we explore ways to re...

  10. Dependence Analysis of Component Based Software through Assumptions

    Directory of Open Access Journals (Sweden)

    Ratneshwer

    2011-07-01

    Full Text Available This study presents a quantitative approach for dependency analysis of Component Based Software (CBS) systems. Various types of dependency in a CBS have been observed through 'assumptions', and based on these observations some derived dependency relationships are proposed. The proposed dependency relationships are validated theoretically and an example illustration is shown to demonstrate the proposal. The result of the study suggests that these dependency relationships may prove helpful in understanding CBS systems.

  11. The analysis of Al-based alloys by calorimetry: quantitative analysis of reactions and reaction kinetics

    OpenAIRE

    Starink, M.J.

    2004-01-01

    Differential scanning calorimetry (DSC) and isothermal calorimetry have been applied extensively to the analysis of light metals, especially Al based alloys. Isothermal calorimetry and differential scanning calorimetry are used for analysis of solid state reactions, such as precipitation, homogenisation, devitrification and recrystallisation; and solid–liquid reactions, such as incipient melting and solidification, are studied by differential scanning calorimetry. In producing repeatable calo...

  12. A model-free temperature-dependent conformational study of n-pentane in nematic liquid crystals

    Science.gov (United States)

    Burnell, E. Elliott; Weber, Adrian C. J.; Dong, Ronald Y.; Meerts, W. Leo; de Lange, Cornelis A.

    2015-01-01

    The proton NMR spectra of n-pentane orientationally ordered in two nematic liquid-crystal solvents are studied over a wide temperature range and analysed using covariance matrix adaptation evolutionary strategy. Since alkanes possess small electrostatic moments, their anisotropic intermolecular interactions are dominated by short-range size-and-shape effects. As we assumed for n-butane, the anisotropic energy parameters of each n-pentane conformer are taken to be proportional to those of ethane and propane, independent of temperature. The observed temperature dependence of the n-pentane dipolar couplings allows a model-free separation between conformer degrees of order and conformer probabilities, which cannot be achieved at a single temperature. In this way for n-pentane 13 anisotropic energy parameters (two for trans trans, tt, five for trans gauche, tg, and three for each of gauche+ gauche+, pp, and gauche+ gauche-, pm), the isotropic trans-gauche energy difference Etg and its temperature coefficient Etg ' are obtained. The value obtained for the extra energy associated with the proximity of the two methyl groups in the gauche+ gauche- conformers (the pentane effect) is sensitive to minute details of other assumptions and is thus fixed in the calculations. Conformer populations are affected by the environment. In particular, anisotropic interactions increase the trans probability in the ordered phase.

  13. Nuclear power company activity based costing management analysis

    International Nuclear Information System (INIS)

    With the development of the nuclear energy industry, nuclear power companies face continual internal-management pressure to sustain their commercial operations. In view of this, it is urgent that nuclear power companies raise their cost management level and build a low-cost competitive advantage founded on nuclear safety. Activity based costing management (ABCM) transfers the emphasis of cost management from the 'product' to the 'activity', using value chain analysis, cost driver analysis and related methods. Through detailed analysis of activities and value chains, unnecessary activities can be cancelled, the resource consumption of necessary activities can be reduced, and costs can be managed at their source, thereby reducing cost, boosting efficiency and realizing management value. The paper draws its conclusions from a detailed analysis of nuclear power company procedures and activities, together with a selective 'piece-by-piece' analysis of important cost-related projects in the nuclear power company. It concludes that the activities of a nuclear power company exhibit distinct, well-defined characteristics, so the ABC method can be applied to their management, and that managing procedures and activities in this way helps the company realize a nuclear safety based low-cost competitive advantage. (author)

  14. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  15. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Franqueira, Virginia N.L.; Tun, Thein Tan; Wieringa, Roel J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about atta

  16. Advancing School-Based Interventions through Economic Analysis

    Science.gov (United States)

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  17. Teaching of Editorial Writing Uses Claims-Based Analysis.

    Science.gov (United States)

    Porter, William C.

    1989-01-01

    Urges the use of claims-based analysis in editorial writing instruction. Explains the use of five hierarchical claim types (factual, definitional, causal, value, and policy) to teach students to analyze and formulate arguments, thus teaching editorial writing by focusing more on the process than on the product. (SR)

  18. Spironolactone use and renal toxicity: population based longitudinal analysis.

    OpenAIRE

    Wei, L; Struthers, A D; Fahey, T; Watson, A D; MacDonald, T. M.

    2010-01-01

    Objective To determine the safety of spironolactone prescribing in the setting of the UK National Health Service. Design Population based longitudinal analysis using a record linkage database. Setting Tayside, Scotland. Population All patients who received one or more dispensed prescriptions for spironolactone between 1994 and 2007. Main outcome measures Rates of prescribing for spironolactone, hospital admissions for hyperkalaemia, and hyperkalaemia and renal function without...

  19. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  20. A Corpus-based Analysis of English Noun Suffixes

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper provides a brief analysis of English noun suffixes. First, the English noun suffixes are classified etymologically; then, the frequency of each noun suffix is obtained in the sub-corpora FR88 and WSJ88; finally, a conclusion is drawn from the statistics, namely that the influence of word origins on the English vocabulary can be seen.

  1. Frailty phenotypes in the elderly based on cluster analysis

    DEFF Research Database (Denmark)

    Dato, Serena; Montesanto, Alberto; Lagani, Vincenzo;

    2012-01-01

    genetic background on the frailty status is still questioned. We investigated the applicability of a cluster analysis approach based on specific geriatric parameters, previously set up and validated in a southern Italian population, to two large longitudinal Danish samples. In both cohorts, we identified...

  2. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures sponsored by the Danish Technical Research Council. The planned...

  3. Project-Based Language Learning: An Activity Theory Analysis

    Science.gov (United States)

    Gibbes, Marina; Carson, Lorna

    2014-01-01

    This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…

  4. LES based POD analysis of Jet in Cross Flow

    DEFF Research Database (Denmark)

    Cavar, Dalibor; Meyer, Knud Erik; Jakirlic, S.;

    2010-01-01

    The paper presents results of a POD investigation of the LES based numerical simulation of the jet-in-crossflow (JICF) flowfield. LES results are firstly compared to the pointwise LDA measurements. 2D POD analysis is then used as a comparison basis for PIV measurements and LES, and finally 3D POD...

  5. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    Science.gov (United States)

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  6. Utilizing Problem-Based Learning in Qualitative Analysis Lab Experiments

    Science.gov (United States)

    Hicks, Randall W.; Bevsek, Holly M.

    2012-01-01

    A series of qualitative analysis (QA) laboratory experiments utilizing a problem-based learning (PBL) module has been designed and implemented. The module guided students through the experiments under the guise of cleaning up a potentially contaminated water site as employees of an environmental chemistry laboratory. The main goal was the…

  7. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of the reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
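
    As a rough illustration of this kind of GRF processing (not the authors' MATLAB script), the sketch below locates the two vertical-force peaks and the valley of a synthetic stance-phase Fz(t) curve and computes a simple symmetry index between legs; the curve shape and the symmetry formula are assumptions.

```python
# Locate Fz(t) extremes on a synthetic stance-phase curve and compute a symmetry index.
import numpy as np
from scipy.signal import find_peaks

t = np.linspace(0, 0.7, 700)                        # stance duration ~0.7 s
fz = 800 * (np.exp(-((t - 0.15) / 0.07) ** 2) +     # loading peak
            np.exp(-((t - 0.55) / 0.07) ** 2)) + 100  # push-off peak + baseline

peaks, _ = find_peaks(fz, height=500)
valley = np.argmin(fz[peaks[0]:peaks[1]]) + peaks[0]
print("peak times [s]:", t[peaks].round(2), "valley time [s]:", round(t[valley], 2))

def symmetry_index(left: float, right: float) -> float:
    """Symmetry index in percent: 0 means perfectly symmetric."""
    return 100.0 * abs(left - right) / (0.5 * (left + right))

print("SI of first peak:", round(symmetry_index(820.0, 860.0), 1), "%")
```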

  8. Teaching-Learning Activity Modeling Based on Data Analysis

    Directory of Open Access Journals (Sweden)

    Kyungrog Kim

    2015-03-01

    Full Text Available Numerous studies are currently being carried out on personalized services based on data analysis to find and provide valuable information in the face of information overload. Furthermore, the number of studies on data analysis of teaching-learning activities for personalized services in the field of teaching-learning is increasing, too. This paper proposes a learning style recency-frequency-durability (LS-RFD) model for quantified analysis of the level of learners' activities, in order to provide the elements of teaching-learning activities according to the learning style of the learner among various parameters for personalized service. The aim is to measure preferences for teaching-learning activities according to the recency, frequency and durability of such activities. Based on the results, user characteristics can be classified into groups for teaching-learning activity by categorizing the level of preference and activity of the learner.
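
    A hedged sketch of a recency-frequency-durability style score for one learner's activities follows; the weights, scaling, and capping below are invented for illustration and are not the LS-RFD model's actual formulas.

```python
# Toy recency-frequency-durability score for one activity type of one learner.
from datetime import date

def rfd_score(events, today, w_recency=0.4, w_frequency=0.3, w_durability=0.3):
    """events: list of (date, duration_minutes) records for one activity type."""
    if not events:
        return 0.0
    last = max(d for d, _ in events)
    recency = 1.0 / (1 + (today - last).days)                  # more recent -> closer to 1
    frequency = min(len(events) / 10.0, 1.0)                   # capped session count
    durability = min(sum(m for _, m in events) / 300.0, 1.0)   # capped total minutes
    return w_recency * recency + w_frequency * frequency + w_durability * durability

video_views = [(date(2015, 3, 1), 30), (date(2015, 3, 8), 45), (date(2015, 3, 9), 20)]
quizzes = [(date(2015, 2, 1), 15)]
today = date(2015, 3, 10)
print(round(rfd_score(video_views, today), 3), round(rfd_score(quizzes, today), 3))
```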

  9. Unified HMM-based layout analysis framework and algorithm

    Institute of Scientific and Technical Information of China (English)

    陈明; 丁晓青; 吴佑寿

    2003-01-01

    To handle the layout analysis problem for complex or irregular document images, a Unified HMM-based Layout Analysis Framework is presented in this paper. Based on the multi-resolution wavelet analysis results of the document image, we use the HMM method in both an inner-scale image model and a trans-scale context model to classify pixel region properties, such as text, picture or background. In each scale, an HMM direct segmentation method is used to get a better inner-scale classification result. Then another HMM method is used to fuse the inner-scale results from each scale and obtain a better final segmentation result. The optimized algorithm uses a stop rule in the coarse-to-fine multi-scale segmentation process, so the speed is improved remarkably. Experiments prove the efficiency of the proposed algorithm.

  10. Haplotype-Based Analysis: A Summary of GAW16 Group 4 Analysis

    OpenAIRE

    Hauser, Elizabeth; Cremer, Nadine; Hein, Rebecca; Deshmukh, Harshal

    2009-01-01

    In this summary paper, we describe the contributions included in the haplotype-based analysis group (Group 4) at the Genetic Analysis Workshop 16, which was held September 17-20, 2008. Our group applied a large number of haplotype-based methods in the context of genome-wide association studies. Two general approaches were applied: a two-stage approach that selected significant single-nucleotide polymorphisms and then created haplotypes and genome-wide analysis of smaller sets of single-nucleo...

  11. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    Science.gov (United States)

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  12. Cortical surface-based statistical analysis of brain PET images

    International Nuclear Information System (INIS)

    Precise and focal analysis of brain PET using voxel-based statistical mapping is limited due to the innate low spatial resolution of PET images, which causes a partial volume effect, as well as the low precision of the image registration. In this study, we propose a cortical surface-based method for the precise analysis of brain PET images in combination with MRI. 18F-FDG brain PET images were acquired using a GE ADVANCE PET scanner in 3D mode. 3D T1-weighted axial MR images were acquired on a Philips Intera 1.5T scanner with slice thickness 1.5 mm and FOV = 22 cm. In the first step of the analysis, we segmented gray and white matter from the structural T1 images using Freesurfer (MGH, Harvard Medical School), which extracts the white matter surface using a deformable surface model. The cortical surface was further parcellated automatically into 85 anatomically relevant brain sub-regions. In the second step, we developed a method for registering PET images to MRI in combination with a mutual information algorithm to maximize total metabolic activity within the gray matter band. Partial volume correction of the PET image was conducted utilizing the extracted gray matter. In the third step, we calculated mean cortical activity along the path from the white matter surface to the gray matter surface. The cortical activity was represented on the spatially normalized surface, on which statistical evaluation of cortical activity was conducted. We evaluated the surface-based representation of PET images and the registration of PET and MRI utilizing cortical parcellation. The preliminary results showed that our method is very promising in the analysis of subtle cortical activity differences. We propose a novel surface-based approach to brain PET analysis using high resolution MRI. The cortical surface-based method was very efficient in the precise representation of brain activity, correction of the partial volume effect, as well as better spatial normalization

  13. Cortical surface-based statistical analysis of brain PET images

    Energy Technology Data Exchange (ETDEWEB)

    Park, Hae Jeong; Kim, Jae Jin; Yoon, Mi Jin; Yoo, Young Hoon; Lee, Jong Doo [School of Medicine, Yonsei University, Seoul (Korea, Republic of)

    2004-07-01

    Precise and focal analysis of brain PET using voxel-based statistical mapping is limited due to the innate low spatial resolution of PET images, which causes a partial volume effect, as well as the low precision of the image registration. In this study, we propose a cortical surface-based method for the precise analysis of brain PET images in combination with MRI. 18F-FDG brain PET images were acquired using a GE ADVANCE PET scanner in 3D mode. 3D T1-weighted axial MR images were acquired on a Philips Intera 1.5T scanner with slice thickness 1.5 mm and FOV = 22 cm. In the first step of the analysis, we segmented gray and white matter from the structural T1 images using Freesurfer (MGH, Harvard Medical School), which extracts the white matter surface using a deformable surface model. The cortical surface was further parcellated automatically into 85 anatomically relevant brain sub-regions. In the second step, we developed a method for registering PET images to MRI in combination with a mutual information algorithm to maximize total metabolic activity within the gray matter band. Partial volume correction of the PET image was conducted utilizing the extracted gray matter. In the third step, we calculated mean cortical activity along the path from the white matter surface to the gray matter surface. The cortical activity was represented on the spatially normalized surface, on which statistical evaluation of cortical activity was conducted. We evaluated the surface-based representation of PET images and the registration of PET and MRI utilizing cortical parcellation. The preliminary results showed that our method is very promising in the analysis of subtle cortical activity differences. We propose a novel surface-based approach to brain PET analysis using high resolution MRI. The cortical surface-based method was very efficient in the precise representation of brain activity, correction of the partial volume effect, as well as better spatial normalization.

  14. Protein expression based multimarker analysis of breast cancer samples

    Directory of Open Access Journals (Sweden)

    Rajasekaran Ayyappan K

    2011-06-01

    Full Text Available Abstract Background Tissue microarray (TMA) data are commonly used to validate the prognostic accuracy of tumor markers. For example, breast cancer TMA data have led to the identification of several promising prognostic markers of survival time. Several studies have shown that TMA data can also be used to cluster patients into clinically distinct groups. Here we use breast cancer TMA data to cluster patients into distinct prognostic groups. Methods We apply weighted correlation network analysis (WGCNA) to TMA data consisting of 26 putative tumor biomarkers measured on 82 breast cancer patients. Based on this analysis we identify three groups of patients with low (5.4%), moderate (22%) and high (50%) mortality rates, respectively. We then develop a simple threshold rule using a subset of three markers (p53, Na-KATPase-β1, and TGF β receptor II) that can approximately define these mortality groups. We compare the results of this correlation network analysis with results from a standard Cox regression analysis. Results We find that the rule-based grouping variable (referred to as WGCNA*) is an independent predictor of survival time. While WGCNA* is based on protein measurements (TMA data), it was validated in two independent Affymetrix microarray gene expression data sets (which measure mRNA abundance). We find that the WGCNA patient groups differed by 35% from mortality groups defined by a more conventional stepwise Cox regression analysis approach. Conclusions We show that correlation network methods, which are primarily used to analyze the relationships between gene products, are also useful for analyzing the relationships between patients and for defining distinct patient groups based on TMA data. We identify a rule based on three tumor markers for predicting breast cancer survival outcomes.
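
    A minimal sketch of a three-marker threshold rule of the kind described is shown below; the cut-off values and the direction of each comparison are invented here, since the paper derives its own rule from the TMA data.

```python
# Toy three-marker threshold rule assigning a patient to a mortality group.
def mortality_group(p53, na_k_atpase_b1, tgfbr2,
                    t_p53=1.5, t_atpase=2.0, t_tgfbr2=1.0):
    """Assign a patient to a low/moderate/high mortality group from three TMA markers."""
    flags = sum([p53 > t_p53, na_k_atpase_b1 < t_atpase, tgfbr2 < t_tgfbr2])
    return ["low", "moderate", "high", "high"][flags]   # 0, 1, 2-3 adverse flags

print(mortality_group(p53=0.8, na_k_atpase_b1=2.5, tgfbr2=1.4))   # low
print(mortality_group(p53=2.1, na_k_atpase_b1=1.2, tgfbr2=0.6))   # high
```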

  15. Protein expression based multimarker analysis of breast cancer samples

    International Nuclear Information System (INIS)

    Tissue microarray (TMA) data are commonly used to validate the prognostic accuracy of tumor markers. For example, breast cancer TMA data have led to the identification of several promising prognostic markers of survival time. Several studies have shown that TMA data can also be used to cluster patients into clinically distinct groups. Here we use breast cancer TMA data to cluster patients into distinct prognostic groups. We apply weighted correlation network analysis (WGCNA) to TMA data consisting of 26 putative tumor biomarkers measured on 82 breast cancer patients. Based on this analysis we identify three groups of patients with low (5.4%), moderate (22%) and high (50%) mortality rates, respectively. We then develop a simple threshold rule using a subset of three markers (p53, Na-KATPase-β1, and TGF β receptor II) that can approximately define these mortality groups. We compare the results of this correlation network analysis with results from a standard Cox regression analysis. We find that the rule-based grouping variable (referred to as WGCNA*) is an independent predictor of survival time. While WGCNA* is based on protein measurements (TMA data), it was validated in two independent Affymetrix microarray gene expression data sets (which measure mRNA abundance). We find that the WGCNA patient groups differed by 35% from mortality groups defined by a more conventional stepwise Cox regression analysis approach. We show that correlation network methods, which are primarily used to analyze the relationships between gene products, are also useful for analyzing the relationships between patients and for defining distinct patient groups based on TMA data. We identify a rule based on three tumor markers for predicting breast cancer survival outcomes.

  16. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  17. Physics-Based Simulator for NEO Exploration Analysis & Simulation

    Science.gov (United States)

    Balaram, J.; Cameron, J.; Jain, A.; Kline, H.; Lim, C.; Mazhar, H.; Myint, S.; Nayar, H.; Patton, R.; Pomerantz, M.; Quadrelli, M.; Shakkotai, P.; Tso, K.

    2011-01-01

    As part of the Space Exploration Analysis and Simulation (SEAS) task, the National Aeronautics and Space Administration (NASA) is using physics-based simulations at NASA's Jet Propulsion Laboratory (JPL) to explore potential surface and near-surface mission operations at Near Earth Objects (NEOs). The simulator is under development at JPL and can be used to provide detailed analysis of various surface and near-surface NEO robotic and human exploration concepts. In this paper we describe the SEAS simulator and provide examples of recent mission systems and operations concepts investigated using the simulation. We also present related analysis work and tools developed for both the SEAS task as well as general modeling, analysis and simulation capabilities for asteroid/small-body objects.

  18. Research on supplier evaluation and selection based on fuzzy hierarchy analysis and grey relational analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Considering the shortcomings of supplier evaluation indices based on traditional purchasing relationships, and in view of the transformation of the relationship between manufacturers and suppliers in a dynamic, cooperative, competitive and quick-response environment, research on supplier selection and evaluation is presented based on enterprise capability, degree of cooperation and service level, from the perspective of cooperative partnership and coordination, and an evaluation index system is established. A more objective and accurate supplier selection and evaluation method based on the fuzzy analytic hierarchy process and grey relational analysis was developed, and an empirical study of an electrical equipment manufacturer was then carried out to analyze supplier selection and evaluation.

  19. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    OpenAIRE

    Jingzheng Ren; Alessandro Manzardo; Anna Mazzi; Andrea Fedele; Antonio Scipioni

    2013-01-01

    Biodiesel, as a promising alternative energy resource, has become a hot spot in chemical engineering, but its sustainability is still debated. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options, are studied by emergy analysis; the soybean-based scenario is recognized as t...

  20. Improved reliability analysis method based on the failure assessment diagram

    Science.gov (United States)

    Zhou, Yu; Zhang, Zheng; Zhong, Qunpeng

    2012-07-01

    With the uncertainties related to operating conditions, in-service non-destructive testing (NDT) measurements and material properties considered in the structural integrity assessment, probabilistic analysis based on the failure assessment diagram (FAD) approach has recently become an important concern. However, the point density revealing the probabilistic distribution characteristics of the assessment points is usually ignored. To obtain more detailed and direct knowledge from the reliability analysis, an improved probabilistic fracture mechanics (PFM) assessment method is proposed. By integrating 2D kernel density estimation (KDE) technology into the traditional probabilistic assessment, the probabilistic density of the randomly distributed assessment points is visualized in the assessment diagram. Moreover, a modified interval sensitivity analysis is implemented and compared with probabilistic sensitivity analysis. The improved reliability analysis method is applied to the assessment of a high pressure pipe containing an axial internal semi-elliptical surface crack. The results indicate that these two methods can give consistent sensitivities of input parameters, but the interval sensitivity analysis is computationally more efficient. Meanwhile, the point density distribution and its contour are plotted in the FAD, thereby better revealing the characteristics of PFM assessment. This study provides a powerful tool for the reliability analysis of critical structures.
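
    The idea can be sketched as follows, under assumptions: uncertain inputs are sampled, each sample is mapped to an assessment point (Lr, Kr), a 2-D Gaussian kernel density estimate describes the point density over the FAD, and points above the assessment line count as failures. The R6 Option-1 style curve and the input distributions below are illustrative, not the paper's pipe case.

```python
# Monte Carlo assessment points on a FAD, 2-D KDE of their density, failure fraction.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
n = 5000
lr = rng.normal(0.6, 0.08, n)                      # load ratio samples
kr = rng.normal(0.7, 0.10, n)                      # toughness ratio samples

def fad_limit(lr):
    """R6 Option-1 style assessment line Kr = f(Lr)."""
    return (1 - 0.14 * lr**2) * (0.3 + 0.7 * np.exp(-0.65 * lr**6))

failed = kr > fad_limit(lr)
print("failure probability ~", failed.mean())

density = gaussian_kde(np.vstack([lr, kr]))        # 2-D point density over the FAD
print("density at (0.6, 0.7):", density(np.array([[0.6], [0.7]]))[0])
```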

  1. Graph-based layout analysis for PDF documents

    Science.gov (United States)

    Xu, Canhui; Tang, Zhi; Tao, Xin; Li, Yun; Shi, Cao

    2013-03-01

    To increase the flexibility and enrich the reading experience of e-books on small portable screens, a graph-based method is proposed to perform layout analysis on Portable Document Format (PDF) documents. Digital-born documents have inherent advantages, such as representing text and fractional images in explicit form, which can be straightforwardly exploited. To integrate traditional image-based document analysis with the inherent meta-data provided by the PDF parser, the page primitives, including text, image and path elements, are processed to produce text and non-text layers for separate analysis. The graph-based method is developed at the superpixel representation level, and page text elements corresponding to vertices are used to construct an undirected graph. The Euclidean distance between adjacent vertices is applied in a top-down manner to cut the graph tree formed by Kruskal's algorithm, and edge orientation is then used in a bottom-up manner to extract text lines from each subtree. Non-textual objects are segmented by connected component analysis. For each segmented text and non-text composite, a 13-dimensional feature vector is extracted for labelling purposes. Experimental results on selected pages from PDF books are presented.
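
    A hedged sketch of the top-down step only is given below: build a minimum spanning tree over text-element centroids (the Kruskal-formed tree mentioned above) and cut edges much longer than the average to separate blocks; the synthetic coordinates and the cutting threshold are assumptions.

```python
# MST over text-element centroids; cut unusually long edges to separate layout blocks.
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree, connected_components
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(3)
block_a = rng.normal([100, 100], 10, size=(20, 2))    # centroids of one text block
block_b = rng.normal([400, 120], 10, size=(20, 2))    # centroids of another block
pts = np.vstack([block_a, block_b])

dist = squareform(pdist(pts))                          # full Euclidean distance matrix
mst = minimum_spanning_tree(dist).toarray()            # Kruskal-style spanning tree

edges = mst[mst > 0]
mst[mst > edges.mean() * 3] = 0                        # cut edges much longer than average
n_blocks, labels = connected_components(mst, directed=False)
print("detected blocks:", n_blocks)                    # expect 2 for this synthetic page
```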

  2. ROOT based Offline and Online Analysis (ROAn): An analysis framework for X-ray detector data

    International Nuclear Information System (INIS)

    The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform data analysis on data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensor developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT's features like run-time type information and reflection. ROAn provides an analysis program which allows the user to perform configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps which produce analysis results. The dependency between results and thus the order of calculation is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and only the necessary recalculations are triggered. This saves time and simultaneously keeps the results consistent. The viewer program offers a configurable Graphical User Interface (GUI) and process chain, which allows the user to adapt the program to different tasks such as offline viewing of file data, online monitoring of running detector systems, or performing online data analysis (histogramming, calibration, etc.). Because of its modular design, ROAn can be extended easily, e.g. be adapted to new detector types and analysis processes.

  3. ROOT based Offline and Online Analysis (ROAn): An analysis framework for X-ray detector data

    Science.gov (United States)

    Lauf, Thomas; Andritschke, Robert

    2014-10-01

    The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform data analysis on data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensor developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT's features like run-time type information and reflection. ROAn provides an analysis program which allows the user to perform configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps which produce analysis results. The dependency between results and thus the order of calculation is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and only the necessary recalculations are triggered. This saves time and simultaneously keeps the results consistent. The viewer program offers a configurable Graphical User Interface (GUI) and process chain, which allows the user to adapt the program to different tasks such as offline viewing of file data, online monitoring of running detector systems, or performing online data analysis (histogramming, calibration, etc.). Because of its modular design, ROAn can be extended easily, e.g. be adapted to new detector types and analysis processes.

  4. Barcode server: a visualization-based genome analysis system.

    Directory of Open Access Journals (Sweden)

    Fenglou Mao

    Full Text Available We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a identification of horizontally transferred genes, (b identification of genomic islands with special properties and (c binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a calculation of the k-mer based barcode image for a provided DNA sequence; (b detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome, (c clustering of provided DNA sequences into groups having similar barcodes; and (d homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode.

  5. Barcode Server: A Visualization-Based Genome Analysis System

    Science.gov (United States)

    Mao, Fenglou; Olman, Victor; Wang, Yan; Xu, Ying

    2013-01-01

    We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome, (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode. PMID:23457606
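
    For illustration, a k-mer "barcode" of the kind described can be sketched as a matrix with one row per fixed-length window and one column per k-mer frequency; the window size, k, and the random sequence below are assumptions, and the real server's rendering and analyses are of course much richer.

```python
# Build a per-window k-mer frequency matrix that could be rendered as a barcode image.
from itertools import product
import numpy as np

def barcode_matrix(seq: str, k: int = 2, window: int = 1000):
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    index = {km: i for i, km in enumerate(kmers)}
    rows = []
    for start in range(0, len(seq) - window + 1, window):
        win = seq[start:start + window]
        counts = np.zeros(len(kmers))
        for i in range(len(win) - k + 1):
            j = index.get(win[i:i + k])
            if j is not None:             # skip k-mers containing ambiguous bases
                counts[j] += 1
        rows.append(counts / counts.sum())
    return np.array(rows)                 # one row per window, one column per k-mer

rng = np.random.default_rng(4)
genome = "".join(rng.choice(list("ACGT"), size=10000))
print(barcode_matrix(genome).shape)       # (10, 16) for k=2 and 1 kb windows
```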

  6. Statistical analysis in dBASE-compatible databases.

    Science.gov (United States)

    Hauer-Jensen, M

    1991-01-01

    Database management in clinical and experimental research often requires statistical analysis of the data in addition to the usual functions for storing, organizing, manipulating and reporting. With most database systems, transfer of data to a dedicated statistics package is a relatively simple task. However, many statistics programs lack the powerful features found in database management software. dBASE IV and compatible programs are currently among the most widely used database management programs. d4STAT is a utility program for dBASE, containing a collection of statistical functions and tests for data stored in the dBASE file format. By using d4STAT, statistical calculations may be performed directly on the data stored in the database without having to exit dBASE IV or export data. Record selection and variable transformations are performed in memory, thus obviating the need for creating new variables or data files. The current version of the program contains routines for descriptive statistics, paired and unpaired t-tests, correlation, linear regression, frequency tables, Mann-Whitney U-test, Wilcoxon signed rank test, a time-saving procedure for counting observations according to user specified selection criteria, survival analysis (product limit estimate analysis, log-rank test, and graphics), and normal, t, and chi-squared distribution functions. PMID:2004275

  7. Risk-based planning analysis for a single levee

    Science.gov (United States)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
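
    A toy sketch of the underlying trade-off, restricted to overtopping and a single decision variable, is shown below: levee height is chosen to minimize annualized construction cost plus expected annual flood damage under an assumed exponential distribution for the annual peak water level; all numbers are illustrative and the intermediate geotechnical failure mode is omitted.

```python
# Minimize annualized construction cost + expected annual overtopping damage over height.
import numpy as np

heights = np.linspace(1.0, 8.0, 141)               # candidate levee heights [m]
mean_level = 1.5                                   # mean annual peak water level [m]
damage_if_flooded = 5e7                            # economic damage potential [$]
cost_per_m = 4e5                                   # annualized construction cost [$/m]

p_overtop = np.exp(-heights / mean_level)          # P(annual peak > levee height)
expected_total = cost_per_m * heights + damage_if_flooded * p_overtop

best = heights[np.argmin(expected_total)]
print("optimal height ~", round(best, 2), "m")     # ~6.6 m for these toy numbers
```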

  8. Multilevel Solvers with Aggregations for Voxel Based Analysis of Geomaterials

    Czech Academy of Sciences Publication Activity Database

    Blaheta, Radim; Sokol, V.

    Berlin, Heidelberg: Springer-Verlag, 2012, no. 7116/2012, pp. 489-497. ISBN 978-3-642-29842-4. ISSN 0302-9743. [LSSC 2011, Sozopol (BG), 06.06.2011-10.06.2011]. R&D Projects: GA ČR GA105/09/1830. Other grants: GA ČR (CZ) GD103/09/H078. Institutional research plan: CEZ:AV0Z30860518. Keywords: voxel based analysis * finite element analysis * tomography. Subject RIV: IN - Informatics, Computer Science. http://www.springer.com/series/558?changeHeader

  9. FARO base case post-test analysis by COMETA code

    International Nuclear Information System (INIS)

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post test calculations of FARO Test L-11, the so-called Base Case Test. The FARO Facility, located at JRC Ispra, is used to simulate the consequences of Severe Accidents in Nuclear Power Plants under a variety of conditions. The COMETA Code has a 6 equations two phase flow field and a 3 phases corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the fuel-coolant interaction pre-mixing phase

  10. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    Science.gov (United States)

    Fu, Pei-hua; Yin, Hong-bo

    In this thesis, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which contains basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determine the index weights according to the grades and evaluate the integrated capability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, together with how these two modules were implemented. Finally, we give the results of the system.

  11. Climatology of Mexico: a Description Based on Clustering Analysis

    Science.gov (United States)

    Pineda-Martinez, L. F.; Carbajal, N.

    2007-05-01

    Climate regions of Mexico are delimited using hierarchical clustering analysis (HCA). We assign the variables, precipitation and temperature, to groups or clusters based on similar statistical characteristics. Since meteorological stations in Mexico exhibit a heterogeneous geographic distribution, we used principal components analysis (PCA) to obtain a standardized reduced matrix to which HCA can be conveniently applied. We consider monthly means of maximum and minimum temperature and monthly accumulated precipitation from a meteorological dataset of the National Water Commission of Mexico. This allows defining groups of stations that delimit regions of similar climate. It also allows describing the regional effect of events such as the Mexican monsoon and ENSO.
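
    The workflow described above can be sketched roughly as follows, assuming standardized station variables reduced by PCA and then grouped by hierarchical (Ward) clustering; the synthetic station matrix and the choice of five components and three clusters are placeholders for the real dataset.

```python
# Standardize station climate variables, reduce with PCA, cluster hierarchically.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(5)
# 120 stations x 36 variables (12 monthly Tmax, 12 monthly Tmin, 12 monthly precipitation)
stations = np.vstack([rng.normal(loc, 1.0, size=(40, 36)) for loc in (0.0, 3.0, 6.0)])

scores = PCA(n_components=5).fit_transform(StandardScaler().fit_transform(stations))
tree = linkage(scores, method="ward")
regions = fcluster(tree, t=3, criterion="maxclust")
print(np.unique(regions, return_counts=True))      # expect three groups of ~40 stations
```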

  12. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    The rotary regenerator (also called the heat wheel) is an important component of energy intensive sectors, which is used in many heat recovery systems. In this paper, a model-based analysis of a rotary regenerator is carried out with a major emphasis given to the development and implementation of...... mathematical models for the thermal analysis of the fluid and wheel matrix. The effect of heat conduction in the direction of the fluid flow is taken into account and the influence of variations in rotating speed of the wheel as well as other characteristics (ambient temperature, airflow and geometric size) on...

  13. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post test calculations of FARO Test L-11, the so-called Base Case Test. The FARO Facility, located at JRC Ispra, is used to simulate the consequences of Severe Accidents in Nuclear Power Plants under a variety of conditions. The COMETA Code has a 6 equations two phase flow field and a 3 phases corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the fuel-coolant interaction pre-mixing phase.

  14. TTCScope - A scope-based TTC analysis tool

    CERN Document Server

    Moosavi, P

    2013-01-01

    This document describes a scope-based analysis tool for the TTC system. The software, ttcscope, was designed to sample the encoded TTC signal from a TTCex module with the aid of a LeCroy WaveRunner oscilloscope. From the sampled signal, the bunch crossing clock is recovered, along with the signal contents: level-1 accepts, TTC commands, and trigger types. Two use-cases are addressed: analysis of TTC signals and calibration of TTC crates. The latter includes calibration schemes for two signal phase shifts, one related to level-1 accepts, and the other to TTC commands.

  15. Towards Performance Measurement And Metrics Based Analysis of PLA Applications

    CERN Document Server

    Ahmed, Zeeshan

    2010-01-01

    This article is about a measurement-analysis-based approach to help software practitioners manage the additional levels of complexity and variability in software product line applications. The architecture of the proposed approach, i.e. ZAC, is designed and implemented to perform preprocessed source code analysis, calculate traditional and product line metrics, and visualize results in two- and three-dimensional diagrams. Experiments using real data sets were performed, which concluded that ZAC can be very helpful for software practitioners in understanding the overall structure and complexity of product line applications. Moreover, the obtained results show a strong positive correlation between the calculated traditional and product line measures.

  16. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Science.gov (United States)

    Liang, Jianming; Järvi, Timo; Kiuru, Aaro; Kormano, Martti; Svedström, Erkki

    2003-12-01

    The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  17. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  18. Job optimization in ATLAS TAG-based distributed analysis

    International Nuclear Information System (INIS)

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ('skimming', 'slimming' and 'thinning') as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
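
    As a rough illustration of the kind of metadata-driven event selection described above, the sketch below filters a tabular stand-in for TAG data with pandas; the column names and cut values are hypothetical, and the real ATLAS TAG infrastructure is accessed through POOL collections and web services rather than a DataFrame.

        # Minimal sketch of TAG-style event pre-selection (hypothetical column names).
        import pandas as pd

        # Each row is one event's metadata: run/event numbers plus summary quantities.
        tags = pd.DataFrame({
            "run_number":   [100, 100, 101, 101, 102],
            "event_number": [1, 2, 1, 2, 1],
            "n_muons":      [0, 2, 1, 2, 0],
            "missing_et":   [12.0, 55.0, 40.0, 80.0, 5.0],   # GeV
        })

        # Apply selection cuts on the metadata instead of reading full event data.
        selected = tags[(tags["n_muons"] >= 2) & (tags["missing_et"] > 30.0)]

        # The surviving (run, event) pairs define the subset of events whose full
        # AOD/ESD records would then be retrieved and analysed.
        event_list = list(zip(selected["run_number"], selected["event_number"]))
        print(event_list)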

  19. Image registration based on matrix perturbation analysis using spectral graph

    Institute of Scientific and Technical Information of China (English)

    Chengcai Leng; Zheng Tian; Jing Li; Mingtao Ding

    2009-01-01

    We present a novel perspective on characterizing the spectral correspondence between nodes of a weighted graph, with application to image registration. It is based on matrix perturbation analysis on the spectral graph. The contribution may be divided into three parts. Firstly, the perturbation matrix is obtained by perturbing the matrix of the graph model. Secondly, an orthogonal matrix is obtained based on an optimal parameter, which can better capture correspondence features. Thirdly, the optimal matching matrix is obtained by adjusting the signs of the orthogonal matrix for image registration. Experiments on both synthetic images and real-world images demonstrate the effectiveness and accuracy of the proposed method.
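
    The abstract's perturbation-based construction is not reproduced here; the sketch below only illustrates the underlying idea of spectral correspondence, matching nodes of two weighted graphs by comparing their leading eigenvectors (with a simple sign convention), using NumPy. All values are synthetic.

        # Generic spectral-correspondence sketch for two weighted graphs (illustrative only).
        import numpy as np

        def spectral_embedding(W, k):
            """Embed nodes by the k leading eigenvectors of a symmetric weight matrix."""
            vals, vecs = np.linalg.eigh(W)
            order = np.argsort(vals)[::-1]            # largest eigenvalues first
            E = vecs[:, order[:k]]
            # Fix the sign ambiguity of each eigenvector by a simple convention.
            signs = np.sign(E[np.argmax(np.abs(E), axis=0), range(k)])
            return E * signs

        def match_nodes(W1, W2, k=3):
            """For each node of graph 1, return its nearest node of graph 2 in spectral space."""
            E1, E2 = spectral_embedding(W1, k), spectral_embedding(W2, k)
            d = np.linalg.norm(E1[:, None, :] - E2[None, :, :], axis=2)
            return d.argmin(axis=1)

        # Toy example: graph 2 is graph 1 with its nodes permuted.
        rng = np.random.default_rng(0)
        A = rng.random((6, 6)); W1 = (A + A.T) / 2
        perm = rng.permutation(6)
        W2 = W1[np.ix_(perm, perm)]
        print(match_nodes(W1, W2), "expected:", np.argsort(perm))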

  20. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types...

  1. Protein analysis based on molecular beacon probes and biofunctionalized nanoparticles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    With the completion of the human genome sequencing project, the focus of studies has shifted from genomics to proteomics. By utilizing the inherent advantages of molecular beacon probes and biofunctionalized nanoparticles, a series of novel principles, methods and techniques have been developed for bioanalytical and biomedical studies. This review mainly discusses the applications of technologies based on molecular beacon probes and biofunctionalized nanoparticles for real-time, in-situ, highly sensitive and highly selective protein analysis, including nonspecific or specific protein detection and separation, protein/DNA interaction studies, cell surface protein recognition, and bacteria assays based on the antigen-antibody binding process. The introduction of molecular beacon probes and biofunctionalized nanoparticles into the protein analysis area is expected to further advance proteomics research.

  2. Classification analysis of microarray data based on ontological engineering

    Institute of Scientific and Technical Information of China (English)

    LI Guo-qi; SHENG Huan-ye

    2007-01-01

    Background knowledge is important for data mining, especially in complicated situations. Ontological engineering is the successor of knowledge engineering. The sharable knowledge bases built on ontologies can be used to provide background knowledge to direct the process of data mining. This paper gives a general introduction to the method and presents a practical analysis example using an SVM (support vector machine) as the classifier. Gene Ontology and the accompanying annotations compose a large knowledge base, on which much research has been carried out. A microarray dataset is the output of a DNA chip. With the help of Gene Ontology, we present a more elaborate analysis of microarray data than previous studies. The method can also be used in other fields with similar scenarios.

  3. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed on to the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements, which can be readily applied to present consumer-grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further extended to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.
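
    The multihypothesis formulation and the antenna-motion processing are beyond a short example, but the sketch below shows the basic building block assumed here: a single power-threshold test whose threshold is set from a target false-alarm probability under a Gaussian noise-only model; all numbers are illustrative, not from the paper.

        # Sketch of a single power-based test: flag a spoofing-like power anomaly when
        # the measured power exceeds a threshold set for a target false-alarm probability
        # under an assumed Gaussian nominal-power model. Numbers are illustrative.
        import numpy as np
        from scipy.stats import norm

        mu0, sigma0 = -130.0, 2.0        # assumed nominal received power statistics (dBm)
        p_fa = 1e-3                      # target false-alarm probability
        threshold = mu0 + sigma0 * norm.isf(p_fa)

        measurements = np.array([-131.2, -129.5, -122.8, -130.4])   # dBm, illustrative
        for p in measurements:
            print(f"{p:7.1f} dBm ->", "alarm" if p > threshold else "ok")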

  4. Estimating Driving Performance Based on EEG Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Jung Tzyy-Ping

    2005-01-01

    Full Text Available The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by a driver's drowsiness behind the steering wheel have a high fatality rate because of the marked decline in the driver's abilities of perception, recognition, and vehicle control while sleepy. Preventing such drowsiness-caused accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
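
    A minimal sketch of the processing chain described above (log subband power, PCA, linear regression) on synthetic stand-in data; the feature dimensions, scaling and model settings are assumptions, not those of the paper.

        # Sketch of the described chain: log subband power -> PCA -> linear regression.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LinearRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        n_epochs, n_channels, n_bands = 200, 2, 10

        # Synthetic stand-in for log EEG subband power per epoch (real data would come
        # from band-power estimates of the recorded EEG).
        log_power = rng.normal(size=(n_epochs, n_channels * n_bands))
        log_power[:, :3] *= 3.0        # give a few bands more variance so PCA retains them
        # Synthetic stand-in for driving performance (lane deviation) per epoch.
        deviation = log_power[:, :3].sum(axis=1) + 0.5 * rng.normal(size=n_epochs)

        model = make_pipeline(PCA(n_components=5), LinearRegression())
        model.fit(log_power[:150], deviation[:150])
        pred = model.predict(log_power[150:])
        print("correlation with held-out deviation:",
              np.corrcoef(pred, deviation[150:])[0, 1])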

  5. Gold Nanoparticles-Based Barcode Analysis for Detection of Norepinephrine.

    Science.gov (United States)

    An, Jeung Hee; Lee, Kwon-Jai; Choi, Jeong-Woo

    2016-02-01

    Nanotechnology-based bio-barcode amplification analysis offers an innovative approach for detecting neurotransmitters. We evaluated the efficacy of this method for detecting norepinephrine in normal and oxidative-stress-damaged dopaminergic cells. Our approach uses a combination of DNA barcodes and bead-based immunoassays for detecting neurotransmitters with surface-enhanced Raman spectroscopy (SERS), and provides polymerase chain reaction (PCR)-like sensitivity. This method relies on magnetic Dynabeads containing antibodies and nanoparticles that are loaded both with DNA barcodes and with antibodies that can sandwich the target protein captured by the Dynabead-bound antibodies. The aggregate sandwich structures are magnetically separated from the solution and treated to remove the conjugated barcode DNA. The DNA barcodes are then identified by SERS and PCR analysis. The concentration of norepinephrine in dopaminergic cells can be readily detected using the bio-barcode assay, which is a rapid, high-throughput screening tool for detecting neurotransmitters. PMID:27305769

  6. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  7. CORBA-Based Analysis of Multi Agent Behavior

    Institute of Scientific and Technical Information of China (English)

    Swapan Bhattacharya; Anirban Banerjee; Shibdas Bandyopadhyay

    2005-01-01

    An agent is a software entity that is capable of taking independent action on behalf of its user or owner. It is an entity with goals, actions and domain knowledge, situated in an environment. Multiagent systems comprise multiple autonomous, interacting software agents. These systems can successfully emulate the entities active in a distributed environment. The analysis of multiagent behavior is studied in this paper based on a specific board game problem similar to the famous game of Go. A framework is developed to define the states of the multiagent entities and measure the convergence metrics for this problem. An analysis of the changes of states leading to the goal state is also made. We support our study of multiagent behavior with simulations based on a CORBA framework in order to substantiate our findings.

  8. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. But they have certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, which is a mission-critical system for the Large Hadron Collider (LHC). We used a plug-in based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin doesn't perturb the work of the overall application; plugin execution control which allows plugin misbehaviour to be detected and reacted to; a robust communication mechanism between plugins; diagnostics facilitation in case of plugin failure; testing of the plugins be...

  9. Student Engagement: A Principle-Based Concept Analysis.

    Science.gov (United States)

    Bernard, Jean S

    2015-01-01

    A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. It lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline. PMID:26234950

  10. Chromatin structure analysis based on a hierarchic texture model.

    Science.gov (United States)

    Wolf, G; Beil, M; Guski, H

    1995-02-01

    The quantification of chromatin structures is an important part of nuclear grading of malignant and premalignant lesions. In order to achieve high accuracy, computerized image analysis systems have been applied in this process. Chromatin texture analysis of cell nuclei requires a suitable texture model. A hierarchic model seemed to be most compatible for this purpose. It assumes that texture consists of homogeneous regions (textons). Based on this model, two approaches to texture segmentation and feature extraction were investigated using sections of cervical tissue. We examined the reproducibility of the measurement under changing optical conditions. The coefficients of variations of the texture features ranged from 2.1% to 16.9%. The features were tested for their discriminating capability in a pilot study including 30 cases of cervical dysplasia and carcinoma. The overall classification accuracy reached 65%. This study presents an automated technique for texture analysis that is similar to human perception. PMID:7766266

  11. Error Analysis of Robotic Assembly System Based on Screw Theory

    Institute of Scientific and Technical Information of China (English)

    韩卫军; 费燕琼; 赵锡芳

    2003-01-01

    Assembly errors have great influence on assembly quality in robotic assembly systems. Error analysis is directed at the propagation and accumulation of various errors and their effect on assembly success. Using screw coordinates, assembly errors are represented as an "error twist", an extremely compact expression. According to the law of screw composition, the relative position and orientation errors of mating parts are computed and the necessary condition for assembly success is derived. A new, simple method for measuring assembly errors is also proposed based on the transformation law of a screw. Because of the compact representation of error, the model presented for error analysis can be applied to various part-mating types and is especially useful for error analysis of complex assemblies.

  12. Tariff-based analysis of commercial building electricityprices

    Energy Technology Data Exchange (ETDEWEB)

    Coughlin, Katie M.; Bolduc, Chris A.; Rosenquist, Greg J.; VanBuskirk, Robert D.; McMahon, James E.

    2008-03-28

    This paper presents the results of a survey and analysis of electricity tariffs and marginal electricity prices for commercial buildings. The tariff data come from a survey of 90 utilities and 250 tariffs for non-residential customers collected in 2004 as part of the Tariff Analysis Project at LBNL. The goals of this analysis are to provide useful summary data on the marginal electricity prices commercial customers actually see, and insight into the factors that are most important in determining prices under different circumstances. We provide a new, empirically based definition of several marginal prices: the effective marginal price and energy-only and demand-only prices, and derive a simple formula that expresses the dependence of the effective marginal price on the marginal load factor. The latter is a variable that can be used to characterize the load impacts of a particular end-use or efficiency measure. We calculate all these prices for eleven regions within the continental U.S.

  13. SVM-based glioma grading. Optimization by feature reduction analysis

    International Nuclear Information System (INIS)

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. The best classification accuracy was found using PCA at 85% (sensitivity = 89%, specificity = 84%) when reducing the feature vector from 101 dimensions (100-bin rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided classification accuracy similar to literature values (≈87%) while reducing the number of features by up to 98%. (orig.)
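
    A minimal sketch of the reported pipeline on synthetic stand-in data: a 101-dimensional feature vector (100-bin rCBV histogram plus age) is reduced to 3 principal components and classified with an SVM using scikit-learn; the kernel, scaling and label model are assumptions, not the paper's settings.

        # Sketch: 101-dimensional feature vector reduced to 3 principal components,
        # then classified with an SVM (synthetic data, illustrative settings).
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        n_patients = 101
        grade = rng.integers(0, 2, size=n_patients)      # synthetic low/high grade label
        X = rng.normal(size=(n_patients, 101))           # stand-in: 100-bin histogram + age
        X[:, :100] += grade[:, None]                     # let grade shift the histogram bins

        clf = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
        print("cross-validated accuracy:",
              cross_val_score(clf, X, grade, cv=5).mean().round(2))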

  14. Analysis Of Japans Economy Based On 2014 From Macroeconomics Prospects

    Directory of Open Access Journals (Sweden)

    Dr Mohammad Rafiqul Islam

    2015-02-01

    Full Text Available Abstract Japan is the world's third-largest economy, but Japan's current economic situation is not stable and is not improving as expected. Since 2013 it was the world's second-largest economy, but Japan lost its place to China in 2014 due to slow growth in important economic indicators. Using the basic Keynesian model, we provide a detailed analysis of the short- and long-run impacts of the changes on Japan's real GDP, rate of unemployment and inflation rate. We demonstrate in detail the use of the 45-degree diagram (the AD-IA model) and other economic analysis of the macroeconomic principles that underlie the model and concepts. Finally, we recommend to the government a change in fiscal policy based on the analysis, considering what might be achieved with a fiscal policy response and the extent to which any impact on the stock of public debt might be a consideration.

  15. IBM-PC based reactor neutronics analysis package

    International Nuclear Information System (INIS)

    The development of a comprehensive system of microcomputer-based codes suitable for neutronics and shielding analysis of nuclear reactors has been undertaken by EG and G Idaho, Inc. at the Idaho National Engineering Laboratory (INEL). This system has been designed for cross section generation, one-dimensional discrete-ordinates analysis, one-, two-, and three-dimensional diffusion theory analysis, and various other radiation transport applications of interest. Several code modules are now operational; others are still under development. Use of desktop microcomputers rather than mainframe systems for complex scientific calculations offers several distinct advantages. These include economy, user convenience, and local, decentralized control of calculations. In addition to INEL applications, this code system could be extremely useful outside of the National Laboratory environment where access to appropriate mainframe computing systems may be limited. Outside users may include universities and some utilities and consultant organizations.

  16. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Shih

    2010-01-01

    Full Text Available This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. An entropy criterion is applied to select effective Gabor features, i.e., a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small-sample-size and ill-posed problems suffered by QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.
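
    The boosting scheme, Gabor features and PSO tuning are not reproduced here; the sketch below only illustrates the RDA idea itself, shrinking each class covariance toward the pooled covariance (lambda) and toward a scaled identity (gamma) inside a simple Gaussian discriminant; the data and parameter values are illustrative.

        # Minimal regularized discriminant analysis (RDA) sketch.
        import numpy as np

        def fit_rda(X, y, lam=0.5, gamma=0.1):
            classes = np.unique(y)
            pooled = np.cov(X, rowvar=False)
            params = {}
            for c in classes:
                Xc = X[y == c]
                S = (1 - lam) * np.cov(Xc, rowvar=False) + lam * pooled
                S = (1 - gamma) * S + gamma * np.trace(S) / X.shape[1] * np.eye(X.shape[1])
                params[c] = (Xc.mean(axis=0), np.linalg.inv(S),
                             np.linalg.slogdet(S)[1], np.log(len(Xc) / len(X)))
            return params

        def predict_rda(params, X):
            scores = []
            for mu, Sinv, logdet, logprior in params.values():
                d = X - mu
                scores.append(-0.5 * np.einsum("ij,jk,ik->i", d, Sinv, d)
                              - 0.5 * logdet + logprior)
            classes = list(params.keys())
            return np.array(classes)[np.argmax(scores, axis=0)]

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0, 1, (30, 5)), rng.normal(1.5, 1, (30, 5))])
        y = np.array([0] * 30 + [1] * 30)
        print("training accuracy:", (predict_rda(fit_rda(X, y), X) == y).mean())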

  17. Method of narcissus analysis in infrared system based on ASAP

    Science.gov (United States)

    Ren, Guodong; Zhang, Liang; Lan, Weihua; Pan, Xiaodong

    2015-11-01

    Narcissus in cooled infrared systems must be strictly controlled, so accurate and rapid narcissus analysis is very important. The SNR of narcissus is derived based on the definition of noise-equivalent power. The simulation software packages CodeV and ASAP are used to analyze the narcissus: the optical surfaces whose narcissus contributions are serious are first screened out in CodeV; the model of the system is then built in ASAP with reasonable surface properties added, and the size and average irradiance of the narcissus spot in the image are obtained by real ray tracing. The narcissus SNR is calculated by substituting the average irradiance into the preceding formulation. On this basis, simulation analysis and an experimental test of the narcissus of an infrared lens were performed; the experimental result is consistent with the simulation analysis.

  18. Error Analysis of English Writing Based on Interlanguage Theory

    Institute of Scientific and Technical Information of China (English)

    李玲

    2014-01-01

    The language learning process is haunted by learners' errors, which are an unavoidable phenomenon. In the 1950s and 1960s, Contrastive Analysis (CA), based on behaviorism and structuralism, was generally employed in analyzing learners' errors. CA soon lost its popularity. Error Analysis (EA), a branch of applied linguistics, has made great contributions to the study of second language learning and throws some light on the process of second language learning. Careful study of the errors reveals the common problems shared by language learners. Writing is important in the language learning process. In the Chinese context, English writing is always a difficult question for Chinese teachers and students, so errors in students' written work are unavoidable. In this thesis, the author studies error analysis of English writing with interlanguage theory as its theoretical guidance.

  19. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
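
    GOMA's module-level optimization is not reproduced here; the sketch below only shows the standard hypergeometric enrichment p-value for a single GO term, the term-level score on which such enrichment tools build; the gene counts are hypothetical.

        # Standard hypergeometric enrichment p-value for a single GO term.
        from scipy.stats import hypergeom

        def go_enrichment_pvalue(n_genes_total, n_term_genes, n_study_genes, n_overlap):
            """P(observing >= n_overlap study genes annotated to the term by chance)."""
            return hypergeom.sf(n_overlap - 1, n_genes_total, n_term_genes, n_study_genes)

        # Hypothetical numbers: 20000 background genes, 300 annotated to the term,
        # 150 genes in the study list, 12 of which carry the annotation.
        print(go_enrichment_pvalue(20000, 300, 150, 12))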

  20. Error Analysis of English Writing Based on Interlanguage Theory

    Institute of Scientific and Technical Information of China (English)

    李玲

    2014-01-01

    The language learning process is haunted by learners' errors, which are an unavoidable phenomenon. In the 1950s and 1960s, Contrastive Analysis (CA), based on behaviorism and structuralism, was generally employed in analyzing learners' errors. CA soon lost its popularity. Error Analysis (EA), a branch of applied linguistics, has made great contributions to the study of second language learning and throws some light on the process of second language learning. Careful study of the errors reveals the common problems shared by language learners. Writing is important in the language learning process. In the Chinese context, English writing is always a difficult question for Chinese teachers and students, so errors in students' written work are unavoidable. In this thesis, the author studies error analysis of English writing with interlanguage theory as its theoretical guidance.

  1. Topic recipe-based social simulation for research dynamics analysis

    OpenAIRE

    Lee, Keeheon; Kim, Chang Ouk

    2014-01-01

    In this paper, we introduce an agent-based modeling and simulation model for research dynamics analysis. Since researchers constitute research systems in research dynamics, modelling the behavior of a researcher is a key to this method. A researcher makes topic recipes for research products projecting his/her interest and fulfilling financial needs under his/her capability and topical trends. A topic recipe means a combination of topics in a research field. A topic can be related to a methodo...

  2. INNOVATION ANALYSIS BASED ON SCORES AT THE FIRM LEVEL

    OpenAIRE

    Cătălin George ALEXE; Cătălina Monica ALEXE; Gheorghe MILITARU

    2014-01-01

    Innovation analysis based on scores (Innovation Scorecard) is a simple way to obtain a quick diagnosis of a firm's innovation potential as it works toward building innovation capability. It aims to identify and remedy deficient aspects of innovation management, being used within the innovation audit as a tool for measuring innovation initiatives over time. The paper aims to present the advantages and disadvantages of using the method, and the three approaches devel...

  3. COMMERCIAL VIABILITY ANALYSIS OF LIGNIN BASED CARBON FIBRE

    OpenAIRE

    Michael Chien-Wei Chen

    2014-01-01

    Lignin is a rich renewable source of aromatic compounds. As a potential petroleum feedstock replacement, lignin can reduce environmental impacts such as carbon emissions. Due to its complex chemical structure, lignin is currently underutilized. Exploiting lignin as a precursor for carbon fibre adds high economic value to lignin and encourages further development in lignin extraction technology. This report includes a preliminary cost analysis and identifies the key aspects of lignin-based carbon fi...

  4. A DST-based approach for construction project risk analysis

    OpenAIRE

    Taroun, A; J-B Yang

    2013-01-01

    Despite its huge potential in risk analysis, the Dempster–Shafer Theory of Evidence (DST) has not received enough attention in construction management. This paper presents a DST-based approach for structuring personal experience and professional judgment when assessing construction project risk. DST was innovatively used to tackle the problem of insufficient information by enabling analysts to provide incomplete assessments. Risk cost is used as a common scale for measuring risk im...

  5. Performance-analysis-based gas turbine diagnostics: a review.

    OpenAIRE

    Li, Y.G.

    2002-01-01

    Gas turbine diagnostics has a history almost as long as gas turbine development itself. Early engine fault diagnosis was carried out based on manufacturer information supplied in a technical manual combined with maintenance experience. In the late 1960s, when Urban introduced Gas Path Analysis, gas turbine diagnostics made a big breakthrough. Since then, different methods have been developed and used in both aero and industrial applications. Until now a substantial number of papers have been p...

  6. Comparative analysis of some brushless motors based on catalog data

    OpenAIRE

    Anton Kalapish; Dimitar Sotirov; Dimitrina Koeva

    2005-01-01

    Brushless motors (polyphase AC induction, synchronous and brushless DC motors) have no alternative in modern electric drives. They are highly efficient and cover a very wide range of speeds. The objective of this paper is to present some relations between the basic parameters and magnitudes of electrical machines. This allows a comparative analysis to be made and a motor to be chosen for each particular case based not only on catalogue data or sale price.

  7. Software for probabilistic-based durability analysis of concrete structures

    OpenAIRE

    Ferreira, Rui Miguel, ed. lit.; Jalali, Said

    2005-01-01

    In recent years, much research work has been carried out in order to obtain a more controlled durability and long-term performance of concrete structures in chloride-containing environments. In particular, the development of new procedures for probability-based durability design has been shown to provide a more realistic basis for the analysis. Although relevant data is still lacking, this approach has been successfully applied to several new concrete structures, where requirements for a ...

  8. Software for probability-based durability analysis of concrete structures

    OpenAIRE

    Ferreira, Rui Miguel, ed. lit.; Jalali, Said

    2005-01-01

    In recent years, much research work has been carried out in order to obtain a more controlled durability and long-term performance of concrete structures in chloride containing environments. In particular, the development of new procedures for probability-based durability design has been shown to provide a more realistic basis for the analysis. Although relevant data is still lacking, this approach has been successfully applied to several new concrete structures, where requirements for ...

  9. Innovative Data Mining Based Approaches for Life Course Analysis

    OpenAIRE

    Ritschard, Gilbert; Gabadinho, Alexis; Mueller, Nicolas Séverin; Studer, Matthias

    2007-01-01

    This communication presents a recently started research project that aims to explore the possibilities of using data-mining-based methods in personal life course analysis. The project also has a socio-demographic goal, namely to gain new insights into how socio-demographic, familial, educational and professional events are entwined, into the characteristics of typical Swiss life trajectories, and into changes in these characteristics over time. Methods for analyzing personal event histor...

  10. Wavelet Variance Analysis of EEG Based on Window Function

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yuan-zhuang; YOU Rong-yi

    2014-01-01

    A new wavelet variance analysis method based on a window function is proposed to investigate the dynamical features of the electroencephalogram (EEG). The experimental results show that the wavelet energy of epileptic EEGs is more discrete than that of normal EEGs, and that the variation of the wavelet variance differs between epileptic and normal EEGs as the time-window width increases. Furthermore, it is found that the wavelet subband entropy (WSE) of epileptic EEGs is lower than that of normal EEGs.
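
    A minimal sketch of the quantities involved, assuming the PyWavelets package: subband energies from a discrete wavelet decomposition of a signal window and the wavelet subband entropy computed from their normalized values; the wavelet choice, decomposition level and synthetic signal are assumptions.

        # Sketch of wavelet subband energy and subband entropy for an EEG-like signal.
        import numpy as np
        import pywt

        rng = np.random.default_rng(0)
        signal = rng.normal(size=1024)                 # stand-in for one EEG window

        coeffs = pywt.wavedec(signal, "db4", level=5)  # [cA5, cD5, cD4, ..., cD1]
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        rel_energy = energies / energies.sum()

        # Wavelet subband entropy: Shannon entropy of the relative subband energies.
        wse = -np.sum(rel_energy * np.log(rel_energy))
        print("relative subband energies:", np.round(rel_energy, 3))
        print("wavelet subband entropy:", round(float(wse), 3))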

  11. BLAT-Based Comparative Analysis for Transposable Elements: BLATCAT

    OpenAIRE

    Sangbum Lee; Sumin Oh; Keunsoo Kang; Kyudong Han

    2014-01-01

    The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, transposable elements (TEs) are of particular importance because they account for ~45% of primate genomes, they can regulate gene expression levels, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program ca...

  12. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    OpenAIRE

    Lee Chien-Cheng; Huang Shin-Sheng; Shih Cheng-Yuan

    2010-01-01

    This paper presents a novel and effective method for facial expression recognition including happiness, disgust, fear, anger, sadness, surprise, and neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. Entropy criterion is applied to select the effective Gabor feature which is a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RD...

  13. A Multilevel Analysis of Problem-Based Learning Design Characteristics

    OpenAIRE

    Scott, Kimberly S.

    2014-01-01

    The increasing use of experience-centered approaches like problem-based learning (PBL) by learning and development practitioners and management educators has raised interest in how to design, implement and evaluate PBL in that field. Of particular interest is how to evaluate the relative impact of design characteristics that exist at the individual and team levels of analysis. This study proposes and tests a multilevel model of PBL design characteristics. Participant perceptions of PBL design...

  14. An Ultrasound Image Despeckling Approach Based on Principle Component Analysis

    OpenAIRE

    Jawad F. Al-Asad; Ali M. Reza; Udomchai Techavipoo

    2014-01-01

    An approach based on principal component analysis (PCA) to filter out multiplicative noise from ultrasound images is presented in this paper. An image with speckle noise is segmented into small dyadic lengths, depending on the original size of the image, and the global covariance matrix is found. A projection matrix is then formed by selecting the maximum eigenvectors of the global covariance matrix. This projection matrix is used to filter speckle noise by projecting each segment into the si...

  15. GIS-BASED SPATIAL STATISTICAL ANALYSIS OF COLLEGE GRADUATES EMPLOYMENT

    OpenAIRE

    Tang, R.

    2012-01-01

    It is urgently necessary to know the distribution and employment status of college graduates for the proper allocation of human resources and the overall arrangement of strategic industries. This study provides empirical evidence regarding the use of geocoding and spatial analysis in the distribution and employment status of college graduates, based on data from the 2004–2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units were ...

  16. Analysis of Chimpanzee History Based on Genome Sequence Alignments

    OpenAIRE

    Caswell, Jennifer L.; Richter, Daniel J.; Neubauer, Julie; Schirmer, Christine; Gnerre, Sante; Mallick, Swapan; Reich, David Emil

    2008-01-01

    Population geneticists often study small numbers of carefully chosen loci, but it has become possible to obtain orders of magnitude more data from overlaps of genome sequences. Here, we generate tens of millions of base pairs of multiple sequence alignments from combinations of three western chimpanzees, three central chimpanzees, an eastern chimpanzee, a bonobo, a human, an orangutan, and a macaque. Analysis provides a more precise understanding of demographic history than was previously...

  17. Analysis of Corporate Governance Performance Based on Grey System Theory

    OpenAIRE

    Hong Li

    2009-01-01

    This article sets up a grey relation analysis (GRA) model and studies the internal relevance between the factors of independent director participation in corporate governance and corporate performance, based on listed companies in the electricity production and commercial retail sectors in 2004. The research shows that, among the 4 factors of independent director participation in corporate governance, the most important factor impacting the achievement of the compa...

  18. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    This paper describes three successive studies on the ageing of protection automation in nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis and applying it to identify the most critical components from the ageing and safety points of view. The analyses also resulted in suggestions for improving data collection systems for the purpose of further ageing analyses. (author)

  19. Comparative analysis of some brushless motors based on catalog data

    Directory of Open Access Journals (Sweden)

    Anton Kalapish

    2005-10-01

    Full Text Available Brushless motors (polyphase AC induction, synchronous and brushless DC motors) have no alternative in modern electric drives. They are highly efficient and cover a very wide range of speeds. The objective of this paper is to present some relations between the basic parameters and magnitudes of electrical machines. This allows a comparative analysis to be made and a motor to be chosen for each particular case based not only on catalogue data or sale price.

  20. Text-based ontology construction using relational concept analysis

    OpenAIRE

    Bendaoud, Rokia; Rouane Hacene, Amine Mohamed; Toussaint, Yannick; Delecroix, Bertrand; Napoli, Amedeo

    2007-01-01

    We present a semi-automated process that constructs an ontology based on a collection of document abstracts for a given domain. The proposed process relies on formal concept analysis (FCA), an algebraic method for the derivation of a conceptual hierarchy, namely a 'concept lattice', starting from a data context, i.e., a set of individuals provided with their properties. First, we show how various contexts are extracted and then how concepts of the corresponding lattices are turned int...

  1. Risk portofolio management under Zipf analysis based strategies

    CERN Document Server

    Bronlet, M A P

    2005-01-01

    A so-called Zipf analysis portfolio management technique is introduced in order to comprehend risk and returns. Two portfolios are built, each from a well-known financial index. The portfolio management is based on two approaches: one called the ''equally weighted portfolio'', the other the ''confidence parametrized portfolio''. A discussion of the (yearly) expected return, variance, Sharpe ratio and β follows. Optimization levels of high returns or low risks are found.

  2. Frame-based safety analysis approach for decision-based errors

    International Nuclear Information System (INIS)

    A frame-based approach is proposed to analyze decision-based errors made by automatic controllers or human operators due to erroneous reference frames. An integrated framework, the Two Frame Model (TFM), is first proposed to model the dynamic interaction between the physical process and the decision-making process. Two important issues, consistency and competing processes, are raised. Consistency between the physical and logic frames makes a TFM-based system work properly. Loss of consistency refers to the failure mode in which the logic frame does not accurately reflect the state of the controlled processes. Once such a failure occurs, hazards may arise. Among potential hazards, the competing effect between the controller and the controlled process is the most severe one, which may jeopardize a defense-in-depth design. When the logic and physical frames are inconsistent, conventional safety analysis techniques are inadequate. We propose Frame-based Fault Tree Analysis (FFTA) and Frame-based Event Tree Analysis (FETA) under the TFM to deduce the context for decision errors and to separately generate the evolution of the logical frame as opposed to that of the physical frame. This multi-dimensional analysis approach, different from the conventional correctness-centred approach, provides a panoramic view in scenario generation. Case studies using the proposed techniques are also given to demonstrate their usage and feasibility.

  3. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.

  4. Weather data analysis based on typical weather sequence analysis. Application: energy building simulation

    CERN Document Server

    David, Mathieu; Garde, Francois; Boyer, Harry

    2014-01-01

    In building studies dealing with energy efficiency and comfort, simulation software needs relevant weather files with optimal time steps. Few tools generate extreme and mean values of simultaneous hourly data, including the correlation between climatic parameters. This paper presents the C++ Runeole software, based on typical weather sequence analysis. It runs an analysis process of a stochastic continuous multivariable phenomenon with frequency properties applied to a climatic database. The database analysis combines basic statistics, PCA (Principal Component Analysis) and automatic classifications. Different ways of applying these methods will be presented. All the results are stored in the Runeole internal database, which allows an easy selection of weather sequences. The extreme sequences are used for system and building sizing and the mean sequences are used for the determination of the annual cooling loads as proposed by Audrier-Cros (Audrier-Cros, 1984). This weather analysis was tested with the datab...

  5. Voxel selection in FMRI data analysis based on sparse representation.

    Science.gov (United States)

    Li, Yuanqing; Namburi, Praneeth; Yu, Zhuliang; Guan, Cuntai; Feng, Jianfeng; Gu, Zhenghui

    2009-10-01

    Multivariate pattern analysis approaches toward detection of brain regions from fMRI data have been gaining attention recently. In this study, we introduce an iterative sparse-representation-based algorithm for detection of voxels in functional MRI (fMRI) data with task-relevant information. In each iteration of the algorithm, a linear programming problem is solved and a sparse weight vector is subsequently obtained. The final weight vector is the mean of those obtained in all iterations. The characteristics of our algorithm are as follows: 1) the weight vector (output) is sparse; 2) the magnitude of each entry of the weight vector represents the significance of its corresponding variable or feature in a classification or regression problem; and 3) due to the convergence of this algorithm, a stable weight vector is obtained. To demonstrate the validity of our algorithm and illustrate its application, we apply the algorithm to the Pittsburgh Brain Activity Interpretation Competition 2007 functional fMRI dataset for selecting the voxels, which are the most relevant to the tasks of the subjects. Based on this dataset, the aforementioned characteristics of our algorithm are analyzed, and a comparison of our method with the univariate general-linear-model-based statistical parametric mapping is performed. Using our method, a combination of voxels are selected based on the principle of effective/sparse representation of a task. Data analysis results in this paper show that this combination of voxels is suitable for decoding tasks and demonstrate the effectiveness of our method. PMID:19567340
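
    The paper's iterative scheme and the averaging of weight vectors are not reproduced here; the sketch below only shows one sparse-representation step posed as a linear program (minimize the l1 norm of the weights subject to a linear fit), using SciPy's linprog on synthetic stand-in data.

        # One sparse-representation step as a linear program: minimize ||w||_1 subject
        # to Xw = y, by splitting w into nonnegative parts u and v (w = u - v).
        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(0)
        n_samples, n_voxels = 30, 100
        X = rng.normal(size=(n_samples, n_voxels))      # stand-in for voxel time courses
        w_true = np.zeros(n_voxels); w_true[:5] = 1.0   # only 5 "task-relevant" voxels
        y = X @ w_true                                  # stand-in for the task labels

        c = np.ones(2 * n_voxels)                       # objective: sum(u) + sum(v)
        A_eq = np.hstack([X, -X])                       # X(u - v) = y
        res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
        w = res.x[:n_voxels] - res.x[n_voxels:]
        print("largest-weight voxels:", np.argsort(np.abs(w))[-5:])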

  6. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko eKozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on the widespread association of signaling molecules has revealed the essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior remains mostly unanalyzed. Recent technological advances in mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on phosphoproteome dynamics accelerate the generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which were first applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened up a gate for systems analysis of signaling networks involved in immune response and cancer.

  7. Voxel-based texture analysis of the brain.

    Science.gov (United States)

    Maani, Rouzbeh; Yang, Yee Hong; Kalra, Sanjay

    2015-01-01

    This paper presents a novel voxel-based method for texture analysis of brain images. Texture analysis is a powerful quantitative approach for analyzing voxel intensities and their interrelationships, but has been thus far limited to analyzing regions of interest. The proposed method provides a 3D statistical map comparing texture features on a voxel-by-voxel basis. The validity of the method was examined on artificially generated effects as well as on real MRI data in Alzheimer's Disease (AD). The artificially generated effects included hyperintense and hypointense signals added to T1-weighted brain MRIs from 30 healthy subjects. The AD dataset included 30 patients with AD and 30 age/sex matched healthy control subjects. The proposed method detected artificial effects with high accuracy and revealed statistically significant differences between the AD and control groups. This paper extends the usage of texture analysis beyond the current region of interest analysis to voxel-by-voxel 3D statistical mapping and provides a hypothesis-free analysis tool to study cerebral pathology in neurological diseases. PMID:25756621

  8. Voxel-based texture analysis of the brain.

    Directory of Open Access Journals (Sweden)

    Rouzbeh Maani

    Full Text Available This paper presents a novel voxel-based method for texture analysis of brain images. Texture analysis is a powerful quantitative approach for analyzing voxel intensities and their interrelationships, but has been thus far limited to analyzing regions of interest. The proposed method provides a 3D statistical map comparing texture features on a voxel-by-voxel basis. The validity of the method was examined on artificially generated effects as well as on real MRI data in Alzheimer's Disease (AD). The artificially generated effects included hyperintense and hypointense signals added to T1-weighted brain MRIs from 30 healthy subjects. The AD dataset included 30 patients with AD and 30 age/sex matched healthy control subjects. The proposed method detected artificial effects with high accuracy and revealed statistically significant differences between the AD and control groups. This paper extends the usage of texture analysis beyond the current region of interest analysis to voxel-by-voxel 3D statistical mapping and provides a hypothesis-free analysis tool to study cerebral pathology in neurological diseases.

  9. Least Dependent Component Analysis Based on Mutual Information

    CERN Document Server

    Stögbauer, H; Astakhov, S A; Grassberger, P; St\\"ogbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-01-01

    We propose to use precise estimators of mutual information (MI) to find least dependent components in a linearly mixed signal. On the one hand this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand it has the advantage, compared to other implementations of `independent' component analysis (ICA) some of which are based on crude approximations for MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based ...
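
    As a rough stand-in for the MI scoring step, the sketch below uses scikit-learn's k-nearest-neighbour mutual information estimator (a Kraskov-style estimator) to compare the dependence between two independent sources and between two linearly mixed channels; it is not the MILCA package, and the mixing matrix is arbitrary.

        # Estimate residual pairwise dependence with a kNN mutual information estimator.
        import numpy as np
        from sklearn.feature_selection import mutual_info_regression

        rng = np.random.default_rng(0)
        s = rng.normal(size=(2000, 2))                 # two independent "sources"
        mixed = s @ np.array([[1.0, 0.6], [0.3, 1.0]]) # a linear mixture of them

        def pairwise_mi(a, b, k=3):
            return mutual_info_regression(a.reshape(-1, 1), b, n_neighbors=k)[0]

        print("MI between independent sources:", round(pairwise_mi(s[:, 0], s[:, 1]), 3))
        print("MI between mixed channels:     ", round(pairwise_mi(mixed[:, 0], mixed[:, 1]), 3))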

  10. Forming Teams for Teaching Programming based on Static Code Analysis

    CERN Document Server

    Arosemena-Trejos, Davis; Clunie, Clifton

    2012-01-01

    The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these groups are formed without taking certain aspects into account, they may cause an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is obtained, and the tool evaluates the best formation of teams in a given course. The team formations are based on programming styles, skills, pair programming or working with a leader.

  11. Data base for analysis of RA-1 reactor decommissioning

    International Nuclear Information System (INIS)

    Full text: The RA-1 'Enrico Fermi' reactor is located at the Constituyentes Atomic Centre near the city of Buenos Aires. It reached criticality for the first time on January 17th, 1958. During 45 years, major modifications were introduced. The decommissioning of RA-1 is not foreseen in the near future, but CNEA (legally responsible for the dismantling and decommissioning of all relevant nuclear facilities in Argentina) has nevertheless started dismantling and decommissioning planning activities. As a part of these activities, a historical information data base of the RA-1 reactor was compiled. This report contains a set of global and specific data of the RA-1 research reactor prepared as a data base for a future decommissioning analysis. It describes the whole installation, and especially the core, fuel type, present configuration, shielding and the operation devices. An exhaustive listing of materials and components located in the reactor building is given. It also reconstructs the reactor operation history based on the available information. (author)

  12. DNA sequence analysis using hierarchical ART-based classification networks

    Energy Technology Data Exchange (ETDEWEB)

    LeBlanc, C.; Hruska, S.I. [Florida State Univ., Tallahassee, FL (United States); Katholi, C.R.; Unnasch, T.R. [Univ. of Alabama, Birmingham, AL (United States)

    1994-12-31

    Adaptive resonance theory (ART) describes a class of artificial neural network architectures that act as classification tools which self-organize, work in real-time, and require no retraining to classify novel sequences. We have adapted ART networks to provide support to scientists attempting to categorize tandem repeat DNA fragments from Onchocerca volvulus. In this approach, sequences of DNA fragments are presented to multiple ART-based networks which are linked together into two (or more) tiers; the first provides coarse sequence classification while the subsequent tiers refine the classifications as needed. The overall rating of the resulting classification of fragments is measured using statistical techniques based on those introduced to validate results from traditional phylogenetic analysis. Tests of the Hierarchical ART-based Classification Network, or HABclass network, indicate its value as a fast, easy-to-use classification tool which adapts to new data without retraining on previously classified data.

  13. Depth-based selective image reconstruction using spatiotemporal image analysis

    Science.gov (United States)

    Haga, Tetsuji; Sumi, Kazuhiko; Hashimoto, Manabu; Seki, Akinobu

    1999-03-01

    In industrial plants, a remote monitoring system which removes the need for physical tour inspection is often considered desirable. However, the image sequence obtained from a mobile inspection robot is hard to interpret because the objects of interest are often partially occluded by obstacles such as pillars or fences. Our aim is to improve the image sequence so as to increase the efficiency and reliability of remote visual inspection. We propose a new depth-based image processing technique, which removes the needless objects from the foreground and recovers the occluded background electronically. Our algorithm is based on spatiotemporal analysis that enables fine and dense depth estimation, depth-based precise segmentation, and accurate interpolation. We apply this technique to a real image sequence obtained from the mobile inspection robot. The resulting image sequence is satisfactory in that the operator can make correct visual inspections with less fatigue.

  14. Theoretical Performance Analysis of Eigenvalue-based Detection

    CERN Document Server

    Penna, Federico

    2009-01-01

    In this paper we develop a complete analytical framework based on Random Matrix Theory for the performance evaluation of Eigenvalue-based Detection. While, up to now, analysis was limited to the false-alarm probability, we have obtained an analytical expression also for the probability of missed detection, by using the theory of spiked population models. A general scenario with multiple signals present at the same time is considered. The theoretical results of this paper allow the error probabilities to be predicted, and the decision threshold to be set accordingly, by means of a few mathematical formulae. In this way the design of an eigenvalue-based detector is made conceptually identical to that of a traditional energy detector. As additional results, the paper discusses the conditions of signal identifiability for single and multiple sources. All the analytical results are validated through numerical simulations, covering also convergence, identifiability and non-Gaussian practical modulations.
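
    The paper's analytical thresholds from spiked-population results are not reproduced here; the sketch below only illustrates the detector structure, comparing the ratio of the largest to the smallest eigenvalue of the sample covariance of the received snapshots against a threshold; the threshold value and signal model are illustrative.

        # Sketch of an eigenvalue-ratio detector on synthetic multi-antenna snapshots.
        import numpy as np

        def eigenvalue_ratio(snapshots):
            """snapshots: array of shape (n_antennas, n_samples)."""
            R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
            eig = np.linalg.eigvalsh(R)                               # ascending order
            return eig[-1] / eig[0]

        rng = np.random.default_rng(0)
        n_rx, n_samp = 4, 500
        noise = rng.normal(size=(n_rx, n_samp))
        signal = np.outer(rng.normal(size=n_rx), np.sin(0.2 * np.arange(n_samp)))

        threshold = 2.0                                 # illustrative, not the RMT value
        for name, x in [("noise only", noise), ("signal + noise", noise + signal)]:
            r = eigenvalue_ratio(x)
            print(name, "ratio:", round(float(r), 2), "->",
                  "detect" if r > threshold else "no detect")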

  15. FLDA: Latent Dirichlet Allocation Based Unsteady Flow Analysis.

    Science.gov (United States)

    Hong, Fan; Lai, Chufan; Guo, Hanqi; Shen, Enya; Yuan, Xiaoru; Li, Sikun

    2014-12-01

    In this paper, we present a novel feature extraction approach called FLDA for unsteady flow fields, based on the Latent Dirichlet allocation (LDA) model. Analogous to topic modeling in text analysis, in our approach pathlines and features in a given flow field are defined as documents and words, respectively. Flow topics are then extracted based on Latent Dirichlet allocation. Different from other feature extraction methods, our approach clusters pathlines with probabilistic assignment and, at the same time, aggregates features into meaningful topics. We build a prototype system to support exploration of unsteady flow fields with our proposed LDA-based method. Interactive techniques are also developed to explore the extracted topics and to gain insight from the data. We conduct case studies to demonstrate the effectiveness of our proposed approach. PMID:26356968
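
    A minimal scikit-learn sketch of the pathlines-as-documents idea is shown below, assuming a precomputed pathline-by-feature count matrix (the quantization of flow features into "words" is the part this sketch does not attempt); it is not the authors' FLDA system.

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

# counts[i, j] = number of times quantized flow feature j occurs along pathline i
# (hypothetical data standing in for a real pathline/feature extraction step)
rng = np.random.default_rng(1)
counts = rng.poisson(lam=2.0, size=(200, 50))

lda = LatentDirichletAllocation(n_components=5, random_state=0)
pathline_topics = lda.fit_transform(counts)   # soft topic assignment per pathline

# the topic with the highest probability gives a hard clustering of pathlines
hard_labels = pathline_topics.argmax(axis=1)
# feature-to-topic weights describe which flow features characterize each topic
topic_features = lda.components_
print(hard_labels[:10], topic_features.shape)
```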

  16. Forming Teams for Teaching Programming based on Static Code Analysis

    Directory of Open Access Journals (Sweden)

    Davis Arosemena-Trejos

    2012-03-01

    Full Text Available The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these teams are formed without taking certain aspects into account, they may have an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). This semantics is based on metrics extracted from the preferences, styles and good programming practices. All this is achieved through a static analysis of the code that each student develops. In this way, a record of each student with the extracted information is available, and the tool evaluates the best formation of teams in a given course. Team formation can be based on programming styles, skills, pair programming, or teams with a leader.

  17. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    DEFF Research Database (Denmark)

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna;

    2013-01-01

    Biodiesel as a promising alternative energy resource has been a hot spot in chemical engineering nowadays, but there is also an argument about the sustainability of biodiesel. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various...... kinds of crop-based biodiesel including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options are studied by emergy analysis; the soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. The DEA method is used...... to evaluate the sustainability efficiencies of these options, and the biodiesel production systems based on soybean, sunflower, and palm are considered DEA efficient, whereas rapeseed-based and jatropha-based scenarios need to be improved, and the improved methods have also been specified....

  18. Cost analysis of simulated base-catalyzed biodiesel production processes

    International Nuclear Information System (INIS)

    Highlights: • Two semi-continuous biodiesel production processes from sunflower oil are simulated. • Simulations were based on the kinetics of base-catalyzed methanolysis reactions. • The total energy consumption was influenced by the kinetic model. • Heterogeneous base-catalyzed process is a preferable industrial technology. - Abstract: The simulation and economic feasibility evaluation of semi-continuous biodiesel production from sunflower oil were based on the kinetics of homogeneously (Process I) and heterogeneously (Process II) base-catalyzed methanolysis reactions. The plant's annual capacity was determined to be 8356 tonnes of biodiesel. The total energy consumption was influenced by the unit model describing the methanolysis reaction kinetics. The energy consumption of Process II was more than 2.5 times lower than that of Process I. The simulation also showed that Process I required more, and larger, process equipment units than Process II. Based on lower total capital investment costs and biodiesel selling price, Process II was economically more feasible than Process I. Sensitivity analysis was conducted using variable sunflower oil and biodiesel prices. Using a biodiesel selling price of 0.990 $/kg, Processes I and II were shown to be economically profitable if the sunflower oil price was 0.525 $/kg and 0.696 $/kg, respectively.

  19. Gating treatment delivery QA based on a surrogate motion analysis

    International Nuclear Information System (INIS)

    Full text: To develop a methodology to estimate intrafractional target position error during a phase-based gated treatment. Westmead Cancer Care Centre is using respiratory-correlated, phase-based gated beam delivery in the treatment of lung cancer. The gating technique is managed by the Varian Real-time Position Management (RPM) system, version 1.7.5. A 6-dot block is placed on the abdomen of the patient and acts as a surrogate for the target motion. During a treatment session, the motion of the surrogate can be recorded by the RPM application. Analysis of the surrogate motion file by in-house-developed software allows the intrafractional error of the treatment session to be computed. To validate the computed error, a simple test that involves the introduction of deliberate errors is performed. Errors of up to 1.1 cm are introduced to a metal marker placed on a surrogate using the Varian Breathing Phantom. The moving marker was scanned in prospective mode using a GE Lightspeed 16 CT scanner. Using the CT images, the difference of the marker position with and without introduced errors is compared to the errors calculated from the surrogate motion. The average and standard deviation of the difference between the calculated target position errors and the measured artificially introduced marker position errors are 0.02 cm and 0.07 cm, respectively. Conclusion: The calculated target positional error based on surrogate motion analysis provides a quantitative measure of intrafractional target positional errors during treatment. Routine QA for gated treatment using surrogate motion analysis is relatively quick and simple.

  20. Clustering analysis of ancient celadon based on SOM neural network

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this study, the chemical compositions of 48 fragments of ancient ceramics excavated from 4 archaeological kiln sites located in 3 cities (Hangzhou, Cixi and Longquan in Zhejiang Province, China) were examined by the energy-dispersive X-ray fluorescence (EDXRF) technique. The SOM method was then introduced into the clustering analysis based on the major and minor element compositions of the bodies; the results showed that the 48 samples could be perfectly separated into the 3 locations, Hangzhou, Cixi and Longquan. Because the major and minor element compositions of the two Royal Kilns were similar to each other, the classification accuracy over them was merely 76.92%. In view of this, the authors carried out a second SOM clustering analysis based on the trace element compositions of the bodies, and the classification accuracy rose to 84.61%. These results indicated that discrepancies in the trace element compositions of the bodies of the ancient ceramics excavated from the two Royal Kiln sites were more distinct than those in the major and minor element compositions, which is in accordance with the facts. We argue that SOM can be employed in the clustering analysis of ancient ceramics.
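
    A sketch of SOM-based clustering of compositional data is shown below, using the third-party MiniSom package (an assumption, not the tool used in the study); the element concentrations are randomly generated placeholders.

```python
import numpy as np
from minisom import MiniSom   # third-party package: pip install minisom

# rows = ceramic fragments, columns = normalized element concentrations
rng = np.random.default_rng(2)
compositions = rng.random((48, 9))

som = MiniSom(x=6, y=6, input_len=compositions.shape[1],
              sigma=1.0, learning_rate=0.5, random_seed=0)
som.random_weights_init(compositions)
som.train_random(compositions, num_iteration=2000)

# samples mapped to the same (or neighbouring) grid cells form a cluster
winners = [som.winner(v) for v in compositions]
print(winners[:5])
```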

  1. Techno-Economic Analysis of Biofuels Production Based on Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Swanson, R. M.; Platon, A.; Satrio, J. A.; Brown, R. C.; Hsu, D. D.

    2010-11-01

    This study compares capital and production costs of two biomass-to-liquid production plants based on gasification. The first biorefinery scenario is an oxygen-fed, low-temperature (870 °C), non-slagging, fluidized bed gasifier. The second scenario is an oxygen-fed, high-temperature (1,300 °C), slagging, entrained flow gasifier. Both are followed by catalytic Fischer-Tropsch synthesis and hydroprocessing to naphtha-range (gasoline blend stock) and distillate-range (diesel blend stock) liquid fractions. Process modeling software (Aspen Plus) is utilized to organize the mass and energy streams and cost estimation software is used to generate equipment costs. Economic analysis is performed to estimate the capital investment and operating costs. Results show that the total capital investment required for nth plant scenarios is $610 million and $500 million for high-temperature and low-temperature scenarios, respectively. Product value (PV) for the high-temperature and low-temperature scenarios is estimated to be $4.30 and $4.80 per gallon of gasoline equivalent (GGE), respectively, based on a feedstock cost of $75 per dry short ton. Sensitivity analysis is also performed on process and economic parameters. This analysis shows that total capital investment and feedstock cost are among the most influential parameters affecting the PV.

  2. A graph-based system for network-vulnerability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, L.P.; Phillips, C.

    1998-06-01

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
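
    The path-finding step described above can be sketched with networkx: per-edge success probabilities are converted to -log weights so that a standard shortest-path query returns the most probable attack path. The graph, probabilities and node names below are hypothetical.

```python
import math
import networkx as nx

# hypothetical attack graph: nodes are attacker states, edges are attack steps
# annotated with an estimated probability of success
G = nx.DiGraph()
steps = [("outside", "dmz_web", 0.6),
         ("dmz_web", "internal_host", 0.4),
         ("outside", "vpn_gateway", 0.2),
         ("vpn_gateway", "internal_host", 0.7),
         ("internal_host", "database", 0.5)]
for src, dst, p in steps:
    # -log(p) turns "maximize the product of probabilities" into "minimize the sum"
    G.add_edge(src, dst, weight=-math.log(p), prob=p)

path = nx.shortest_path(G, "outside", "database", weight="weight")
prob = math.exp(-nx.shortest_path_length(G, "outside", "database", weight="weight"))
print(path, round(prob, 3))
```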

  3. Finite element analysis of osteoporosis models based on synchrotron radiation

    Science.gov (United States)

    Xu, W.; Xu, J.; Zhao, J.; Sun, J.

    2016-04-01

    With the growing pressure of social aging, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently, synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study characteristic changes in different stages of primary osteoporosis, this research focused on the different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the structure of the trabeculae in osteoporosis degraded as the bone volume decreased. For femurs, the bone volume fraction (bone volume/total volume, BV/TV) decreased from 69% to 43%. That led to an increase in trabecular separation (from 45.05 μm to 97.09 μm) and a reduction in trabecular number (from 7.99 mm⁻¹ to 5.97 mm⁻¹). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' ability to resist compression, bending and torsion gradually became weaker. The compression stiffness of the femurs decreased from 1770.96 F·μm⁻¹ to 697.41 F·μm⁻¹, and the bending and torsion stiffness fell from 1390.80 F·μm⁻¹ to 566.11 F·μm⁻¹ and from 2957.28 N·m/° to 691.31 N·m/°, respectively, indicating a decrease in bone strength, which matched the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength.

  4. Finite element analysis of osteoporosis models based on synchrotron radiation

    International Nuclear Information System (INIS)

    With the growing pressure of social aging, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently, synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study characteristic changes in different stages of primary osteoporosis, this research focused on the different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the structure of the trabeculae in osteoporosis degraded as the bone volume decreased. For femurs, the bone volume fraction (bone volume/total volume, BV/TV) decreased from 69% to 43%. That led to an increase in trabecular separation (from 45.05 μm to 97.09 μm) and a reduction in trabecular number (from 7.99 mm⁻¹ to 5.97 mm⁻¹). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' ability to resist compression, bending and torsion gradually became weaker. The compression stiffness of the femurs decreased from 1770.96 F·μm⁻¹ to 697.41 F·μm⁻¹, and the bending and torsion stiffness fell from 1390.80 F·μm⁻¹ to 566.11 F·μm⁻¹ and from 2957.28 N·m/° to 691.31 N·m/°, respectively, indicating a decrease in bone strength, which matched the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength.

  5. Principle-based concept analysis: Caring in nursing education

    Science.gov (United States)

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Moonaghi, Hossein Karimi; Mazloom, Seyed Reza

    2016-01-01

    Introduction The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. Methods A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. Results The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as “caring pedagogy,” “value-based education,” and “teaching excellence,” caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Conclusion Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development

  6. Plasmacytoma of the Skull Base: A Meta-Analysis.

    Science.gov (United States)

    Na'ara, Shorook; Amit, Moran; Gil, Ziv; Billan, Salem

    2016-02-01

    Objective Extramedullary plasmacytomas are rare tumors. In the current study we aim to characterize its clinical course at the skull base and define the most appropriate therapeutic protocol. Methods We conducted a meta-analysis of articles in the English language that included data on the treatment and outcome of plasmacytoma of the base of skull. Results The study cohort consisted of 47 patients. The tumor originated from the clivus and sphenoclival region in 28 patients (59.5%), the nasopharynx in 10 patients (21.2%), the petrous apex in 5 patients (10.6%), and the orbital roof in 4 patients (8.5%). The chief complaints at presentation included recurrent epistaxis and cranial nerve palsy, according to the site of tumor. Twenty-two patients (46.8%) had surgical treatment; 25 (53.2%) received radiation therapy. Adjuvant therapy was administered in 11 cases (50%) with concurrent multiple myeloma. The 2-year and 5-year overall survival rates were 78% and 59%, respectively. Clear margin resection was achieved in a similar proportion of patients who underwent endoscopic surgery and open surgery (p = 0.83). A multivariate analysis of outcome showed a similar survival rate of patients treated surgically or with radiotherapy. Conclusions The mainstay of treatment for plasmacytoma is based on radiation therapy, but when total resection is feasible, endoscopic resection is a valid option. PMID:26949590

  7. [Determination of body fluid based on analysis of nucleic acids].

    Science.gov (United States)

    Korabečná, Marie

    2015-01-01

    Recent methodological approaches of molecular genetics allow the isolation of nucleic acids (DNA and RNA) from negligible forensic samples. Analysis of these molecules may be used not only for individual identification based on DNA profiling but also for detecting the origin of the body fluid which (alone or in a mixture with other body fluids) forms the examined biological trace. Such an examination can contribute to the evaluation of the procedural, technical and tactical value of the trace. The molecular genetic approaches discussed in this review offer new possibilities in comparison with the traditional spectrum of chemical, immunological and spectroscopic tests, especially with regard to the interpretation of mixtures of biological fluids and to the confirmatory character of the tests. Approaches based on reverse transcription of tissue-specific mRNA and their subsequent polymerase chain reaction (PCR) and fragmentation analysis are applicable to samples containing minimal amounts of biological material. Methods for body fluid discrimination based on the examination of microRNA in samples have so far provided confusing results; therefore, further development in this field is needed. The examination of tissue-specific methylation of nucleotides in selected gene sequences seems to represent a promising enrichment of the methodological spectrum. The detection of DNA sequences of tissue-related bacteria has been established, and it provides satisfactory results mainly in combination with the above-mentioned methodological approaches. PMID:26419517

  8. ALGORITHMS FOR TENNIS RACKET ANALYSIS BASED ON MOTION DATA

    Directory of Open Access Journals (Sweden)

    Maria Skublewska-Paszkowska

    2016-09-01

    Full Text Available Modern technologies, such as motion capture systems (both optical and markerless), are more and more frequently used for athlete performance analysis due to their great precision. Optical systems based on retro-reflective markers allow for tracking the motion of multiple objects of various types. These systems compute human kinetic and kinematic parameters based on biomechanical models. Tracking additional objects like a tennis racket is also a very important aspect of analysing the player's technique and precision. The motion data gathered by motion capture systems may be used for analysing various aspects that may not be recognised by the human eye or a video camera. This paper presents algorithms for the analysis of tennis racket motion during two of the most important tennis strokes: forehand and backhand. An optical Vicon system was used for obtaining the motion data which was the input for the algorithms. They compute the velocity of the tennis racket's head and handle, based on the trajectories of attached markers, as well as the racket's orientation. The algorithms were implemented and tested on data obtained from a professional trainer who participated in the research and performed a series of ten strikes, separately for: (1) forehand without a ball, (2) backhand without a ball, (3) forehand with a ball and (4) backhand with a ball. The computed parameters are gathered in tables and visualised in a graph.

  9. Weight measurement using image-based pose analysis

    Institute of Scientific and Technical Information of China (English)

    Hong Zhang; Kui Zhang; Ying Mu; Ning Yao; Robert J. Sclabassi; Mingui Sun

    2008-01-01

    Image-based gait analysis as a means of biometric identification has attracted much research attention. Most of the existing methods focus on human identification, posture analysis and movement tracking. There have been few investigations on measuring the carried load based on the carrier's gait characteristics by automatic image processing. Nevertheless, this measurement is very useful in a number of applications, such as the study of the effect of carried load on the postural development of children and adolescents. In this paper, we investigate how to automatically estimate the carried weight from a sequence of images. We present a method to extract the human gait silhouette based on the observation that humans tend to minimize energy during motion. We compute several angles of body leaning and determine the relationship between the carried weight, the leaning angles and the centroid location according to a human kinetic study. Our weight determination method has been verified successfully by experiments.

  10. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems: critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  11. Wavelet-based finite element analysis of composites

    International Nuclear Information System (INIS)

    Full text: Wavelet analysis has recently become very popular in the area of composite materials modeling, owing to their multiscale and stochastic nature. Most people involved in the design, manufacturing and use of composites, including engineers and scientists, are usually interested in their global behavior rather than in the multiphysical phenomena appearing at the different scales of their complicated multilevel structure. Therefore, the most important task is to build an efficient mathematical and numerical algorithm to analyze multiscale heterogeneous materials and structures. As is known, thanks to homogenization theory we can follow essentially two different paths to achieve this goal. First, the composite can be directly analyzed using the wavelet-based FEM approach. Alternatively, we can use the wavelet-based homogenization algorithm to determine effective material parameters and then carry out classical FEM or other related computations. The basic difference between these approaches is that in the first method the wavelet decomposition and reconstruction algorithms are incorporated into the matrix FEM computations, so the computer code has to be modified accordingly. The second method is based on rather symbolic computations necessary for the determination of the effective material parameters, while the structural analysis is classical. The computational strategy presented by the author is based on the homogenization method, where the dynamics of a linear elastic heterogeneous beam is studied for the following general case: ∂/∂x (E(x) ∂u/∂x) = ρ(x) ∂²u/∂t², where both the Young's modulus E(x) and the composite mass density ρ(x) are defined by some wavelets. First, the effective material parameters of the beam are determined; then, the structural behavior of the homogenized system is determined numerically and compared against the real structure vibrations. An analogous analysis is done for the composite

  12. Psychoacoustic Music Analysis Based on the Discrete Wavelet Packet Transform

    Directory of Open Access Journals (Sweden)

    Xing He

    2008-01-01

    Full Text Available Psychoacoustical computational models are necessary for the perceptual processing of acoustic signals and have contributed significantly to the development of highly efficient audio analysis and coding. In this paper, we present an approach for the psychoacoustic analysis of musical signals based on the discrete wavelet packet transform. The proposed method mimics the multiresolution properties of the human ear more closely than other techniques, and it includes simultaneous and temporal auditory masking. Experimental results show that this method provides better masking capabilities and reduces the signal-to-masking ratio substantially more than other approaches, without introducing audible distortion. This model can lead to greater audio compression by permitting further bit rate reduction, and to more secure watermarking by providing greater signal space for information hiding.
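
    A minimal sketch of a discrete wavelet packet decomposition using the PyWavelets package is shown below; the per-band energies it produces are the kind of quantity a masking model would operate on, but the psychoacoustic model itself is not reproduced here. Frame length, wavelet and decomposition level are assumptions.

```python
import numpy as np
import pywt   # PyWavelets

# hypothetical audio frame: a low-frequency tone plus noise, 1024 samples
rng = np.random.default_rng(3)
frame = np.sin(2 * np.pi * 0.05 * np.arange(1024)) + 0.1 * rng.normal(size=1024)

level = 5
wp = pywt.WaveletPacket(data=frame, wavelet="db4", mode="symmetric", maxlevel=level)
bands = wp.get_level(level, order="freq")   # sub-bands sorted from low to high frequency
energies = [float(np.sum(np.square(node.data))) for node in bands]
print(len(bands), [round(e, 1) for e in energies[:4]])
```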

  13. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Comparison of expert preference with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.

  14. Analysis and design of a smart card based authentication protocol

    Institute of Scientific and Technical Information of China (English)

    Kuo-Hui YEH; Kuo-Yu TSAI; Jia-Li HOU

    2013-01-01

    Numerous smart card based authentication protocols have been proposed to provide strong system security and robust individual privacy for communication between parties. Nevertheless, most of them do not provide a formal analysis proof, and their security robustness is doubtful. Chang and Cheng (2011) proposed an efficient remote authentication protocol with smart cards and claimed that their proposed protocol could support secure communication in a multi-server environment. Unfortunately, there are opportunities for security enhancement in current schemes. In this paper, we identify the major weakness, i.e., session key disclosure, of a recently published protocol. We consequently propose a novel authentication scheme for a multi-server environment and give formal analysis proofs for the security guarantees.

  15. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid on those poor areas in order to capture the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities, and family composition, the findings show that poverty distribution is strongly correlated with high illiteracy rates, a high percentage of ethnic minorities, and larger family size.

  16. GIS based analysis of future district heating potential in Denmark

    DEFF Research Database (Denmark)

    Nielsen, Steffen; Möller, Bernd

    2013-01-01

    The physical placement of buildings is important when determining the potential for DH (district heating). Good locations for DH are mainly determined by having both a large heat demand within a certain area and having access to local heat resources. In recent years, the locations of buildings in...... Denmark have been mapped in a heat atlas which includes all buildings and their heat demands. This article focuses on developing a method for assessing the costs associated with supplying these buildings with DH. The analysis is based on the existing DH areas in Denmark. By finding the heat production...... feasible to expand DH in many areas, but others would require reductions in production costs and distribution losses in order for DH expansions to be economically feasible. The analysis also shows the potential boundaries for DH expansion by including transmission and distribution costs. These boundaries...

  17. Real Time Engineering Analysis Based on a Generative Component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses the...... geometry, material properties and fixed point characteristics to calculate the dimensions and subsequent feasibility of any architectural design. The proposed conceptual design tool provides the possibility for the architect to work with both the aesthetic as well as the structural aspects of architecture...... without jumping from aesthetics to structural digital design tools and back, but to work with both simultaneously and in real time. The engineering level of knowledge is incorporated at a conceptual thinking level, i.e. qualitative information is used instead of quantitative information. An example...

  18. GPS baseline configuration design based on robustness analysis

    Science.gov (United States)

    Yetkin, M.; Berber, M.

    2012-11-01

    The robustness analysis results obtained from a Global Positioning System (GPS) network are dramatically influenced by the configuration of the observed baselines. The selection of optimal GPS baselines may allow for a cost effective survey campaign and a sufficiently robust network. Furthermore, using the approach described in this paper, the required number of sessions, the baselines to be observed, and the significance levels for statistical testing and robustness analysis can be determined even before the GPS campaign starts. In this study, we propose a robustness criterion for the optimal design of geodetic networks, and present a very simple and efficient algorithm based on this criterion for the selection of optimal GPS baselines. We also show the relationship between the number of sessions and the non-centrality parameter. Finally, a numerical example is given to verify the efficacy of the proposed approach.

  19. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA

  20. Lossless droplet transfer of droplet-based microfluidic analysis

    Science.gov (United States)

    Kelly, Ryan T; Tang, Keqi; Page, Jason S; Smith, Richard D

    2011-11-22

    A transfer structure for droplet-based microfluidic analysis is characterized by a first conduit containing a first stream having at least one immiscible droplet of aqueous material and a second conduit containing a second stream comprising an aqueous fluid. The interface between the first conduit and the second conduit can define a plurality of apertures, wherein the apertures are sized to prevent exchange of the first and second streams between conduits while allowing lossless transfer of droplets from the first conduit to the second conduit through contact between the first and second streams.

  1. Machine Learning for Vision-Based Motion Analysis

    CERN Document Server

    Wang, Liang; Cheng, Li; Pietikainen, Matti

    2011-01-01

    Techniques of vision-based motion analysis aim to detect, track, identify, and generally understand the behavior of objects in image sequences. With the growth of video data in a wide range of applications from visual surveillance to human-machine interfaces, the ability to automatically analyze and understand object motions from video footage is of increasing importance. Among the latest developments in this field is the application of statistical machine learning algorithms for object tracking, activity modeling, and recognition. Developed from expert contributions to the first and second In

  2. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    Science.gov (United States)

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés and, Luis G.; García Beltrán, Carlos Daniel

    2013-01-01

    This article proposes a virtual sensor for piecewise linear systems based on observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents a detector of the active mode when the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their output to be estimated. A methodology to test the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the obtained results. PMID:23447007

  3. Workforce Planning of Navigation Software Project Based on Competence Analysis

    Directory of Open Access Journals (Sweden)

    Shangfei Xie

    2011-03-01

    Full Text Available This paper introduces a quantitative research method for the personnel configuration of a software project by studying the effects of the overall competence of the developers in a vehicle navigation software project on factors such as project quality. The study shows that the overall competence of the developers is related to the after-submission defect density, the productivity and the average delay of software Version 0.99. Further, a quantitative formula for competence is derived on the basis of statistics; meanwhile, according to the research results, an integer programming method for configuring the navigation software project personnel based on competence analysis is presented.

  4. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by 4 force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which acquires the slight body movements of the patient. The data is then transferred to the computer in real time and data analysis is conducted. The article explains the principle of operation as well as the algorithm for the measurement uncertainty of the COP (Centre of Pressure) coordinates (x, y).
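
    A sketch of the usual centre-of-pressure computation for a plate supported by four force sensors is given below: the COP is the force-weighted average of the sensor positions. The plate geometry and readings are hypothetical, and the uncertainty propagation discussed in the article is not included.

```python
import numpy as np

def centre_of_pressure(forces, sensor_xy):
    """COP estimate for a four-sensor plate: the average of the sensor
    positions weighted by the measured vertical forces.

    forces    : (4,) vertical force readings [N]
    sensor_xy : (4, 2) sensor coordinates on the plate [m]
    """
    f = np.asarray(forces, dtype=float)
    xy = np.asarray(sensor_xy, dtype=float)
    return (f[:, None] * xy).sum(axis=0) / f.sum()

# hypothetical 0.4 m x 0.4 m plate with sensors at the corners
corners = [(-0.2, -0.2), (0.2, -0.2), (0.2, 0.2), (-0.2, 0.2)]
print(centre_of_pressure([180.0, 190.0, 210.0, 200.0], corners))
```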

  5. Rural Power System Load Forecast Based on Principal Component Analysis

    Institute of Scientific and Technical Information of China (English)

    Fang Jun-long; Xing Yu; Fu Yu; Xu Yang; Liu Guo-liang

    2015-01-01

    Power load forecasting accuracy is related to the development of the power system. Many factors influence the power load, but their effects are not the same, and which factors play a leading role cannot be determined empirically. Based on principal component analysis, the paper forecasts power load demand using a multivariate linear regression model. Taking the rural power grid load as an example, the paper analyzes the impacts of different factors on the power load, selects the forecasting methods appropriate for this area, forecasts its 2014-2018 electricity load, and provides a reliable basis for grid planning.
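
    A minimal scikit-learn sketch of the approach described (principal component reduction of the influencing factors followed by multivariate linear regression) is shown below; the factor data are randomly generated placeholders, not rural grid data.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# hypothetical yearly records: columns = influencing factors
# (population, irrigated area, income, temperature, ...), target = power load
rng = np.random.default_rng(4)
factors = rng.normal(size=(20, 6))
load = factors @ np.array([3.0, 1.5, 0.0, 2.0, 0.5, 0.0]) + rng.normal(scale=0.3, size=20)

model = make_pipeline(StandardScaler(),
                      PCA(n_components=3),        # keep the leading components only
                      LinearRegression())
model.fit(factors, load)
print(model.predict(factors[:3]))                  # in-sample check
```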

  6. Web-Based Instruction and Learning: Analysis and Needs Assessment

    Science.gov (United States)

    Grabowski, Barbara; McCarthy, Marianne; Koszalka, Tiffany

    1998-01-01

    An analysis and needs assessment was conducted to identify kindergarten through grade 14 (K-14) customer needs with regard to using the World Wide Web (WWW) for instruction and to identify obstacles K-14 teachers face in utilizing NASA Learning Technologies products in the classroom. The needs assessment was conducted as part of the Dryden Learning Technologies Project, which is a collaboration between Dryden Flight Research Center (DFRC), Edwards, California and The Pennsylvania State University (PSU), University Park, Pennsylvania. The overall project is a multiyear effort to conduct research in the development of teacher training and tools for Web-based science, mathematics and technology instruction and learning.

  7. A Developed Algorithm of Apriori Based on Association Analysis

    Institute of Scientific and Technical Information of China (English)

    LI Pingxiang; CHEN Jiangping; BIAN Fuling

    2004-01-01

    A method for mining frequent itemsets by evaluating the probability of their supports based on association analysis is presented. The method obtains the probability of every 1-itemset by scanning the database, then evaluates the probability of every 2-itemset, 3-itemset, ..., k-itemset from the frequent 1-itemsets and obtains all the candidate frequent itemsets. The database is then scanned again to verify the support of the candidate frequent itemsets. Last, the frequent itemsets are mined. The method greatly reduces the time spent scanning the database and shortens the computation time of the algorithm.
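
    The sketch below is one possible reading of the described method: 1-itemset probabilities are obtained in a single scan, candidate k-itemsets are screened by their estimated (independence-based) support, and the surviving candidates are verified against the database. It is a heuristic interpretation of the abstract, not the authors' exact algorithm.

```python
from itertools import combinations
from math import prod

def probable_frequent_itemsets(transactions, min_support):
    """Estimate k-itemset supports from 1-itemset supports under an
    independence assumption, keep the promising candidates, and verify
    each candidate's true support with a scan of the transaction database.
    Note: the independence estimate is a heuristic and can prune
    strongly correlated itemsets too early."""
    n = len(transactions)
    tsets = [set(t) for t in transactions]
    counts = {}
    for t in tsets:                                  # one scan for 1-itemsets
        for item in t:
            counts[item] = counts.get(item, 0) + 1
    p1 = {i: c / n for i, c in counts.items()}

    frequent = {frozenset([i]): s for i, s in p1.items() if s >= min_support}
    freq_items = sorted(i for i, s in p1.items() if s >= min_support)
    for k in range(2, len(freq_items) + 1):
        # keep candidates whose estimated (independence) support clears the threshold
        candidates = [c for c in combinations(freq_items, k)
                      if prod(p1[i] for i in c) >= min_support]
        if not candidates:
            break
        for c in candidates:                         # verification scan
            support = sum(set(c) <= t for t in tsets) / n
            if support >= min_support:
                frequent[frozenset(c)] = support
    return frequent

print(probable_frequent_itemsets(
    [["a", "b", "c"], ["a", "b"], ["a", "c"], ["b", "c"], ["a", "b", "c"]],
    min_support=0.6))
```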

  8. All-polymer microfluidic systems for droplet based sample analysis

    DEFF Research Database (Denmark)

    Poulsen, Carl Esben

    , droplet packing, imaging and amplification (heating). The project has been broken into sub-projects, in which several devices of simpler application have been developed. Most of these employ gravity for concentrating and packing droplets, which has been made possible by the use of large area chambers...... energy directors for ultrasonic welding of microfluidic systems have been presented: 1. Tongue-and-groove energy directors. 2. Laser ablated micropillar energy directors. • Fabrication: Annealing of polymer devices for use with hydrocarbon based multiphase systems. • Experimental design and data analysis...

  9. Differentiation-Based Analysis of Environmental Management and Corporate Performance

    Institute of Scientific and Technical Information of China (English)

    SHAN Dong-ming; MU Xin

    2007-01-01

    By building a duopoly model based on product differentiation, the performance of both the clean firm and the dirty firm is studied under the assumptions that consumers have different preferences for the product's environmental attributes, and that the product cost increases with the environmental attribute. The analysis results show that, both in the case with no environmental regulation and in the case with a tariff levied on the dirty product, the clean firm always earns more profit. In addition, the stricter the regulation is, the more profit the clean firm obtains. This verifies that, from the viewpoint of product differentiation, a firm can improve its corporate competitiveness through environmental management.

  10. Selecting supplier combination based on fuzzy multicriteria analysis

    Science.gov (United States)

    Han, Zhi-Qiu; Luo, Xin-Xing; Chen, Xiao-Hong; Yang, Wu-E.

    2015-07-01

    Existing multicriteria analysis (MCA) methods are probably ineffective in selecting a supplier combination. Thus, an MCA-based fuzzy 0-1 programming method is introduced. The programming relates to a simple MCA matrix that is used to select a single supplier. By solving the programming, the most feasible combination of suppliers is selected. Importantly, this result differs from selecting suppliers one by one according to a single-selection order, which is used to rank sole suppliers in existing MCA methods. An example highlights such difference and illustrates the proposed method.

  11. Dilution-of-Precision-Based Lunar Surface Navigation System Analysis Utilizing Earth-Based Assets

    Science.gov (United States)

    Welch, Bryan W.; Connolly, Joseph W.; Sands, Obed S.

    2007-01-01

    The NASA Vision for Space Exploration is focused on the return of astronauts to the Moon. Although navigation systems have already been proven in the Apollo missions to the Moon, the current exploration campaign will involve more extensive and extended missions requiring new concepts for lunar navigation. In contrast to Apollo missions, which were limited to the near-side equatorial region of the Moon, those under the Exploration Systems Initiative will require navigation on the Moon's limb and far side. These regions are known to have poor Earth visibility, but unknown is the extent to which a navigation system comprised solely of Earth-based tracking stations will provide adequate navigation solutions in these areas. This report presents a dilution-of-precision (DoP)-based analysis of the performance of a network of Earth-based assets. This analysis extends a previous analysis of a lunar network (LN) of navigation satellites by providing an assessment of the capability associated with a variety of assumptions. These assumptions pertain to the minimum provider elevation angle, nadir and zenith beam widths, and a total single failure in one of the Earth-based assets. The assessment is accomplished by making appropriately formed estimates of DoP. Different adaptations of DoP, such as geometrical DoP and positional DoP (GDoP and PDoP), are associated with a different set of assumptions regarding augmentations to the navigation receiver or transceiver.
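
    The DoP quantities themselves can be sketched as follows: a geometry matrix is built from the unit line-of-sight vectors to the visible Earth-based assets (plus a clock column), and GDoP/PDoP follow from the trace of the inverse normal matrix. The example geometry is hypothetical and deliberately clustered to mimic the poor directional diversity of Earth-only tracking.

```python
import numpy as np

def dilution_of_precision(los_unit_vectors):
    """GDoP and PDoP from the unit line-of-sight vectors pointing from the
    lunar user to each visible Earth-based tracking asset.

    los_unit_vectors : (n, 3) array, one unit vector per visible asset (n >= 4)
    """
    u = np.asarray(los_unit_vectors, dtype=float)
    H = np.hstack([u, np.ones((u.shape[0], 1))])   # position + clock columns
    Q = np.linalg.inv(H.T @ H)                     # geometry covariance factor
    gdop = np.sqrt(np.trace(Q))
    pdop = np.sqrt(np.trace(Q[:3, :3]))
    return gdop, pdop

# hypothetical geometry: four assets clustered in roughly the same direction
# (as seen from the lunar limb or far side, Earth-based assets have poor diversity)
los = np.array([[0.99, 0.10, 0.10],
                [0.99, -0.10, 0.10],
                [0.99, 0.10, -0.10],
                [0.98, -0.15, -0.10]])
los /= np.linalg.norm(los, axis=1, keepdims=True)
print(dilution_of_precision(los))
```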

  12. Linear feature selection in texture analysis - A PLS based method

    DEFF Research Database (Denmark)

    Marques, Joselene; Igel, Christian; Lillholm, Martin;

    2013-01-01

    We present a texture analysis methodology that combined uncommitted machine-learning techniques and partial least square (PLS) in a fully automatic framework. Our approach introduces a robust PLS-based dimensionality reduction (DR) step to specifically address outliers and high-dimensional feature...... limited number of samples, the data were evaluated using Monte Carlo cross validation (CV). The developed DR method demonstrated consistency in selecting a relatively homogeneous set of features across the CV iterations. Per each CV group, a median of 19 % of the original features was selected and...... considering all CV groups, the methods selected 36 % of the original features available. The diagnosis evaluation reached a generalization area-under-the-ROC curve of 0.92, which was higher than established cartilage-based markers known to relate to OA diagnosis....
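
    A minimal sketch of PLS used as a supervised dimensionality reduction step before a classifier is shown below (scikit-learn's PLSRegression scores stand in for the paper's PLS-based DR); the texture features and labels are synthetic placeholders, and the Monte Carlo cross-validation of the study is omitted.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.linear_model import LogisticRegression

# hypothetical texture features (rows = images, columns = texture measures)
rng = np.random.default_rng(5)
X = rng.normal(size=(120, 300))
y = (X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=120) > 0).astype(int)

pls = PLSRegression(n_components=10)
pls.fit(X, y)
scores = pls.transform(X)                    # supervised low-dimensional scores
clf = LogisticRegression(max_iter=1000).fit(scores, y)
print(clf.score(scores, y))                  # in-sample sanity check only
```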

  13. An effective approximation for variance-based global sensitivity analysis

    International Nuclear Information System (INIS)

    The paper presents a fairly efficient approximation for the computation of variance-based sensitivity measures associated with a general, n-dimensional function of random variables. The proposed approach is based on a multiplicative version of the dimensional reduction method (M-DRM), in which a given complex function is approximated by a product of low dimensional functions. Together with the Gaussian quadrature, the use of M-DRM significantly reduces the computation effort associated with global sensitivity analysis. An important and practical benefit of the M-DRM is the algebraic simplicity and closed-form nature of sensitivity coefficient formulas. Several examples are presented to show that the M-DRM method is as accurate as results obtained from simulations and other approximations reported in the literature

  14. Single base pair mutation analysis by PNA directed PCR clamping

    DEFF Research Database (Denmark)

    Ørum, H.; Nielsen, P.E.; Egholm, M.; Berg, R.H.; Buchardt, O.; Stanley, C.

    1993-01-01

    A novel method that allows direct analysis of single base mutation by the polymerase chain reaction (PCR) is described. The method utilizes the finding that PNAs (peptide nucleic acids) recognize and bind to their complementary nucleic acid sequences with higher thermal stability and specificity...... than the corresponding deoxyribooligonucleotides and that they cannot function as primers for DNA polymerases. We show that a PNA/DNA complex can effectively block the formation of a PCR product when the PNA is targeted against one of the PCR primer sites. Furthermore, we demonstrate that this blockage...... allows selective amplification/suppression of target sequences that differ by only one base pair. Finally we show that PNAs can be designed in such a way that blockage can be accomplished when the PNA target sequence is located between the PCR primers....

  15. Supermarket Analysis Based On Product Discount and Statistics

    Directory of Open Access Journals (Sweden)

    Komal Kumawat

    2014-03-01

    Full Text Available E-commerce has been growing rapidly. Its domain can provide all the right ingredients for successful data mining, and it is a significant domain of data mining. E-commerce refers to the buying and selling of products or services over electronic systems such as the internet. Various e-commerce systems offer discounts on products and allow users to buy products online. The basic idea used here is to predict product sales based on the discount applied to the product. Our analysis concentrates on how customers behave when a discount is offered to them. We have developed a model which captures customer behaviour when a discount is applied to a product. This paper elaborates on how techniques such as sessions and clickstreams are used to collect user data online based on the discount applied to the product, and how statistics are applied to the data set to observe the variation in the data.

  16. A New Customer Segmentation Framework Based on Biclustering Analysis

    Directory of Open Access Journals (Sweden)

    Xiaohui Hu

    2014-06-01

    Full Text Available The paper presents a novel approach for customer segmentation, which is the basic issue for effective CRM (Customer Relationship Management). Firstly, chi-square statistical analysis is applied to choose the set of attributes, and the K-means algorithm is employed to quantize the value of each attribute. Then the density-based DBSCAN algorithm is introduced to classify the customers into three groups (the first, the second and the third class). Finally, biclustering based on an improved Apriori algorithm is used in the three groups to obtain more detailed information. Experimental results on the dataset of an airline company show that the biclustering can segment the customers more accurately and meticulously. Compared with the traditional customer segmentation method, the described framework is more efficient on this dataset.

  17. BLAT-Based Comparative Analysis for Transposable Elements: BLATCAT

    Directory of Open Access Journals (Sweden)

    Sangbum Lee

    2014-01-01

    Full Text Available The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program can compare specific regions of six representative primate genome sequences (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) on the basis of BLAT and simultaneously carry out RepeatMasker and/or Censor functions, which are widely used Windows-based web-server functions to detect TEs. All results can be stored as a HTML file for manual inspection of a specific locus. BLATCAT will be very convenient and efficient for comparative analyses of TEs in various primate genomes.

  18. BLAT-based comparative analysis for transposable elements: BLATCAT.

    Science.gov (United States)

    Lee, Sangbum; Oh, Sumin; Kang, Keunsoo; Han, Kyudong

    2014-01-01

    The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program can compare specific regions of six representative primate genome sequences (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) on the basis of BLAT and simultaneously carry out RepeatMasker and/or Censor functions, which are widely used Windows-based web-server functions to detect TEs. All results can be stored as a HTML file for manual inspection of a specific locus. BLATCAT will be very convenient and efficient for comparative analyses of TEs in various primate genomes. PMID:24959585

  19. Fuzzy-Set Based Sentiment Analysis of Big Social Data

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    data example from Facebook. Third, we briefly present and discuss the Social Data Analytics Tool (SODATO) that realizes the conceptual model in software and provisions social data analysis based on the conceptual and formal models. Fourth, we use SODATO to fetch social data from the facebook wall of a...... global brand, H&M and conduct a sentiment classification of the posts and comments. Fifth, we analyse the sentiment classifications by constructing crisp as well as the fuzzy sets of the artefacts (posts, comments, likes, and shares). We document and discuss the longitudinal sentiment profiles of...... artefacts and actors on the facebook page. Sixth and last, we discuss the analytical method and conclude with a discussion of the benefits of set theoretical approaches based on the social philosophical approach of associational sociology....

  20. Fuzzy-Set Based Sentiment Analysis of Big Social Data

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    from Facebook. Third, we briefly present and discuss the Social Data Analytics Tool (SODATO) that realizes the conceptual model in software and provisions social data analysis based on the conceptual and formal models. Fourth, we use SODATO to fetch social data from the facebook wall of a global brand......, H&M and conduct a sentiment classification of the posts and comments. Fifth, we analyse the sentiment classifications by constructing crisp as well as the fuzzy sets of the artefacts (posts, comments, likes, and shares). We document and discuss the longitudinal sentiment profiles of artefacts and...... actors on the facebook page. Sixth and last, we discuss the analytical method and conclude with a discussion of the benefits of set theoretical approaches based on the social philosophical approach of associational sociology....

  1. Sensitivity analysis of GSI based mechanical characterization of rock mass

    CERN Document Server

    Ván, P

    2012-01-01

    Rock mechanical and rock engineering designs and calculations are frequently based on the Geological Strength Index (GSI) method, because it is the only system that provides a complete set of mechanical properties for design purposes. Both the failure criteria and the deformation moduli of the rock mass can be calculated with GSI-based equations, which also incorporate a disturbance factor. The aim of this paper is a sensitivity analysis of the GSI- and disturbance-factor-dependent equations that characterize the mechanical properties of rock masses. A survey of the GSI system itself is not our purpose. The results show that the rock mass strength calculated by the Hoek-Brown failure criterion and both the Hoek-Diederichs and modified Hoek-Diederichs deformation moduli are highly sensitive to changes in both the GSI and the D factor, hence their exact determination is important for rock engineering design.

  2. Least-squares deconvolution based analysis of stellar spectra

    CERN Document Server

    Van Reeth, T; Tsymbal, V

    2013-01-01

    In recent years, astronomical photometry has been revolutionised by space missions such as MOST, CoRoT and Kepler. However, despite this progress, high-quality spectroscopy is still required as well. Unfortunately, high-resolution spectra can only be obtained using ground-based telescopes, and since many interesting targets are rather faint, the spectra often have a relatively low S/N. Consequently, we have developed an algorithm based on the least-squares deconvolution profile, which allows an observed spectrum to be reconstructed with a higher S/N. We have successfully tested the method using both synthetic and observed data, and in combination with several common spectroscopic applications, such as the determination of atmospheric parameter values, frequency analysis, and mode identification of stellar pulsations.

  3. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct-viewing survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained. The player with the strongest desire initiates the move, and the overall state transition matrix of the information system can then be obtained. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, so the occurrence probability obtained for each feasible situation helps the players quantitatively judge the probability of reaching the situations they pursue in the conflict. Compared with existing methods, which are limited to after-the-fact explanation of a system's survivable situation, the proposed model is well suited to quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model can be effectively applied to quantitative analysis of survivability. Moreover, it has good application prospects in practice.

  4. Comparative Analysis of a Principal Component Analysis-Based and an Artificial Neural Network-Based Method for Baseline Removal.

    Science.gov (United States)

    Carvajal, Roberto C; Arias, Luis E; Garces, Hugo O; Sbarbaro, Daniel G

    2016-04-01

    This work presents a non-parametric method based on principal component analysis (PCA) and a parametric one based on artificial neural networks (ANN) to remove continuous baseline features from spectra. The non-parametric method estimates the baseline from a set of sampled basis vectors obtained by applying PCA to a previously composed learning matrix of continuous spectra. The parametric method, however, uses an ANN to filter out the baseline; previous studies have demonstrated that this method is one of the most effective for baseline removal. The evaluation of both methods was carried out using a synthetic database designed for benchmarking baseline removal algorithms, containing 100 synthetic composed spectra at different signal-to-baseline ratios (SBR), signal-to-noise ratios (SNR), and baseline slopes. In addition, to demonstrate the utility of the proposed methods and to compare them in a real application, a spectral data set measured from a flame radiation process was used. Several performance metrics such as the correlation coefficient, chi-square value, and goodness-of-fit coefficient were calculated to quantify and compare both algorithms. Results demonstrate that the PCA-based method outperforms the one based on ANN both in terms of performance and simplicity. PMID:26917856
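
    As a rough illustration of the non-parametric idea (not the authors' implementation), the following sketch learns a baseline basis by PCA from a hypothetical matrix of baseline-only training spectra and subtracts the projection of an observed spectrum onto that basis; all data and names are invented for illustration.

```python
import numpy as np

def learn_baseline_basis(baseline_spectra, n_components=3):
    """Learn a PCA basis from a (n_samples, n_wavelengths) matrix of
    baseline-only training spectra (hypothetical learning matrix)."""
    mean = baseline_spectra.mean(axis=0)
    centered = baseline_spectra - mean
    # SVD of the centered matrix gives the principal components as rows of vt
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return mean, vt[:n_components]

def remove_baseline(spectrum, mean, basis):
    """Estimate the baseline as the projection of the spectrum onto the
    PCA basis and subtract it (non-parametric correction)."""
    coeffs = basis @ (spectrum - mean)
    baseline = mean + basis.T @ coeffs
    return spectrum - baseline, baseline

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.linspace(0.0, 1.0, 200)
    # Synthetic sloped baselines used as the learning matrix
    train = np.array([a + b * x for a, b in rng.uniform(0.5, 2.0, size=(50, 2))])
    mean, basis = learn_baseline_basis(train)
    # Observed spectrum: a baseline plus one Gaussian emission peak
    observed = 1.0 + 1.5 * x + np.exp(-((x - 0.5) ** 2) / 0.002)
    corrected, baseline = remove_baseline(observed, mean, basis)
    print("residual baseline away from the peak:", np.abs(corrected[:20]).max())
```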

  5. Electromagnetic and motion-coupled analysis for switched reluctance motor based on reluctance network analysis

    International Nuclear Information System (INIS)

    This paper presents an electromagnetic and motion-coupled analysis method for switched reluctance (SR) motors based on a reluctance network analysis. First, we construct a reluctance network model of a 6/4-pole SR motor considering both magnetic saturation at the pole tips and leakage fluxes from the stator poles. Next, we combine the reluctance network model with a motor drive circuit and motion calculation circuits. Using the combined model, we can accurately and readily calculate dynamic characteristics of the 6/4-pole SR motor such as exciting voltages, phase currents, and torque. We demonstrate the validity of the proposed method by comparison with experimental values.

  6. Design and analysis of ripple-based controls based on the discrete modeling and Floquet theory

    OpenAIRE

    Cortés, Jorge,; Svikovic, Vladimir; Alou Cervera, Pedro; Oliver Ramírez, Jesús Angel; Cobos Márquez, José Antonio

    2013-01-01

    Ripple-based controls can strongly reduce the required output capacitance in PowerSoC converters thanks to their very fast dynamic response. Unfortunately, these controls are prone to sub-harmonic oscillations, and several parameters affect the stability of these systems. This paper derives and validates a simulation-based modeling and stability analysis of a closed-loop V2Ic control applied to a 5 MHz buck converter, using discrete modeling and Floquet theory to predict stability. This allows the ...
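
    The Floquet-theory step in this kind of analysis can be illustrated in isolation: build the small-signal monodromy matrix of the switching converter over one period and check that all Floquet multipliers (its eigenvalues) lie inside the unit circle. The sketch below uses an idealized buck LC stage with assumed component values and omits the V2Ic modulator dynamics, so it is a simplified stand-in rather than the authors' model.

```python
import numpy as np
from scipy.linalg import expm

def monodromy(A_on, A_off, d, T):
    """Monodromy matrix of a converter switching between two linear subintervals:
    on-time d*T governed by A_on, off-time (1-d)*T governed by A_off.
    (Illustrative only; a real V2Ic analysis includes the modulator dynamics.)"""
    return expm(A_off * (1.0 - d) * T) @ expm(A_on * d * T)

def is_stable(A_on, A_off, d, T):
    """Stable iff every Floquet multiplier lies strictly inside the unit circle."""
    multipliers = np.linalg.eigvals(monodromy(A_on, A_off, d, T))
    return bool(np.all(np.abs(multipliers) < 1.0)), multipliers

if __name__ == "__main__":
    # Hypothetical buck-converter LC stage (values invented, not from the paper)
    L, C, R = 1e-6, 10e-6, 1.0
    A = np.array([[0.0, -1.0 / L],
                  [1.0 / C, -1.0 / (R * C)]])     # states: inductor current, cap voltage
    stable, mults = is_stable(A, A, d=0.5, T=1.0 / 5e6)
    print("stable:", stable, "|multipliers|:", np.abs(mults))
```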

  7. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-01-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions. PMID:26213928

  8. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Full Text Available Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  9. Service quality measurement. A new approach based on Conjoint Analysis

    Directory of Open Access Journals (Sweden)

    Valerio Gatta

    2013-03-01

    Full Text Available This article is concerned with the measurement of service quality. The main objective is to suggest an alternative criterion for service quality definition and measurement. After a brief description of the most traditional techniques, and with the intent of overcoming some of their critical shortcomings, I focus my attention on choice-based conjoint analysis, a particular stated preferences method that estimates the structure of consumers' preferences given their choices between alternative service options. Discrete choice models and the traditional compensatory utility maximization framework are extended by the inclusion of attribute cutoffs in the decision problem formulation. The major theoretical aspects of the described approach are examined and discussed, showing that it is able to identify the relative importance of the relevant attributes, to calculate elasticities and monetary valuations, and to determine a service quality index. Simulations then enable the identification of potential service quality levels, so that marketing managers have valuable information to plan their best business strategies. We present findings from an empirical study in the public transport sector designed to gain insights into the use of choice-based conjoint analysis.
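
    The underlying discrete choice machinery can be sketched with a plain multinomial logit model: given part-worth coefficients, choice probabilities follow from a softmax over alternative utilities, and monetary valuations follow from ratios of coefficients. The attribute names and coefficient values below are invented for illustration, and the cutoff extension discussed in the article is not included.

```python
import numpy as np

def choice_probabilities(X, beta):
    """Multinomial logit choice probabilities for one choice set.
    X: (n_alternatives, n_attributes) attribute levels; beta: part-worth coefficients."""
    v = X @ beta                    # deterministic utilities
    e = np.exp(v - v.max())         # numerically stable softmax
    return e / e.sum()

if __name__ == "__main__":
    # Hypothetical transit-service attributes: fare (EUR), headway (min), cleanliness (0-1)
    alternatives = np.array([
        [1.5, 10.0, 0.8],
        [1.2, 20.0, 0.6],
        [2.0,  5.0, 0.9],
    ])
    beta = np.array([-0.8, -0.05, 1.2])   # assumed estimated coefficients, not real data
    p = choice_probabilities(alternatives, beta)
    print("predicted choice shares:", np.round(p, 3))
    # Monetary valuation: marginal rate of substitution between headway and fare,
    # i.e. euros a respondent would pay per minute of headway reduction
    print("value of one minute saved (EUR):", beta[1] / beta[0])
```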

  10. Reachability analysis based transient stability design in power systems

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Licheng; Kumar, Ratnesh; Elia, Nicola [Department of Electrical and Computer Engineering, Iowa State University, Ames, IA 50010 (United States)

    2010-09-15

    This paper provides a systematic framework to determine switching control strategies that stabilize the system after a fault, if such stabilization is possible. A method to compute the stability region of a stable equilibrium point for power system stability analysis is proposed, and the validity of discrete controls in transient stability design is studied. First, a Hamilton-Jacobi-Isaacs (HJI) partial differential equation (PDE) is constructed to describe the set of backward reachable states as a function of time, starting from a target set of states. The backward reachable set of a stable equilibrium point is computed by numerically solving the HJI PDE backward in time using level set methods. This backward reachable set yields the stability region of the equilibrium point. Based on such reachability analysis, a transient stability design method is presented. The validity of a discrete control is determined by examining the stability region of the power system with the said control on. If a post-fault initial state is in the stability region of the system with a control on, the control is valid. A control strategy is provided based on the validity of controls. Finally, the method is illustrated by applying it to a single machine infinite bus system with the compensation of shunt and series capacitors. (author)

  11. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Science.gov (United States)

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach. PMID:16466842
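
    A minimal Metropolis sampler conveys the calibration idea: propose new parameter values, score them against priors and data, and accept or reject. The sketch below calibrates a hypothetical one-compartment toxicokinetic model with invented priors and synthetic data; it is far simpler than a full PBTK/TD model of bromate.

```python
import numpy as np

def one_compartment(dose, k_elim, v_d, times):
    """Hypothetical one-compartment bolus model: C(t) = (dose / Vd) * exp(-k * t)."""
    return (dose / v_d) * np.exp(-k_elim * times)

def log_posterior(theta, times, observed, dose):
    k_elim, v_d, sigma = theta
    if k_elim <= 0 or v_d <= 0 or sigma <= 0:
        return -np.inf
    # Weakly informative log-normal priors (assumed for illustration, not from the paper)
    lp = -0.5 * (np.log(k_elim) - np.log(0.1)) ** 2
    lp += -0.5 * (np.log(v_d) - np.log(10.0)) ** 2
    pred = one_compartment(dose, k_elim, v_d, times)
    lp += np.sum(-0.5 * ((observed - pred) / sigma) ** 2 - np.log(sigma))
    return lp

def metropolis(times, observed, dose, n_iter=20000):
    rng = np.random.default_rng(1)
    theta = np.array([0.1, 10.0, 1.0])      # initial guess: k_elim, Vd, sigma
    scale = np.array([0.02, 0.5, 0.05])     # random-walk step sizes
    lp = log_posterior(theta, times, observed, dose)
    samples = []
    for _ in range(n_iter):
        proposal = theta + scale * rng.normal(size=3)
        lp_new = log_posterior(proposal, times, observed, dose)
        if np.log(rng.uniform()) < lp_new - lp:     # Metropolis accept/reject
            theta, lp = proposal, lp_new
        samples.append(theta.copy())
    return np.array(samples)

if __name__ == "__main__":
    times = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0])
    observed = one_compartment(100.0, 0.2, 8.0, times) \
        + np.random.default_rng(0).normal(0.0, 0.3, times.size)
    post = metropolis(times, observed, 100.0)[5000:]    # discard burn-in
    print("posterior medians (k_elim, Vd):", np.median(post[:, 0]), np.median(post[:, 1]))
```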

  12. Analysis of effect factors-based stochastic network planning model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Looking at all the indeterminate factors as a whole and regarding activity durations as independent random variables, traditional stochastic network planning models ignore the inevitable relationships and dependence among activity durations when more than one activity may be affected by the same indeterminate factors. On the basis of an analysis of the indeterminate factors affecting durations, the effect factors-based stochastic network planning (EFBSNP) model is proposed, which emphasizes the effects on the project period not only of logistic and organizational relationships, but also of the dependence among activity durations caused by shared indeterminate factors. Through indeterminate factor analysis the model extracts and quantitatively describes the indeterminate effect factors, and then accounts for their effect on the schedule by using the Monte Carlo simulation technique. The method is flexible enough to deal with effect factors and is consistent with practice. Software has been developed in Visual Studio .NET to simplify the model-based calculation. Finally, a case study is included to demonstrate the applicability of the proposed model, and a comparison is made showing some advantages over existing models.
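
    The central point, that a shared indeterminate factor makes activity durations dependent and therefore changes the project-duration distribution, can be illustrated with a small Monte Carlo sketch. The three-activity network, the "weather" factor, and all numbers below are invented; this is not the EFBSNP software.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_project(n_sims=100_000):
    """Toy network: A and B run in parallel, C follows both, so the project
    duration is max(A, B) + C. A shared uncertain 'weather' factor lengthens
    both A and B, making their durations dependent; C has only its own noise."""
    weather = rng.lognormal(mean=0.0, sigma=0.15, size=n_sims)   # shared effect factor
    dur_a = 10.0 * weather * rng.lognormal(0.0, 0.05, n_sims)
    dur_b = 12.0 * weather * rng.lognormal(0.0, 0.05, n_sims)
    dur_c = 7.0 * rng.lognormal(0.0, 0.10, n_sims)
    return np.maximum(dur_a, dur_b) + dur_c

if __name__ == "__main__":
    total = simulate_project()
    print("mean project duration:", round(total.mean(), 2))
    print("P90 project duration :", round(np.percentile(total, 90), 2))
```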

  13. Data Clustering Analysis Based on Wavelet Feature Extraction

    Institute of Scientific and Technical Information of China (English)

    QIANYuntao; TANGYuanyan

    2003-01-01

    A novel wavelet-based data clustering method is presented in this paper, which includes wavelet feature extraction and a cluster growing algorithm. The wavelet transform can provide rich and diversified information for representing the global and local inherent structures of a dataset; therefore, it is a very powerful tool for clustering feature extraction. As an unsupervised classification, the target of clustering analysis depends on the specific clustering criteria. Several criteria that should be considered for a general-purpose clustering algorithm are proposed, and the cluster growing algorithm is constructed to connect the clustering criteria with the wavelet features. Compared with other popular clustering methods, our clustering approach provides multi-resolution clustering results, needs few prior parameters, correctly deals with irregularly shaped clusters, and is insensitive to noise and outliers. As this wavelet-based clustering method is aimed at solving two-dimensional data clustering problems, for high-dimensional datasets the self-organizing map and U-matrix method are applied to transform them into a two-dimensional Euclidean space, so that high-dimensional data clustering analysis can also be carried out. Results on some simulated data and standard test data are reported to illustrate the power of our method.

  14. Emergy analysis of cassava-based fuel ethanol in China

    International Nuclear Information System (INIS)

    Emergy analysis considers both energy quality and energy used in the past, and compensates for the inability of money to value non-market inputs in an objective manner. Its common unit allows all resources to be compared on a fair basis. As feedstock for fuel ethanol, cassava has some advantages over other feedstocks. The production system of cassava-based fuel ethanol (CFE) was evaluated by emergy analysis. The emergy indices for the system of cassava-based fuel ethanol (CFE) are as follows: transformity is 1.10 E + 5 sej/J, EYR is 1.07, ELR is 2.55, RER is 0.28, and ESI is 0.42. Compared with the emergy indices of wheat ethanol and corn ethanol, CFE is the most sustainable. CFE is a good alternative to substitute for oil in China. Non-renewable purchased emergy accounts for 71.15% of the whole input emergy. The dependence on non-renewable energy increases environmental degradation, making the system less sustainable relative to systems more dependent on renewable energies. For sustainable development, it is vital to reduce the consumption of non-renewable energy in the production of CFE. (author)
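
    The quoted indices follow from the usual emergy definitions (exact conventions vary between studies): with renewable, non-renewable and purchased emergy inputs, EYR is the yield per unit of purchased emergy, ELR is the non-renewable to renewable loading, and ESI = EYR/ELR (here 1.07/2.55 ≈ 0.42). A sketch with made-up inputs, using one common set of definitions:

```python
def emergy_indices(R, N, F_r, F_n):
    """Common emergy indices (all inputs in sej): R = local renewable, N = local
    non-renewable, F_r / F_n = purchased renewable / non-renewable emergy.
    Definitions vary slightly across studies; these are frequently used forms."""
    Y = R + N + F_r + F_n                 # total emergy yield
    eyr = Y / (F_r + F_n)                 # emergy yield ratio
    elr = (N + F_n) / (R + F_r)           # environmental loading ratio
    rer = (R + F_r) / Y                   # renewable emergy ratio
    esi = eyr / elr                       # emergy sustainability index
    return eyr, elr, rer, esi

if __name__ == "__main__":
    # Made-up emergy inputs chosen only to exercise the formulas
    eyr, elr, rer, esi = emergy_indices(R=2.8e19, N=0.2e19, F_r=0.0, F_n=7.0e19)
    print(f"EYR={eyr:.2f}  ELR={elr:.2f}  RER={rer:.2f}  ESI={esi:.2f}")
```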

  15. Emergy analysis of cassava-based fuel ethanol in China

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hui; Chen, Li; Yan, Zongcheng; Wang, Honglin [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou, Guangdong 510640 (China)

    2011-01-15

    Emergy analysis considers both energy quality and energy used in the past, and compensates for the inability of money to value non-market inputs in an objective manner. Its common unit allows all resources to be compared on a fair basis. As feedstock for fuel ethanol, cassava has some advantages over other feedstocks. The production system of cassava-based fuel ethanol (CFE) was evaluated by emergy analysis. The emergy indices for the system of cassava-based fuel ethanol (CFE) are as follows: transformity is 1.10 E + 5 sej/J, EYR is 1.07, ELR is 2.55, RER is 0.28, and ESI is 0.42. Compared with the emergy indices of wheat ethanol and corn ethanol, CFE is the most sustainable. CFE is a good alternative to substitute for oil in China. Non-renewable purchased emergy accounts for 71.15% of the whole input emergy. The dependence on non-renewable energy increases environmental degradation, making the system less sustainable relative to systems more dependent on renewable energies. For sustainable development, it is vital to reduce the consumption of non-renewable energy in the production of CFE. (author)

  16. An Efficient Soft Set-Based Approach for Conflict Analysis

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared with rough set theory. PMID:26928627

  17. An Efficient Soft Set-Based Approach for Conflict Analysis.

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in conflict data sets, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach reduces computational time by up to 3.9% compared with rough set theory. PMID:26928627

  18. A Web-Based Development Environment for Collaborative Data Analysis

    CERN Document Server

    Erdmann, M; Glaser, C; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steggemann, J; Urban, M; Winchen, T

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and a...

  19. A Web-Based Development Environment for Collaborative Data Analysis

    International Nuclear Information System (INIS)

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable

  20. Glyph-Based Video Visualization for Semen Analysis

    KAUST Repository

    Duffy, Brian

    2015-08-01

    © 2013 IEEE. The existing efforts in computer assisted semen analysis have been focused on high speed imaging and automated image analysis of sperm motility. This results in a large amount of data, and it is extremely challenging for both clinical scientists and researchers to interpret, compare and correlate the multidimensional and time-varying measurements captured from video data. In this work, we use glyphs to encode a collection of numerical measurements taken at a regular interval and to summarize spatio-temporal motion characteristics using static visual representations. The design of the glyphs addresses the needs for (a) encoding some 20 variables using separable visual channels, (b) supporting scientific observation of the interrelationships between different measurements and comparison between different sperm cells and their flagella, and (c) facilitating the learning of the encoding scheme by making use of appropriate visual abstractions and metaphors. As a case study, we focus this work on video visualization for computer-aided semen analysis, which has a broad impact on both biological sciences and medical healthcare. We demonstrate that glyph-based visualization can serve as a means of external memorization of video data as well as an overview of a large set of spatiotemporal measurements. It enables domain scientists to make scientific observation in a cost-effective manner by reducing the burden of viewing videos repeatedly, while providing them with a new visual representation for conveying semen statistics.

  1. Gis-Based Spatial Statistical Analysis of College Graduates Employment

    Science.gov (United States)

    Tang, R.

    2012-07-01

    It is urgently necessary to be aware of the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis for the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that the specialty graduates majored in has an important impact on the number employed and on the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  2. Experimental investigation of thermal neutron analysis based landmine detection technology

    International Nuclear Information System (INIS)

    Background: The prompt gamma-ray neutron activation analysis method is widely used in coal analysis and explosive detection; however, there have been few applications of neutron methods to landmine detection, especially in domestic research. Purpose: To verify the feasibility of the Thermal Neutron Analysis (TNA) method for landmine detection and to explore the characteristics of this technology. Methods: An experimental TNA landmine detection system was built based on a LaBr3(Ce) fast scintillator detector and a 252Cf isotope neutron source. The system comprises a thermal neutron transition system, a shield system, and a detector system. Results: On the basis of the TNA, a wide-energy-range calibration method, particularly for the high-energy region, was investigated, and the minimum detection time for a typical mine was defined. In this study, the 72-type anti-tank mine, a 500 g TNT sample and several interfering objects were tested in loess, red soil, magnetic soil and sand, respectively. Conclusions: The experimental results indicate that TNA is a reliable demining method, and it can be used to confirm the existence of Anti-Tank Mines (ATM) and large Anti-Personnel Mines (APM) in complicated conditions. (authors)

  3. Structural Optimization of Slender Robot Arm Based on Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zhong Luo

    2012-01-01

    Full Text Available An effective structural optimization method based on sensitivity analysis is proposed to optimize the variable section of a slender robot arm. The structural mechanism and operating principle of a polishing robot are introduced first, and its stiffness model is established. Then, a design sensitivity analysis method and a sequential linear programming (SLP) strategy are developed. At the beginning of the optimization, the design sensitivity analysis method is applied to select the sensitive design variables, which makes the optimized results more efficient and accurate. In addition, it can also be used to determine the step size, which improves convergence during the optimization process. The design sensitivities are calculated using the finite difference method. The search for the final optimal structure is performed using the SLP method. Simulation results show that the proposed structural optimization method is effective in enhancing the stiffness of the robot arm, whether the arm is subjected to a constant force or to variable forces.
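
    The finite-difference sensitivity step can be sketched generically: perturb each design variable in turn, re-evaluate the response, and rank the variables by the magnitude of the resulting derivative. The response function below is an invented stand-in expression, not the robot-arm finite element model.

```python
import numpy as np

def finite_difference_sensitivities(response, x, rel_step=1e-3):
    """Forward-difference sensitivities d(response)/dx_i for each design variable."""
    base = response(x)
    sens = np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy()
        h = rel_step * max(abs(x[i]), 1e-12)
        xp[i] += h
        sens[i] = (response(xp) - base) / h
    return sens

if __name__ == "__main__":
    # Stand-in response: tip deflection of a stepped cantilever that decreases
    # with each section's wall thickness (purely illustrative)
    def tip_deflection(t):
        return 1.0 / t[0] ** 3 + 0.5 / t[1] ** 3 + 0.1 / t[2] ** 3

    t0 = np.array([2.0, 3.0, 4.0])
    s = finite_difference_sensitivities(tip_deflection, t0)
    print("sensitivities:", s)
    print("most sensitive design variable:", int(np.argmax(np.abs(s))))
```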

  4. Kinematics Analysis Based on Screw Theory of a Humanoid Robot

    Institute of Scientific and Technical Information of China (English)

    MAN Cui-hua; FAN Xun; LI Cheng-rong; ZHAO Zhong-hui

    2007-01-01

    A humanoid robot is a complex dynamic system owing to its idiosyncrasies. This paper aims to provide a mathematical and theoretical foundation for the configuration design and kinematics analysis of a novel humanoid robot. It has a simplified configuration and is designed for entertainment purposes. The design methods, principles and mechanisms are discussed. According to the design goals of this research, there are ten degrees of freedom in the two bionic arms. Modularization, concurrent design and extension theory methods were adopted in the configuration study, and screw theory was introduced into the analysis of humanoid robot kinematics. Comparisons with other methods show that: 1) only two coordinate frames need to be established in the kinematics analysis of a humanoid robot based on screw theory; 2) the spatial manipulator Jacobian obtained by using the twist and product-of-exponentials formula is succinct and legible; 3) adopting screw theory to solve the humanoid robot arm kinematics problem can avoid singularities; 4) using screw theory can solve the problem of specification insufficiency.
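
    The screw-theoretic kinematics rests on the product-of-exponentials formula T(θ) = e^[S1]θ1 ... e^[Sn]θn M, which needs only a base frame and a tool frame. A minimal sketch for a hypothetical planar two-joint arm (link lengths assumed, not the ten-DOF humanoid of the paper):

```python
import numpy as np

def skew(w):
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def twist_exp(S, theta):
    """Matrix exponential of a unit revolute twist S = (w, v), ||w|| = 1."""
    w, v = S[:3], S[3:]
    W = skew(w)
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * W @ W          # Rodrigues
    G = np.eye(3) * theta + (1 - np.cos(theta)) * W + (theta - np.sin(theta)) * W @ W
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, G @ v
    return T

def forward_kinematics(screws, thetas, M):
    """Product-of-exponentials forward kinematics: T = exp(S1*t1)...exp(Sn*tn) M."""
    T = np.eye(4)
    for S, th in zip(screws, thetas):
        T = T @ twist_exp(np.asarray(S, float), th)
    return T @ M

if __name__ == "__main__":
    L1, L2 = 0.3, 0.25                            # assumed link lengths (m)
    # Two revolute joints about z in the space frame: S = (w, -w x q), q on the axis
    S1 = [0, 0, 1, 0, 0, 0]
    S2 = [0, 0, 1, 0, -L1, 0]
    M = np.eye(4)
    M[0, 3] = L1 + L2                             # home pose of the end effector
    T = forward_kinematics([S1, S2], [np.pi / 4, np.pi / 4], M)
    print("end-effector position:", np.round(T[:3, 3], 4))
```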

  5. Graph-Based Analysis of Nuclear Smuggling Data

    International Nuclear Information System (INIS)

    Much of the data that is collected and analyzed today is structural, consisting not only of entities but also of relationships between the entities. As a result, analysis applications rely upon automated structural data mining approaches to find patterns and concepts of interest. This ability to analyze structural data has become a particular challenge in many security-related domains. In these domains, focusing on the relationships between entities in the data is critical to detect important underlying patterns. In this study we apply structural data mining techniques to automate analysis of nuclear smuggling data. In particular, we choose to model the data as a graph and use graph-based relational learning to identify patterns and concepts of interest in the data. In this paper, we identify the analysis questions that are of importance to security analysts and describe the knowledge representation and data mining approach that we adopt for this challenge. We analyze the results using the Russian nuclear smuggling event database.

  6. A graph-based network-vulnerability analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-05-03

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs, or costs representing the level of effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
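
    The shortest-path idea mentioned at the end can be made concrete: if each arc carries an independent probability of success, maximizing the product of probabilities along a path is the same as minimizing the sum of -log(p), so a standard Dijkstra search returns the most likely attack path. The stage names and probabilities below are invented.

```python
import heapq
import math

def most_probable_path(graph, start, goal):
    """Dijkstra on -log(p) edge weights; returns (path, overall success probability).
    graph: dict node -> list of (next_node, success_probability)."""
    dist, prev = {start: 0.0}, {}
    heap = [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, math.inf):
            continue
        for v, p in graph.get(u, []):
            nd = d - math.log(p)
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    if goal not in dist:
        return None, 0.0
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[goal])

if __name__ == "__main__":
    # Hypothetical attack stages: each edge is (next stage, probability of success)
    attack_graph = {
        "outside":    [("dmz_web", 0.7), ("vpn_user", 0.3)],
        "dmz_web":    [("app_server", 0.5)],
        "vpn_user":   [("app_server", 0.8)],
        "app_server": [("db_admin", 0.4)],
    }
    path, prob = most_probable_path(attack_graph, "outside", "db_admin")
    print("most probable attack path:", " -> ".join(path), f"(p = {prob:.3f})")
```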

  7. A graph-based network-vulnerability analysis system

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, L.P.; Phillips, C. [Sandia National Labs., Albuquerque, NM (United States); Gaylor, T. [3M, Austin, TX (United States). Visual Systems Div.

    1998-01-01

    This report presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  8. A Web-Based Development Environment for Collaborative Data Analysis

    Science.gov (United States)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  9. Cost Risk Analysis Based on Perception of the Engineering Process

    Science.gov (United States)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering
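
    One common way to turn parametric estimates of technical factors into a cost risk curve is Monte Carlo sampling: draw each uncertain factor from an assumed distribution, evaluate the cost model, and read confidence levels off the empirical distribution. The cost model and triangular distributions below are invented for illustration and are not the LaRC method itself.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_cost(n=50_000):
    """Invented parametric cost model: cost = mass * rate * complexity + fixed."""
    mass = rng.triangular(800, 1000, 1400, n)        # kg (low, mode, high)
    rate = rng.triangular(40, 55, 90, n)             # k$ per kg
    complexity = rng.triangular(0.9, 1.0, 1.3, n)    # technology maturity factor
    fixed = 5000.0                                   # k$ of non-recurring cost
    return mass * rate * complexity + fixed

if __name__ == "__main__":
    cost = sample_cost()
    # The cost risk curve is the empirical CDF; print a few confidence levels
    for level in (0.2, 0.5, 0.8):
        print(f"{int(level * 100)}% confidence cost: {np.quantile(cost, level):,.0f} k$")
```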

  10. Technoeconomic analysis of a biomass based district heating system

    International Nuclear Information System (INIS)

    This paper discusses a proposed biomass-based district heating system to be built for the Pictou Landing First Nation Community in Nova Scotia. The community centre consists of 6 buildings and a connecting arcade. The methodology used to size and design heating, ventilating and air conditioning (HVAC) systems, as well as biomass district energy systems (DES), was discussed. Annual energy requirements and biomass fuel consumption predictions were presented, along with cost estimates. A comparative assessment of the system with a conventional oil-fired system was also conducted. It was suggested that the design and analysis methodology could be used for any similar application. The buildings were modelled and simulated using the Hourly Analysis Program (HAP), a detailed 2-in-1 software program which can be used both for HVAC system sizing and building energy consumption estimation. A techno-economic analysis was conducted to justify the viability of the biomass combustion system. Heating load calculations were performed assuming that the thermostat was set constantly at 22 degrees C. Community centre space heating loads due to individual envelope components for 3 different scenarios were summarized, as the design architecture for the buildings was not yet finalized. It was suggested that efforts should be made to ensure the air-tightness and insulation level of the interior arcade glass wall. A hydronic distribution system with baseboard space heating units was selected, comprising a woodchip boiler, hot water distribution system, convective heating units and control systems. The community has its own logging operation which will provide the wood fuel required by the proposed system. An outline of the annual allowable harvest covered by the Pictou Landing Forestry Management Plan was presented, with details of proposed wood-chippers for the creation of biomass. It was concluded that the woodchip combustion system is economically preferable to the

  11. Overview description of the base scenario derived from FEP analysis

    International Nuclear Information System (INIS)

    , subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. This report uses the conceptual models developed from the FEP analysis to present a description of the base scenario, in terms of the processes to be represented in detailed models. This report does not present an assessment of the base scenario, but rather seeks to provide a summary of those features, events and processes that should be represented, at an appropriate level of detail, within numerical models. The requirements for the development of appropriate models for representing the base scenario are described in an underlying report within the model development document suite. (author)

  12. Diagnostic markers of urothelial cancer based on DNA methylation analysis

    International Nuclear Information System (INIS)

    Early detection and risk assessment are crucial for treating urothelial cancer (UC), which is characterized by a high recurrence rate and necessitates frequent and invasive monitoring. We aimed to establish diagnostic markers for UC based on DNA methylation. In this multi-center study, three independent sample sets were prepared. First, DNA methylation levels at CpG loci were measured in the training sets (tumor samples from 91 UC patients, corresponding normal-appearing tissue from these patients, and 12 normal tissues from age-matched bladder cancer-free patients) using the Illumina GoldenGate methylation assay to identify differentially methylated loci. Next, these methylated loci were validated by quantitative DNA methylation analysis using pyrosequencing in another cohort of tissue samples (tissue validation set). Lastly, methylation of these markers was analyzed in independent urine samples (urine validation set). ROC analysis was performed to evaluate the diagnostic accuracy of the 12 selected markers. Of the 1303 CpG sites, 158 were hypermethylated and 356 were hypomethylated in tumor tissues compared to normal tissues. In the panel analysis, 12 loci showed remarkable alterations between tumor and normal samples, with 94.3% sensitivity and 97.8% specificity. Similarly, corresponding normal tissue could be distinguished from normal tissues with 76.0% sensitivity and 100% specificity. Furthermore, the diagnostic accuracy for UC of these markers determined in urine samples was high, with 100% sensitivity and 100% specificity. Based on these preliminary findings, diagnostic markers based on differential DNA methylation at specific loci can be useful for non-invasive and reliable detection of UC and epigenetic field defects.
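
    The ROC-based evaluation of a single marker can be sketched simply: given methylation scores for tumour and normal samples, compute sensitivity and specificity at a cutoff and the area under the ROC curve via the Mann-Whitney statistic. The score distributions below are synthetic, not the study's data.

```python
import numpy as np

def sens_spec(tumour, normal, cutoff):
    """Sensitivity/specificity of a marker at a given cutoff
    (higher methylation assumed to indicate tumour)."""
    sens = float(np.mean(np.asarray(tumour) >= cutoff))
    spec = float(np.mean(np.asarray(normal) < cutoff))
    return sens, spec

def auc(tumour, normal):
    """Area under the ROC curve via the Mann-Whitney statistic: the probability
    that a randomly chosen tumour sample scores higher than a normal one."""
    t, n = np.asarray(tumour)[:, None], np.asarray(normal)[None, :]
    return float((t > n).mean() + 0.5 * (t == n).mean())

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    tumour = rng.normal(0.70, 0.10, 40)   # hypothetical methylation fraction in tumours
    normal = rng.normal(0.30, 0.10, 40)   # hypothetical methylation fraction in normals
    print("AUC:", round(auc(tumour, normal), 3))
    print("sensitivity/specificity at cutoff 0.5:", sens_spec(tumour, normal, 0.5))
```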

  13. Seismic analysis of base-isolated liquid storage tanks

    Science.gov (United States)

    Shrimali, M. K.; Jangid, R. S.

    2004-08-01

    Three analytical studies of the seismic response of base-isolated, ground-supported cylindrical liquid storage tanks under recorded earthquake ground motion are presented. The continuous liquid mass of the tank is modelled as lumped masses referred to as the sloshing mass, impulsive mass and rigid mass. First, the seismic response of isolated tanks is obtained using the modal superposition technique and compared with the exact response to study the effects of non-classical damping. The comparison of results for different tank aspect ratios and bearing stiffness and damping values indicates that the effects of non-classical damping are insignificant, implying that the response of isolated liquid storage tanks can be accurately obtained by modal analysis with the classical damping approximation. The second investigation involves the analysis of base-isolated liquid storage tanks using the response spectrum method, in which the peak response of the tank in different modes is obtained for the specified response spectrum of earthquake motion and combined with different combination rules. The results indicate that the peak response obtained by the response spectrum method matches well with the corresponding exact response; however, a specific combination rule should be used for better estimation of the various response quantities of the isolated tanks. Finally, closed-form expressions for the modal parameters of base-isolated liquid storage tanks are derived and compared with the exact values. A simplified approximate method is also proposed to evaluate the seismic response of isolated tanks. The response obtained from this approximate method was found to be in good agreement with the exact response.
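
    The modal combination step of the response spectrum method can be sketched with the two rules most often compared, SRSS and CQC (the latter using the standard equal-damping correlation coefficient). The peak modal responses and frequencies below are invented for illustration, not taken from the tank analyses.

```python
import numpy as np

def srss(peaks):
    """Square-root-of-sum-of-squares combination of modal peak responses."""
    peaks = np.asarray(peaks, float)
    return float(np.sqrt(np.sum(peaks ** 2)))

def cqc(peaks, freqs, zeta):
    """Complete quadratic combination with the usual equal-damping correlation."""
    peaks = np.asarray(peaks, float)
    w = 2.0 * np.pi * np.asarray(freqs, float)
    total = 0.0
    for i in range(len(peaks)):
        for j in range(len(peaks)):
            r = w[j] / w[i]
            rho = (8 * zeta**2 * (1 + r) * r**1.5) / (
                (1 - r**2) ** 2 + 4 * zeta**2 * r * (1 + r) ** 2)
            total += rho * peaks[i] * peaks[j]
    return float(np.sqrt(total))

if __name__ == "__main__":
    # Hypothetical peak base shears (kN) and frequencies (Hz) of three modes
    peaks, freqs = [120.0, 850.0, 300.0], [0.3, 3.5, 10.0]
    print("SRSS:", round(srss(peaks), 1), " CQC:", round(cqc(peaks, freqs, zeta=0.05), 1))
```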

  14. Dynamic chest image analysis: model-based pulmonary perfusion analysis with pyramid images

    Science.gov (United States)

    Liang, Jianming; Haapanen, Arto; Jaervi, Timo; Kiuru, Aaro J.; Kormano, Martti; Svedstrom, Erkki; Virkki, Raimo

    1998-07-01

    The aim of the study 'Dynamic Chest Image Analysis' is to develop computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected at different phases of the respiratory/cardiac cycles in a short period of time. We have proposed a framework for ventilation study with an explicit ventilation model based on pyramid images. In this paper, we extend the framework to pulmonary perfusion study. A perfusion model and the truncated pyramid are introduced. The perfusion model aims at extracting accurate, geographic perfusion parameters, and the truncated pyramid helps in understanding perfusion at multiple resolutions and speeding up the convergence process in optimization. Three cases are included to illustrate the experimental results.

  15. Web Based Image Retrieval System Using Color, Texture and Shape Analysis: Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Amol P Bhagat

    2013-09-01

    Full Text Available The internet is one of the best media to disseminate scientific and technological research results [1, 2, 6]. This work deals with the implementation of a web-based extensible architecture that is easily integrable with applications written in different languages and linkable with different data sources. The architecture is expandable and modular; its client-server functionalities permit easily building web applications that can be run using any Internet browser without compatibility problems regarding the platform, programs or operating system installed. This paper presents an implementation of content-based image retrieval using different methods of color, texture and shape analysis. The primary objective is to compare the different methods of image analysis.

  16. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  17. Lunar base thermal management/power system analysis and design

    Science.gov (United States)

    Mcghee, Jerry R.

    1992-01-01

    A compilation of several lunar surface thermal management and power system studies completed under contract and IR&D is presented. The work includes analysis and preliminary design of all major components of an integrated thermal management system, including loads determination, active internal acquisition and transport equipment, external transport systems (active and passive), passive insulation, solar shielding, and a range of lunar surface radiator concepts. Several computer codes were utilized in support of this study, including RADSIM to calculate radiation exchange factors and view factors, RADIATOR (developed in-house) for heat rejection system sizing and performance analysis over a lunar day, SURPWER for power system sizing, and CRYSTORE for cryogenic system performance predictions. Although much of the work was performed in support of lunar rover studies, any or all of the results can be applied to a range of surface applications. Output data include thermal loads summaries, subsystem performance data, mass, and volume estimates (where applicable), integrated and worst-case lunar day radiator size/mass and effective sink temperatures for several concepts (shielded and unshielded), and external transport system performance estimates for both single and two-phase (heat pumped) transport loops. Several advanced radiator concepts are presented, along with brief assessments of possible system benefits and potential drawbacks. System point designs are presented for several cases, executed in support of the contract and IR&D studies, although the parametric nature of the analysis is stressed to illustrate applicability of the analysis procedure to a wide variety of lunar surface systems. The reference configuration(s) derived from the various studies will be presented along with supporting criteria. A preliminary design will also be presented for the reference basing scenario, including qualitative data regarding TPS concerns and issues.

  18. Meta-Analysis of Soybean-based Biodiesel.

    Science.gov (United States)

    Sieverding, Heidi L; Bailey, Lisa M; Hengen, Tyler J; Clay, David E; Stone, James J

    2015-07-01

    Biofuel policy changes in the United States have renewed interest in soybean [Glycine max (L.) Merr.] biodiesel. Past studies with varying methodologies and functional units can provide valuable information for future work. A meta-analysis of nine peer-reviewed soybean life cycle analysis (LCA) biodiesel studies was conducted for the northern Great Plains in the United States. Results of LCA studies were assimilated into a standardized system boundary and functional units for global warming (GWP), eutrophication (EP), and acidification (AP) potentials using biodiesel conversions from peer-reviewed and government documents. Factors not fully standardized included variations in N2O accounting, mid- or end-point impacts, land use change, allocation, and statistical sampling pools. A state-by-state comparison of GWP lower and higher heating values (LHV, HHV) showed differences attributable to variations in spatial sampling and agricultural practices (e.g., tillage, irrigation). The mean GWP (LHV) was 21.1 g CO2-eq/MJ including outliers, and the median EP (LHV) and AP (LHV) were 0.019 g PO4-eq/MJ and 0.17 g SO2-eq/MJ, respectively, using the limited data available. An LCA case study of South Dakota soybean-based biodiesel production resulted in GWP estimates (29 or 31 g CO2-eq/MJ; 100% mono alkyl ester [first generation] biodiesel or 100% fatty acid methyl ester [second generation] biodiesel) similar to the meta-analysis results (30.1 g CO2-eq/MJ). Meta-analysis mean results, including outliers, resemble the California Low Carbon Fuel Standard default value for soybean biodiesel without land use change of 21.25 g CO2-eq/MJ. Results were influenced by differences in resource investment in water, fertilizer (e.g., type, application), and tillage. Future biofuel LCA studies should include these important factors to better define reasonable energy variations in regional agricultural management practices. PMID:26437085

  19. Analysis on electric energy measuring method based on multi-resolution analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiao-bing; CUI Jia-rui; LIANG Yuan-hua; WANG Mu-kun

    2006-01-01

    With the widespread application of non-linear and impact loads, many non-stationary stochastic signals such as harmonics, inter-harmonics, and impulse signals are introduced into the electric network, and these signals affect the accuracy of electric energy measurement. Traditional methods such as Fourier analysis can be applied efficiently to stationary stochastic signals but are poorly suited to non-stationary ones. In light of this, this paper discusses the form of electric network signals in the wavelet domain. A measurement method for active power based on multi-resolution analysis of the stochastic process is presented. This method has a wider scope of application than traditional Fourier analysis and offers good reference and practical value for improving existing electric energy measurement.
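
    The multi-resolution idea can be illustrated with an orthonormal Haar decomposition: because the transform is orthogonal, the active power P = mean(v·i) splits exactly into per-band contributions, which is what lets fundamental and harmonic components be metered separately. The sketch below uses synthetic voltage and current waveforms with an assumed 7th-harmonic component.

```python
import numpy as np

def haar_bands(x, levels):
    """Orthonormal Haar DWT: detail coefficient arrays per level plus the final
    approximation. len(x) must be divisible by 2**levels."""
    bands, a = [], np.asarray(x, float)
    for _ in range(levels):
        d = (a[0::2] - a[1::2]) / np.sqrt(2)
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
        bands.append(d)
    bands.append(a)
    return bands

def active_power_by_band(v, i, levels=4):
    """Active power P = mean(v*i) split into per-band contributions, using the
    orthogonality of the wavelet decomposition (Parseval-type identity)."""
    vb, ib = haar_bands(v, levels), haar_bands(i, levels)
    n = len(v)
    return [float(np.dot(a, b)) / n for a, b in zip(vb, ib)]

if __name__ == "__main__":
    fs, f0, n = 6400, 50, 1024
    t = np.arange(n) / fs
    v = 311 * np.sin(2 * np.pi * f0 * t) + 20 * np.sin(2 * np.pi * 7 * f0 * t)
    i = 10 * np.sin(2 * np.pi * f0 * t - 0.3) + 3 * np.sin(2 * np.pi * 7 * f0 * t - 0.8)
    per_band = active_power_by_band(v, i)
    print("per-band active power (W):", np.round(per_band, 2))
    print("sum of bands:", round(sum(per_band), 2), " direct mean(v*i):", round(float(np.mean(v * i)), 2))
```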

  20. Fission-track dating using object-based image analysis

    International Nuclear Information System (INIS)

    Full text: Geological dating with the help of fission track analysis is based on a time-consuming counting of the spontaneous and induced tracks in the minerals. Fission tracks are damage trails in minerals caused by fast charged particles released in nuclear fission. In this study the ζ-method is used for fission-track dating. In order to determine the age, spontaneous tracks in the apatite and induced tracks in the muscovite external detector have to be counted. Automatic extraction and identification would not only improve the speed of track counting but also eliminate the personal factor. Pixel values alone are not enough to distinguish between tracks and background. Traditional pixel-based approaches are therefore inefficient for fission track counting. Image analysis based on objects, which includes shape, texture and contextual information, is a more promising method. A procedure for automatic object-based classification is used to extract the track objects. Resolving the individual tracks in a multi-track object is based on morphological operations. The individual track objects are skeletonized and the number of individual tracks in the object is counted by processing the skeletons. To give the right fission track age, every single user manually counting tracks has to be calibrated; we calibrate the automatic approach for counting in the same way. Durango apatite standard samples are used to determine the ζ- and Z-calibration factors. The automatic approach is useful for counting tracks in apatite standards and induced tracks in muscovite external detectors, where the quality and quantity of the etched tracks are high. Muscovite detectors irradiated against glasses can also be used to determine the thermal neutron fluence, which is necessary to determine an absolute age. These images are of high quality and free of disturbing background irregularities. Here the automatic approach is a practical alternative. However for natural samples
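
    The abstract does not spell out the exact morphological procedure, so the following is only a plausible sketch of object-based track counting with scikit-image: threshold, remove small objects, label the track objects, skeletonize each object and estimate the number of tracks from skeleton end points. The file name, size threshold and the two-endpoints-per-track heuristic are assumptions.

```python
# Sketch: object-based track counting on a micrograph (assumed file name and thresholds).
import numpy as np
from skimage import io, filters, morphology, measure

img = io.imread("apatite_tracks.tif", as_gray=True)    # hypothetical input image

# Segment dark track objects from the background (Otsu threshold, small-object removal).
mask = img < filters.threshold_otsu(img)
mask = morphology.remove_small_objects(mask, min_size=30)

labels = measure.label(mask)
total_tracks = 0
for region in measure.regionprops(labels):
    # Skeletonize each object; a multi-track cluster yields a branched skeleton.
    skel = morphology.skeletonize(region.image)
    # Count skeleton end points (pixels with exactly one skeleton neighbour).
    ends = 0
    for r, c in np.argwhere(skel):
        nb = skel[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2].sum() - 1
        ends += (nb == 1)
    # Heuristic: each roughly linear track contributes two end points.
    total_tracks += max(1, int(round(ends / 2)))

print("estimated track count:", total_tracks)
```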

  1. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  2. Hierarchical structure analysis describing abnormal base composition of genomes

    Science.gov (United States)

    Ouyang, Zhengqing; Liu, Jian-Kun; She, Zhen-Su

    2005-10-01

    Abnormal base compositional patterns of genomic DNA sequences are studied in the framework of a hierarchical structure (HS) model originally proposed for the study of fully developed turbulence [She and Lévêque, Phys. Rev. Lett. 72, 336 (1994)]. The HS similarity law is verified over scales between 10³ bp and 10⁵ bp, and the HS parameter β is proposed to describe the degree of heterogeneity in the base composition patterns. More than one hundred bacteria, archaea, virus, yeast, and human genome sequences have been analyzed and the results show that the HS analysis efficiently captures abnormal base composition patterns, and the parameter β is a characteristic measure of the genome. Detailed examination of the values of β reveals an intriguing link to the evolutionary events of genetic material transfer. Finally, a sequence complexity (S) measure is proposed to characterize gradual increase of organizational complexity of the genome during the evolution. The present study raises several interesting issues in the evolutionary history of genomes.

  3. A practical approach to object based requirements analysis

    Science.gov (United States)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  4. Aerodynamic flight evaluation analysis and data base update

    Science.gov (United States)

    Boyle, W. W.; Miller, M. S.; Wilder, G. O.; Reheuser, R. D.; Sharp, R. S.; Bridges, G. I.

    1989-01-01

    Research was conducted to determine the feasibility of replacing the Solid Rocket Boosters on the existing Space Shuttle Launch Vehicle (SSLV) with Liquid Rocket Boosters (LRB). As a part of the LRB selection process, a series of wind tunnel tests were conducted along with aero studies to determine the effects of different LRB configurations on the SSLV. Final results were tabulated into increments and added to the existing SSLV data base. The research conducted in this study was taken from a series of wind tunnel tests conducted at Marshall's 14-inch Trisonic Wind Tunnel. The effects on the axial force (CAF), normal force (CNF), pitching moment (CMF), side force (CY), wing shear force (CSR), wing torque moment (CTR), and wing bending moment (CBR) coefficients were investigated for a number of candidate LRB configurations. The aero effects due to LRB protuberances, ET/LRB separation distance, and aft skirts were also gathered from the tests. Analysis was also conducted to investigate the base pressure and plume effects due to the new booster geometries. The test results found in Phases 1 and 2 of wind tunnel testing are discussed and compared. Preliminary LRB lateral/directional data results and trends are given. The protuberance and gap/skirt effects are discussed. The base pressure/plume effects study is discussed and results are given.

  5. X-ray Rietveld analysis with a physically based background

    International Nuclear Information System (INIS)

    On the basis of known equations for calculating X-ray diffraction intensities from a given number of unit cells of a crystal phase in polycrystalline material, as due to: (i) Bragg reflections; (ii) average diffuse scattering caused by thermal plus first-kind disorder; and (iii) incoherent scattering, a relationship has been found that ties, in the Rietveld analysis, the Bragg scale factor to a scale factor for 'disorder' as well as incoherent scattering. Instead of fitting the background with a polynomial function, it becomes possible to describe the background by physically based equations. Air scattering is included in the background simulation. By this means, the refinement can be carried out with fewer parameters (six fewer than when a fifth-order polynomial is used). The DBWS-9006PC computer program written by Sakthivel and Young [(1990), Georgia Institute of Technology, Atlanta, GA, USA] has been modified to follow this approach and it has been used to refine the crystal structures of the cubic form of Y2O3 and of α-Al2O3. Peak asymmetry has been described by a function based on an exponential approximation. The results from refinements using the polynomial and the physically based background functions are, in terms of final structural parameters and reliability indices, very close to each other and in agreement with results reported in the literature. The reconstruction and optimization of the background scattering by means of physically based equations helps the implementation in the Rietveld code of other possible specific diffuse scattering contributions, such as that due to an amorphous phase. (orig.)

  6. Rasch model based analysis of the Force Concept Inventory

    Directory of Open Access Journals (Sweden)

    Maja Planinic

    2010-03-01

    Full Text Available The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One such tool is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight into the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from the large-scale research conducted in 2006-07, which investigated Croatian high school students’ conceptual understanding of mechanics on a representative sample of 1676 students (age 17–18 years). The instrument used in the research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as a pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N=141, average FCI score of 64.5%) of first-year students enrolled in an introductory physics course at the University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further
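
    For readers unfamiliar with the model, the sketch below shows the dichotomous Rasch response probability and a crude item-difficulty estimate on simulated FCI-like data; it is not WINSTEPS and does not use the study's data, and the simulated ability and difficulty distributions are assumptions.

```python
# Sketch: dichotomous Rasch model on simulated FCI-like responses (not the study's data).
import numpy as np

rng = np.random.default_rng(0)
n_persons, n_items = 1676, 30
theta_true = rng.normal(-1.0, 1.0, n_persons)     # person abilities (logits), assumed
b_true = rng.normal(0.0, 1.0, n_items)            # item difficulties (logits), assumed

def rasch_p(theta, b):
    """P(correct) under the Rasch model."""
    return 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))

X = (rng.random((n_persons, n_items)) < rasch_p(theta_true, b_true)).astype(int)

# Very crude difficulty estimate: logit of the proportion correct, centred at zero.
p_item = X.mean(axis=0).clip(1e-3, 1 - 1e-3)
b_est = np.log((1 - p_item) / p_item)
b_est -= b_est.mean()

print("correlation of estimated vs. true difficulties:",
      np.round(np.corrcoef(b_est, b_true)[0, 1], 3))
```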

  7. GIS application on spatial landslide analysis using statistical based models

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis at Penang Island in Malaysia. Landslide locations within the study areas were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.
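
    A minimal sketch of the frequency-ratio step only (the fuzzy logic and logistic regression models are omitted), using a hypothetical table of raster cells: the frequency ratio of each factor class is its landslide share divided by its area share, and the susceptibility index of a cell is the sum of the ratios of its classes. The factor names and data are made up.

```python
# Sketch: frequency-ratio susceptibility index from a hypothetical raster-cell table.
import numpy as np
import pandas as pd

# Each row is a raster cell: classified causative factors plus a landslide flag (assumed data).
cells = pd.DataFrame({
    "slope_class":   np.random.default_rng(1).integers(0, 4, 10000),
    "landuse_class": np.random.default_rng(2).integers(0, 3, 10000),
})
cells["landslide"] = (np.random.default_rng(3).random(10000) < 0.02).astype(int)

def frequency_ratio(df, factor):
    """FR = (% of landslide cells in a class) / (% of all cells in that class)."""
    share_slides = df.groupby(factor)["landslide"].sum() / df["landslide"].sum()
    share_area = df.groupby(factor).size() / len(df)
    return share_slides / share_area

# Susceptibility index of each cell = sum of the FRs of its factor classes.
index = np.zeros(len(cells))
for factor in ["slope_class", "landuse_class"]:
    fr = frequency_ratio(cells, factor)
    index += cells[factor].map(fr).to_numpy()

cells["susceptibility"] = index
print(cells["susceptibility"].describe())
```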

  8. First law-based thermodynamic analysis on Kalina cycle

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Based on the first law of thermodynamics, and adopting the Peng-Robinson equation (P-R equation) as the basic equation for the properties of ammonia-water mixtures, a thermodynamic analysis of a single-stage distillation Kalina cycle is presented. A program to calculate the thermodynamic properties of ammonia-water mixtures, and another for calculating the performance of Kalina cycles, were developed, with which the heat-work conversion characteristics of Kalina cycles were theoretically calculated. The influences on cycle performance of key parameters, such as the pressure and temperature at the turbine inlet, the turbine back pressure, the concentration of the working solution, the concentration of the basic solution and the cycle multiplication ratio, were analyzed.
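
    As an illustration of the equation of state referred to above, the sketch below evaluates the Peng-Robinson EOS for pure ammonia only; the mixing rules needed for ammonia-water mixtures and the cycle calculations themselves are omitted. The critical constants are standard literature values and the example state point is an assumption.

```python
# Sketch: Peng-Robinson EOS for pure ammonia (mixing rules for NH3-H2O are omitted).
import numpy as np

R = 8.314462          # J/(mol K)
Tc, Pc, omega = 405.4, 11.333e6, 0.253   # ammonia critical constants (literature values)

def pr_pressure(T, Vm):
    """Pressure [Pa] from the Peng-Robinson equation of state at T [K] and Vm [m^3/mol]."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    return R * T / (Vm - b) - a * alpha / (Vm**2 + 2 * b * Vm - b**2)

# Example: superheated ammonia vapour at an assumed temperature and molar volume.
T, Vm = 500.0, 8.0e-4
print(f"P = {pr_pressure(T, Vm) / 1e5:.1f} bar at T = {T} K, Vm = {Vm} m^3/mol")
```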

  9. Windows Volatile Memory Forensics Based on Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhang

    2014-03-01

    Full Text Available In this paper, we present an integrated memory forensic solution for multiple Windows memory images. The method computes the degree of correlation among the processes found in volatile memory images and uncovers hidden clues behind computer events, information that is usually difficult to obtain and easily overlooked when forensic investigators analyze a single memory image. To test its validity, we performed an experiment based on the memory images of two hosts involved in criminal incidents. According to the experimental results, the event chains reconstructed by our method are similar to the actual actions in the criminal scene. Investigators can review the digital crime scenario contained in the data set by analyzing the experimental results. This paper aims at finding the relevant actions with illegal intent and at making memory analysis less dependent on the operating system and on expert knowledge.

  10. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that a system runs in working order, detection and diagnosis of faults play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, the essential information of the nonlinear system extracted by KPCA is used to construct a KPCA model of the system under normal working conditions. New data are then projected onto the KPCA model. When new data are incompatible with the KPCA model, it can be concluded that the nonlinear system is out of its normal working condition. The proposed method was applied to fault diagnosis of rolling bearings. Simulation results show that it provides an effective means of fault detection and diagnosis for nonlinear systems.
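
    A minimal sketch of this monitoring scheme using scikit-learn: a KPCA model is fitted on normal-condition data and new samples are flagged when their reconstruction error exceeds a control limit estimated from the normal data. The simulated data, RBF kernel width, number of components and 99th-percentile limit are assumptions, not the paper's settings.

```python
# Sketch: KPCA-based fault detection via reconstruction error (assumed data and settings).
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X_normal = rng.normal(size=(500, 8))                       # normal-condition samples (simulated)
X_new = np.vstack([rng.normal(size=(50, 8)),               # new normal data
                   rng.normal(loc=3.0, size=(50, 8))])     # faulty data with a shifted mean

scaler = StandardScaler().fit(X_normal)
kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.1,
                 fit_inverse_transform=True).fit(scaler.transform(X_normal))

def spe(X):
    """Squared prediction error between samples and their KPCA reconstruction."""
    Z = scaler.transform(X)
    Z_rec = kpca.inverse_transform(kpca.transform(Z))
    return np.sum((Z - Z_rec) ** 2, axis=1)

threshold = np.percentile(spe(X_normal), 99)               # control limit from normal data
faults = spe(X_new) > threshold
print(f"flagged {faults.sum()} of {len(X_new)} new samples as faulty")
```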

  11. Identification and annotation of erotic film based on content analysis

    Science.gov (United States)

    Wang, Donghui; Zhu, Miaoliang; Yuan, Xin; Qian, Hui

    2005-02-01

    The paper brings forward a new method for identifying and annotating erotic films based on content analysis. First, the film is decomposed into video and audio streams. Then, the video stream is segmented into shots and key frames are extracted from each shot. We filter the shots that include potential erotic content by finding the nude human body in key frames. A Gaussian model in YCbCr color space for detecting skin regions is presented. An external polygon that covers the skin regions is used as an approximation of the human body. Finally, we give the degree of nudity by calculating the ratio of skin area to whole-body area with weighted parameters. The result of the experiment shows the effectiveness of our method.
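
    A sketch of the skin-colour step only (shot segmentation, body-polygon fitting and the weighting are omitted): pixels are converted to YCbCr with the standard BT.601 formulas, scored with a Gaussian model on (Cb, Cr), thresholded, and a skin-area ratio is reported. The mean, covariance and threshold below are placeholders, not values from the paper.

```python
# Sketch: Gaussian skin-colour scoring in YCbCr and a skin-area ratio (placeholder parameters).
import numpy as np

def rgb_to_cbcr(img):
    """ITU-R BT.601 conversion of an RGB image (H, W, 3, floats 0-255) to Cb/Cr channels."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([cb, cr], axis=-1)

# Placeholder skin model: mean and covariance of (Cb, Cr) over labelled skin pixels.
mu = np.array([110.0, 150.0])
cov = np.array([[80.0, 15.0], [15.0, 120.0]])
cov_inv = np.linalg.inv(cov)

def skin_likelihood(img):
    """Unnormalised Gaussian likelihood of each pixel belonging to the skin class."""
    d = rgb_to_cbcr(img) - mu
    m2 = np.einsum("...i,ij,...j->...", d, cov_inv, d)     # squared Mahalanobis distance
    return np.exp(-0.5 * m2)

frame = np.random.default_rng(0).uniform(0, 255, (240, 320, 3))   # stand-in key frame
skin_mask = skin_likelihood(frame) > 0.5                          # assumed threshold
print("skin-area ratio:", skin_mask.mean())
```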

  12. Visual traffic jam analysis based on trajectory data.

    Science.gov (United States)

    Wang, Zuchao; Lu, Min; Yuan, Xiaoru; Zhang, Junping; van de Wetering, Huub

    2013-12-01

    In this work, we present an interactive system for visual analysis of urban traffic congestion based on GPS trajectories. For these trajectories we develop strategies to extract and derive traffic jam information. After cleaning the trajectories, they are matched to a road network. Subsequently, traffic speed on each road segment is computed and traffic jam events are automatically detected. Spatially and temporally related events are concatenated into so-called traffic jam propagation graphs. These graphs form a high-level description of a traffic jam and its propagation in time and space. Our system provides multiple views for visually exploring and analyzing the traffic condition of a large city as a whole, on the level of propagation graphs, and on the road segment level. Case studies with 24 days of taxi GPS trajectories collected in Beijing demonstrate the effectiveness of our system. PMID:24051782

  13. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  14. Fingerprint image segmentation based on multi-features histogram analysis

    Science.gov (United States)

    Wang, Peng; Zhang, Youguang

    2007-11-01

    An effective fingerprint image segmentation method based on multi-feature histogram analysis is presented. We extract a new feature and combine it with three other features to segment fingerprints. Two of these four features, each of which is related to one of the other two, are reciprocals of each other, so the features are divided into two groups. The histograms of these two features are calculated to determine which feature group is used to segment the target fingerprint. The features can also divide fingerprints into high- and low-quality classes. Experimental results show that our algorithm classifies foreground and background effectively at a lower computational cost, reduces the number of spurious minutiae detected, and improves the performance of AFIS.

  15. Structural Optimization based on the Concept of First Order Analysis

    International Nuclear Information System (INIS)

    Computer Aided Engineering (CAE) has been successfully utilized in mechanical industries such as the automotive industry. It is, however, difficult for most mechanical design engineers to directly use CAE due to the sophisticated nature of the operations involved. In order to mitigate this problem, a new type of CAE, First Order Analysis (FOA) has been proposed. This paper presents the outcome of research concerning the development of a structural topology optimization methodology within FOA. This optimization method is constructed based on discrete and function-oriented elements such as beam and panel elements, and sequential convex programming. In addition, examples are provided to show the utility of the methodology presented here for mechanical design engineers

  16. Performance analysis of charge plasma based dual electrode tunnel FET

    Science.gov (United States)

    Anand, Sunny; Intekhab Amin, S.; Sarin, R. K.

    2016-05-01

    This paper proposes the charge plasma based dual electrode doping-less tunnel FET (DEDLTFET). The paper compares the device performance of the conventional doping-less TFET (DLTFET) and the doped TFET (DGTFET). DEDLTFET gives superior results, with a high ON-state current (ION ∼ 0.56 mA/μm), an ION/IOFF ratio ∼ 9.12 × 10¹³ and an average subthreshold swing (AV-SS ∼ 48 mV/dec). Variations of different device parameters such as channel length, gate oxide material, gate oxide thickness, silicon thickness, gate work function and temperature were studied and compared with DLTFET and DGTFET. Through this extensive analysis it is found that DEDLTFET shows better performance than the other two devices, indicating an excellent future in low-power applications.

  17. Maintenance management of railway infrastructures based on reliability analysis

    International Nuclear Information System (INIS)

    Railway infrastructure maintenance plays a crucial role for rail transport. It aims at guaranteeing safety of operations and availability of railway tracks and related equipment for traffic regulation. Moreover, it is one major cost for rail transport operations. Thus, the increased competition in traffic market is asking for maintenance improvement, aiming at the reduction of maintenance expenditures while keeping the safety of operations. This issue is addressed by the methodology presented in the paper. The first step of the methodology consists of a family-based approach for the equipment reliability analysis; its purpose is the identification of families of railway items which can be given the same reliability targets. The second step builds the reliability model of the railway system for identifying the most critical items, given a required service level for the transportation system. The two methods have been implemented and tested in practical case studies, in the context of Rete Ferroviaria Italiana, the Italian public limited company for railway transportation.

  18. Vocoder analysis based on properties of the human auditory system

    Science.gov (United States)

    Gold, B.; Tierney, J.

    1983-12-01

    When a person listens to speech corrupted by noise or other adverse environmental factors, speech intelligibility may be impaired slightly or not at all. The same corrupted speech, after being vocoded, often causes drastic intelligibility loss. The loss is due to the fact that the human peripheral auditory system is a superior signal processor to that of the vocoder. This report is based on the premise that a vocoder analyzer that better resembles the peripheral auditory system would function in a superior manner to present-day vocoders. Topics include reviews of speech enhancement techniques, perceptual analysis of diagnostic rhyme test data, a brief description of the peripheral auditory system and an outline of proposed psychophysical tests. The final section is devoted to a discussion of some preliminary work on computer simulation of an auditory model.

  19. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, such as simplified query languages, out-of-date information or arbitrary sorting of results. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web page content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of the Web page content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  20. Analysis of equivalent antenna based on FDTD method

    Institute of Scientific and Technical Information of China (English)

    Yun-xing YANG; Hui-chang ZHAO; Cui DI

    2014-01-01

    An equivalent microstrip antenna used in radio proximity fuse is presented. The design of this antenna is based on multilayer multi-permittivity dielectric substrate which is analyzed by finite difference time domain (FDTD) method. Equivalent iterative formula is modified in the condition of cylindrical coordinate system. The mixed substrate which contains two kinds of media (one of them is air) takes the place of original single substrate. The results of equivalent antenna simulation show that the resonant frequency of equivalent antenna is similar to that of the original antenna. The validity of analysis can be validated by means of antenna resonant frequency formula. Two antennas have same radiation pattern and similar gain. This method can be used to reduce the weight of antenna, which is significant to the design of missile-borne antenna.

  1. Architecture Analysis of an FPGA-Based Hopfield Neural Network

    Directory of Open Access Journals (Sweden)

    Miguel Angelo de Abreu de Sousa

    2014-01-01

    Full Text Available Interconnections between electronic circuits and neural computation have been a strongly researched topic in the machine learning field in order to approach several practical requirements, including decreasing training and operation times in high performance applications and reducing cost, size, and energy consumption for autonomous or embedded developments. Field programmable gate array (FPGA) hardware shows some inherent features typically associated with neural networks, such as parallel processing, modular executions, and dynamic adaptation, and works on different types of FPGA-based neural networks were presented in recent years. This paper aims to address different aspects of architectural characteristics analysis on a Hopfield Neural Network implemented in FPGA, such as maximum operating frequency and chip-area occupancy according to the network capacity. Also, the FPGA implementation methodology, which does not employ multipliers in the architecture developed for the Hopfield neural model, is presented in detail.
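
    To make the model being mapped to hardware concrete, here is a minimal software Hopfield network with Hebbian weights and bipolar states; because states are ±1, each product in the update reduces to an addition or subtraction, which is the property that multiplier-free hardware implementations exploit. The network size and patterns below are arbitrary, not the paper's test cases.

```python
# Sketch: software reference for a Hopfield network with bipolar states (arbitrary patterns).
import numpy as np

patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, 1, 1, -1, -1, -1, -1]])
n = patterns.shape[1]

# Hebbian storage rule; zero diagonal so a neuron does not feed itself.
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0.0)

def recall(state, steps=10):
    """Asynchronous updates: with +/-1 states, W @ state is just signed additions."""
    s = state.copy()
    for _ in range(steps):
        for j in np.random.default_rng(0).permutation(n):
            s[j] = 1 if W[j] @ s >= 0 else -1
    return s

noisy = patterns[0].copy()
noisy[:2] *= -1                      # flip two bits of the first stored pattern
print("recovered first pattern:", np.array_equal(recall(noisy), patterns[0]))
```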

  2. Pressure Control in Distillation Columns: A Model-Based Analysis

    DEFF Research Database (Denmark)

    Mauricio Iglesias, Miguel; Bisgaard, Thomas; Kristensen, Henrik;

    2014-01-01

    A comprehensive assessment of pressure control in distillation columns is presented, including the consequences for composition control and energy consumption. Two types of representative control structures are modeled, analyzed, and benchmarked. A detailed simulation test, based on a real...... industrial distillation column, is used to assess the differences between the two control structures and to demonstrate the benefits of pressure control in the operation. In the second part of the article, a thermodynamic analysis is carried out to establish the influence of pressure on relative volatility...... differences. Depending on the sensitivity of relative volatility to pressure, it is shown that controlling the bottom-tray pressure instead of the top-tray pressure leads to operation at the minimum possible average column pressure, so that significant energy savings can be achieved....

  3. Analysis of equivalent antenna based on FDTD method

    Directory of Open Access Journals (Sweden)

    Yun-xing Yang

    2014-09-01

    Full Text Available An equivalent microstrip antenna used in radio proximity fuse is presented. The design of this antenna is based on multilayer multi-permittivity dielectric substrate which is analyzed by finite difference time domain (FDTD) method. Equivalent iterative formula is modified in the condition of cylindrical coordinate system. The mixed substrate which contains two kinds of media (one of them is air) takes the place of original single substrate. The results of equivalent antenna simulation show that the resonant frequency of equivalent antenna is similar to that of the original antenna. The validity of analysis can be validated by means of antenna resonant frequency formula. Two antennas have same radiation pattern and similar gain. This method can be used to reduce the weight of antenna, which is significant to the design of missile-borne antenna.

  4. SILAC-based comparative analysis of pathogenic Escherichia coli secretomes

    DEFF Research Database (Denmark)

    Boysen, Anders; Borch, Jonas; Krogh, Thøger Jensen;

    2015-01-01

    Comparative studies of pathogenic bacteria and their non-pathogenic counterparts have led to the discovery of important virulence factors, thereby generating insight into mechanisms of pathogenesis. Protein-based antigens for vaccine development are primarily selected among unique virulence... proteome analysis has the potential to discover both classes of proteins and hence forms an important tool for discovering therapeutic targets. Adherent-invasive Escherichia coli (AIEC) and Enterotoxigenic E. coli (ETEC) are pathogenic variants of E. coli which cause intestinal disease in humans. AIEC is... In this study, we grew the pathogenic strains ETEC H10407, AIEC LF82 and the non-pathogenic reference strain E. coli K-12 MG1655 in parallel and used SILAC to compare protein levels in OMVs and culture supernatant. We have identified well-known virulence factors from both AIEC and ETEC, thus validating...

  5. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  6. Diversity analysis for Magnaporthe grisea by Pot2-based PCR

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    CO39, its near-isogenic lines carrying single blast resistance genes (C101LAC, C101A51, C104PKT, and C101PKT), and its resistance gene pyramid lines (BL121, BL241, and A57-119) were grown in the blast nursery at IRRI. The seeds were sown in four batches at two-week intervals, using IR50 and IR72 as spreader rows and susceptible controls. To keep the leaves moist, water was sprayed every two hours from 8:30 am on sunny days. Blast disease was scored and the pathogen was isolated every two weeks. DNA samples of 310 isolates were used for diversity analysis by Pot2-based PCR (Pot2, a dispersed retrotransposon of the fungus).

  7. Quaternion-based discriminant analysis method for color face recognition.

    Science.gov (United States)

    Xu, Yong

    2012-01-01

    Pattern recognition techniques have been used to automatically recognize objects and personal identities, predict protein function and cancer categories, identify lesions, perform product inspection, and so on. In this paper we propose a novel quaternion-based discriminant method. This method represents and classifies color images in a simple and mathematically tractable way. The proposed method is suitable for a large variety of real-world applications such as color face recognition and classification of ground targets shown in multispectral remote sensing images. This method first uses a quaternion number to denote each pixel in the color image and exploits a quaternion vector to represent the color image. This method then uses the linear discriminant analysis algorithm to transform the quaternion vector into a lower-dimensional quaternion vector and classifies it in this space. The experimental results show that the proposed method can obtain a very high accuracy for color face recognition. PMID:22937054

  8. Fuzzy MCDM Based on Fuzzy Relational Degree Analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a new fuzzy multiple criteria (both qualitative and quantitative) decision-making (MCDM) method based on fuzzy relational degree analysis. The concepts of fuzzy set theory are used to construct a weighted suitability decision matrix to evaluate the weighted suitability of different alternatives versus various criteria. The positive ideal solution and negative ideal solution are then obtained by using a method of ranking fuzzy numbers, and the fuzzy relational degrees of different alternatives versus positive ideal solution and negative ideal solution are calculated by using the proposed arithmetic. Finally, the relative relational degrees of various alternatives versus positive ideal solution are ranked to determine the best alternative. A numerical example is provided to illustrate the proposed method at the end of this paper.
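
    The sketch below shows only the crisp skeleton of such a procedure (a weighted normalised decision matrix, ideal and negative-ideal references, relational degrees and a final ranking) with made-up scores and weights; the paper's fuzzy-number arithmetic and fuzzy ranking are not reproduced, and the grey-relational-style coefficient used here is one common choice, not necessarily the paper's.

```python
# Sketch: crisp skeleton of a relational-degree MCDM ranking (made-up data; the paper's
# fuzzy-number arithmetic and fuzzy ranking are omitted).
import numpy as np

scores = np.array([[7.0, 0.62, 5.0],        # alternatives x criteria (benefit criteria)
                   [9.0, 0.45, 6.5],
                   [6.5, 0.80, 4.0]])
weights = np.array([0.5, 0.3, 0.2])

# Weighted, max-normalised decision matrix.
V = weights * scores / scores.max(axis=0)

ideal, anti_ideal = V.max(axis=0), V.min(axis=0)

def relational_degree(V, ref, rho=0.5):
    """Grey-relational-style degree of each alternative to a reference vector."""
    delta = np.abs(V - ref)
    coeff = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coeff.mean(axis=1)

r_pos = relational_degree(V, ideal)
r_neg = relational_degree(V, anti_ideal)
closeness = r_pos / (r_pos + r_neg)          # relative degree to the positive ideal
print("ranking (best first):", np.argsort(-closeness))
```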

  9. Iris recognition based on robust principal component analysis

    Science.gov (United States)

    Karn, Pradeep; He, Xiao Hai; Yang, Shuai; Wu, Xiao Hong

    2014-11-01

    Iris images acquired under different conditions often suffer from blur, occlusion due to eyelids and eyelashes, specular reflection, and other artifacts. Existing iris recognition systems do not perform well on these types of images. To overcome these problems, we propose an iris recognition method based on robust principal component analysis. The proposed method decomposes all training images into a low-rank matrix and a sparse error matrix, where the low-rank matrix is used for feature extraction. The sparsity concentration index approach is then applied to validate the recognition result. Experimental results using the CASIA V4 and IIT Delhi V1 iris image databases showed that the proposed method achieved competitive performance in both recognition accuracy and computational efficiency.
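
    A sketch of the decomposition step via principal component pursuit solved with an inexact augmented-Lagrange-multiplier iteration, a common solver that may differ from the authors'; D would hold vectorised training iris images as columns, but random low-rank-plus-sparse data are used here, and the parameter defaults are the usual textbook choices.

```python
# Sketch: robust PCA (principal component pursuit) via an inexact ALM iteration.
import numpy as np

def shrink(X, tau):
    """Element-wise soft thresholding."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def robust_pca(D, lam=None, tol=1e-7, max_iter=500):
    m, n = D.shape
    lam = lam or 1.0 / np.sqrt(max(m, n))
    mu = m * n / (4.0 * np.abs(D).sum())
    L = np.zeros_like(D); S = np.zeros_like(D); Y = np.zeros_like(D)
    norm_D = np.linalg.norm(D, "fro")
    for _ in range(max_iter):
        # Low-rank update: singular value thresholding.
        U, sig, Vt = np.linalg.svd(D - S + Y / mu, full_matrices=False)
        L = (U * shrink(sig, 1.0 / mu)) @ Vt
        # Sparse update: soft thresholding.
        S = shrink(D - L + Y / mu, lam / mu)
        Y += mu * (D - L - S)
        if np.linalg.norm(D - L - S, "fro") / norm_D < tol:
            break
    return L, S

rng = np.random.default_rng(0)
D = rng.normal(size=(100, 20)) @ rng.normal(size=(20, 60))      # low-rank part
D[rng.random(D.shape) < 0.05] += 10.0                           # sparse corruptions
L, S = robust_pca(D)
print("rank of recovered low-rank part:", np.linalg.matrix_rank(L, tol=1e-6))
```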

  10. UNRAVELING ECOTOURISM PRACTICE:PROBLEM ANALYSIS BASED ON STAKEHOLDERS

    Institute of Scientific and Technical Information of China (English)

    LIU Xue-mei; BAO Ji-gang

    2004-01-01

    Despite a considerable literature defining what ecotourism is or should be, it is practiced in widely varying ways. The term "ecotourism" is now applied to almost all nature-based tourism activities. Faced with this flood of unqualified ecotourism, it is necessary to put forward professional criteria. The present writer holds that the key to the realization of rigorous ecotourism chiefly lies in the relationships among the different interest groups involved in it. The focus of this paper is therefore an analysis of the interest relations among these stakeholders, which include local government, tour operators, local residents and eco-tourists, so as to identify what is wrong in unqualified ecotourism practice and the roots of those problems.

  11. Choosing a Commercial Broiler Strain Based on Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Hosseini SA

    2014-05-01

    Full Text Available Given the complexity and amount of information in the wide variety of comparative performance reports in poultry production, making a decision is difficult. This problem is overcome only when all data can be put into a common unit. For this purpose, five different decision-making approaches, including Maximin, Equally likely, Weighted average, Ordered weighted averages and the Technique for order preference by similarity to ideal solution (TOPSIS), were used to choose the best broiler strain among three based on their comparative performance and carcass characteristics. A total of 6000 commercial broilers of three strains designated R, A, and C (2000 per strain) were randomly allocated into three treatments of five replicates. In this study, all methods showed similar results except the Maximin approach. Comparing the different methods indicated that strain C, with the highest world market share, had the best performance, followed by strains R and A.

  12. A Frame-Based Analysis of Synaesthetic Metaphors

    Directory of Open Access Journals (Sweden)

    Hakan Beseoglu

    2008-08-01

    Full Text Available The aim of this paper is to use a frame-based account to explain some empirical findings regarding the accessibility of synaesthetic metaphors. Therefore, some results of empirical studies will be discussed with regard to the question of how much it matters whether the concept of the source domain in a synaesthetic metaphor is a scalar or a quality concept. Furthermore, typed frames are introduced, and it is explained how the notion of a minimal upper attribute can be used in the analysis of adjective-noun compounds. Finally, frames are used to analyze synaesthetic metaphors; it turns out that they offer an adequate basis for the explanation of different accessibility rates found in empirical studies.

  13. Retention failure analysis of metal-oxide based resistive memory

    Science.gov (United States)

    Choi, Shinhyun; Lee, Jihang; Kim, Sungho; Lu, Wei D.

    2014-09-01

    Resistive switching devices (RRAMs) have been proposed as a promising candidate for future memory and neuromorphic applications. Central to the successful application of these emerging devices is the understanding of the resistance switching and failure mechanism, and identification of key physical parameters that will enable continued device optimization. In this study, we report detailed retention analysis of a TaOx-based RRAM at high temperatures and the development of a microscopic oxygen diffusion model that fully explains the experimental results and can be used to guide future device developments. The device conductance in the low resistance state (LRS) was constantly monitored at several elevated temperatures (above 300 °C), and an initial gradual conductivity drift followed by a sudden conductance drop were observed during retention failure. These observations were explained by a microscopic model based on oxygen vacancy diffusion, which quantitatively explains both the initial gradual conductance drift and the sudden conductance drop. Additionally, a non-monotonic conductance change, with an initial conductance increase followed by the gradual conductance decay over time, was observed experimentally and explained within the same model framework. Specifically, our analysis shows that important microscopic physical parameters such as the activation energy for oxygen vacancy migration can be directly calculated from the failure time versus temperature relationship. Results from the analytical model were further supported by detailed numerical multi-physics simulation, which confirms the filamentary nature of the conduction path in LRS and the importance of oxygen vacancy diffusion in device reliability. Finally, these high-temperature stability measurements also reveal the existence of multiple filaments in the same device.
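
    The extraction of the activation energy from the failure-time-versus-temperature relationship mentioned above can be illustrated with a simple Arrhenius fit; the temperatures and failure times below are made up for illustration and are not the measured data.

```python
# Sketch: activation energy from an Arrhenius fit of retention failure times (made-up data).
import numpy as np

k_B = 8.617e-5                                   # Boltzmann constant [eV/K]
T = np.array([573.0, 598.0, 623.0, 648.0])       # stress temperatures [K] (assumed)
t_fail = np.array([3.2e4, 9.5e3, 3.1e3, 1.1e3])  # observed failure times [s] (assumed)

# Arrhenius model: t_fail = t0 * exp(Ea / (k_B * T))  =>  ln(t_fail) is linear in 1/(k_B*T).
slope, intercept = np.polyfit(1.0 / (k_B * T), np.log(t_fail), 1)
Ea = slope                                        # activation energy [eV]
print(f"extracted Ea = {Ea:.2f} eV, prefactor t0 = {np.exp(intercept):.2e} s")
```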

  14. POSSIBILITY AND EVIDENCE-BASED RELIABILITY ANALYSIS AND DESIGN OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Hong-Zhong Huang

    2013-01-01

    Full Text Available Engineering design under uncertainty has gained considerable attention in recent years. A great multitude of new design optimization methodologies and reliability analysis approaches have been put forth with the aim of accommodating various uncertainties. Uncertainties in practical engineering applications are commonly classified into two categories, i.e., aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty arises because of unpredictable variation in the performance and processes of systems; it is irreducible even when more data or knowledge are added. On the other hand, epistemic uncertainty stems from lack of knowledge of the system due to limited data, measurement limitations, or simplified approximations in modeling system behavior, and it can be reduced by obtaining more data or knowledge. More specifically, aleatory uncertainty is naturally represented by a statistical distribution whose associated parameters can be characterized by sufficient data. If, however, the data are limited and cannot be characterized in a statistical sense, an epistemic uncertainty treatment can be considered as an alternative in such a situation. Of the several optional treatments for epistemic uncertainty, possibility theory and evidence theory have proved to be the most computationally efficient and stable for reliability analysis and engineering design optimization. This study first attempts to provide a better understanding of uncertainty in engineering design by giving a comprehensive overview of its classifications, theories and design considerations. Then a review is conducted of general topics such as the foundations and applications of possibility theory and evidence theory. This overview includes the most recent results from theoretical research, computational developments and performance improvement of possibility theory and evidence theory with an emphasis on revealing the capability and characteristics of quantifying uncertainty from different perspectives

  15. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points must be generated to fill the design space, which can be very expensive and even impractical in engineering analysis. Therefore, in this paper a new method, the cokriging method, an extension of kriging, is proposed to calculate structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method on some numerical examples. The results indicate that the cokriging procedure described in this work generates approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.

  16. Atomic force microscopy-based shape analysis of heart mitochondria.

    Science.gov (United States)

    Lee, Gi-Ja; Park, Hun-Kuk

    2015-01-01

    Atomic force microscopy (AFM) has become an important tool for the noninvasive imaging of cells and biomaterials in medical, biological, and biophysical research. The major advantages of AFM over conventional optical and electron microscopes for bio-imaging include the facts that no special coating is required and that imaging can be done in all environments: air, vacuum, or aqueous conditions. In addition, it can also precisely determine pico- to nano-Newton force interactions between the probe tip and the sample surface from force-distance curve measurements. It is widely known that mitochondrial swelling is one of the most important indicators of the opening of the mitochondrial permeability transition (MPT) pore. As mitochondrial swelling is an ultrastructural change, quantitative analysis of this change requires high-resolution microscopic methods such as AFM. Here, we describe the use of AFM-based shape analysis for the characterization of nanostructural changes in heart mitochondria resulting from myocardial ischemia-reperfusion injury. PMID:25634291

  17. Aroma characterization based on aromatic series analysis in table grapes

    Science.gov (United States)

    Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping

    2016-01-01

    Aroma is an important part of quality in table grape, but the key aroma compounds and the aroma series of table grapes remains unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; 20 in pulp and 23 in skin were active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most obviously, ‘Kyoho’ grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. For the aroma series, table grapes were characterized mainly by herbaceous, floral, balsamic, sweet and fruity series. The simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we could conclude that fatty and balsamic series were the preferred aromatic series, and the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes. PMID:27487935
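
    A sketch of the multivariate step only, with random stand-in data in place of the measured aroma-series contents: standardised cultivar profiles are projected onto two principal components and grouped by Ward-linkage hierarchical clustering; the number of cultivars, series and clusters are taken from the abstract, everything else is assumed.

```python
# Sketch: PCA and hierarchical clustering of cultivar aroma-series profiles (stand-in data).
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Rows: 20 cultivars; columns: aroma series (herbaceous, floral, balsamic, sweet, fruity).
profiles = rng.lognormal(mean=0.0, sigma=1.0, size=(20, 5))

# Standardise, then project onto the first two principal components.
Z = (profiles - profiles.mean(axis=0)) / profiles.std(axis=0)
pca = PCA(n_components=2).fit(Z)
scores = pca.transform(Z)

# Agglomerative clustering (Ward linkage) cut into five groups, as for the pulp-juice profiles.
groups = fcluster(linkage(Z, method="ward"), t=5, criterion="maxclust")

print("variance explained by PC1/PC2:", np.round(pca.explained_variance_ratio_, 2))
print("cluster assignment per cultivar:", groups)
```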

  18. Validation test case generation based on safety analysis ontology

    International Nuclear Information System (INIS)

    Highlights: ► Current practice in validation test case generation for nuclear system is mainly ad hoc. ► This study designs a systematic approach to generate validation test cases from a Safety Analysis Report. ► It is based on a domain-specific ontology. ► Test coverage criteria have been defined and satisfied. ► A computerized toolset has been implemented to assist the proposed approach. - Abstract: Validation tests in the current nuclear industry practice are typically performed in an ad hoc fashion. This study presents a systematic and objective method of generating validation test cases from a Safety Analysis Report (SAR). A domain-specific ontology was designed and used to mark up a SAR; relevant information was then extracted from the marked-up document for use in automatically generating validation test cases that satisfy the proposed test coverage criteria; namely, single parameter coverage, use case coverage, abnormal condition coverage, and scenario coverage. The novelty of this technique is its systematic rather than ad hoc test case generation from a SAR to achieve high test coverage.

  19. Web-based analysis of the mouse transcriptome using Genevestigator

    Directory of Open Access Journals (Sweden)

    Gruissem Wilhelm

    2006-06-01

    Full Text Available Background: Gene function analysis often requires a complex and laborious sequence of laboratory and computer-based experiments. Choosing an effective experimental design generally results from hypotheses derived from prior knowledge or experimentation. Knowledge obtained from meta-analyzing compendia of expression data with annotation libraries can provide significant clues in understanding gene and network function, resulting in better hypotheses that can be tested in the laboratory. Description: Genevestigator is a microarray database and analysis system allowing context-driven queries. Simple but powerful tools allow biologists with little computational background to retrieve information about when, where and how genes are expressed. We manually curated and quality-controlled 3110 mouse Affymetrix arrays from public repositories. Data queries can be run against an annotation library comprising 160 anatomy categories, 12 developmental stage groups, 80 stimuli, and 182 genetic backgrounds or modifications. The quality of results obtained through Genevestigator is illustrated by a number of biological scenarios that are substantiated by other types of experimentation in the literature. Conclusion: The Genevestigator-Mouse database effectively provides biologically meaningful results and can be accessed at https://www.genevestigator.ethz.ch.

  20. Series Arc Fault Detection Algorithm Based on Autoregressive Bispectrum Analysis

    Directory of Open Access Journals (Sweden)

    Kai Yang

    2015-10-01

    Full Text Available Arc faults are one of the most critical causes of electrical fires. Due to the diversity, randomness and concealment of arc faults in low-voltage circuits, it is difficult for general methods to protect all loads from series arc faults. From the analysis of many series arc faults, a large number of high frequency signals generated in circuits are found. These signals are easily affected by Gaussian noise, which is difficult to eliminate because of frequency aliasing. Thus, a novel detection algorithm is developed to accurately detect series arc faults in this paper. Initially, an autoregressive model of the mixed high frequency signals is built. Then, autoregressive bispectrum analysis is introduced to analyze common series arc fault features. The phase information of the arc fault signal is preserved using this method, and the influence of Gaussian noise is restrained effectively. Afterwards, several features including characteristic frequency, fluctuation of phase angles, diffused distribution and incremental numbers of bispectrum peaks are extracted for recognizing arc faults. Finally, a least squares support vector machine is used to accurately identify series arc faults from the load states based on these frequency features of the bispectrum. The validity of the algorithm is experimentally verified, achieving an arc fault detection rate above 97%.
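
    Two of the steps described above, AR model fitting (here via Yule-Walker) and a direct bispectrum estimate, can be sketched as below; the test signal, segment length and model order are assumptions, and the feature extraction and LS-SVM classification stages are omitted.

```python
# Sketch: AR fit (Yule-Walker) and a direct bispectrum estimate (assumed signal/parameters;
# the feature extraction and LS-SVM classification stages are omitted).
import numpy as np
from scipy.linalg import solve_toeplitz

def ar_yule_walker(x, order):
    """AR coefficients a[1..p] of x[n] = sum_k a[k] x[n-k] + e[n] via Yule-Walker."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)
    return solve_toeplitz(r[:order], r[1:order + 1])

def bispectrum(x, seg_len=128):
    """Averaged direct bispectrum B(f1, f2) = E[X(f1) X(f2) X*(f1 + f2)] over segments."""
    segs = x[: len(x) // seg_len * seg_len].reshape(-1, seg_len) * np.hanning(seg_len)
    B = np.zeros((seg_len, seg_len), dtype=complex)
    k = np.arange(seg_len)
    for X in np.fft.fft(segs, axis=1):
        B += np.outer(X, X) * np.conj(X[(k[:, None] + k[None, :]) % seg_len])
    return B / len(segs)

rng = np.random.default_rng(0)
arc_like = rng.normal(size=4096)
arc_like[::97] += 6.0 * rng.choice([-1, 1], size=len(arc_like[::97]))   # random spikes

a = ar_yule_walker(arc_like, order=8)
B = bispectrum(arc_like)
print("AR coefficients:", np.round(a, 3))
print("largest bispectrum magnitude:", np.abs(B).max())
```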