Model-free functional MRI analysis using cluster-based methods
Otto, Thomas D.; Meyer-Baese, Anke; Hurdal, Monica; Sumners, DeWitt; Auer, Dorothee; Wismuller, Axel
2003-08-01
Conventional model-based or statistical analysis methods for functional MRI (fMRI) are easy to implement, and are effective in analyzing data with simple paradigms. However, they are not applicable in situations in which patterns of neural response are complicated and when the fMRI response is unknown. In this paper the "neural gas" network is adapted and rigorously studied for analyzing fMRI data. The algorithm supports spatial connectivity, aiding in the identification of activation sites in functional brain imaging. A comparison of this new method with Kohonen's self-organizing map and with a minimal free energy vector quantizer is done in a systematic fMRI study with comparative quantitative evaluations. The most important findings in this paper are: (1) the "neural gas" network outperforms the other two methods in terms of detecting small activation areas, and (2) comparisons based on computed reference functions likewise show that the "neural gas" network outperforms the other two methods. The applicability of the new algorithm is demonstrated on experimental data.
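The clustering step at the heart of the abstract above can be illustrated with a minimal NumPy sketch of the rank-based "neural gas" update (in the Martinetz style). This is not the authors' fMRI pipeline: the annealing schedule and the toy 2-D data standing in for voxel time courses are assumptions.

```python
import numpy as np

def neural_gas(X, n_units=4, n_iter=2000, eps0=0.5, eps_f=0.01,
               lam0=None, lam_f=0.1, seed=0):
    """Rank-based 'neural gas' vector quantization (illustrative sketch).

    X: (n_samples, n_features) array, e.g. voxel time courses.
    Returns the learned codebook vectors, shape (n_units, n_features).
    """
    rng = np.random.default_rng(seed)
    if lam0 is None:
        lam0 = n_units / 2.0
    # initialize codebook vectors from random data points
    W = X[rng.choice(len(X), n_units, replace=False)].astype(float)
    for t in range(n_iter):
        frac = t / n_iter
        eps = eps0 * (eps_f / eps0) ** frac   # annealed learning rate
        lam = lam0 * (lam_f / lam0) ** frac   # annealed neighborhood range
        x = X[rng.integers(len(X))]
        # rank every unit by its distance to the sample (closest gets rank 0)
        ranks = np.argsort(np.argsort(np.linalg.norm(W - x, axis=1)))
        # all units move toward the sample, weighted by exp(-rank/lambda)
        W += eps * np.exp(-ranks / lam)[:, None] * (x - W)
    return W
```

Unlike a self-organizing map, no fixed lattice topology is imposed; the neighborhood is defined purely by distance ranks, which is what makes the method robust for irregular cluster shapes.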
Model-free model elimination: A new step in the model-free dynamic analysis of NMR relaxation data
Model-free analysis is a technique commonly used within the field of NMR spectroscopy to extract atomic-resolution, interpretable dynamic information on multiple timescales from the R1, R2, and steady-state NOE. Model-free approaches employ two disparate areas of data analysis: the discipline of mathematical optimisation, specifically the minimisation of a χ2 function, and the statistical field of model selection. By searching through a large number of model-free minimisations, which were set up using synthetic relaxation data whereby the true underlying dynamics is known, certain model-free models have been identified to, at times, fail. This failure has been characterised as either the internal correlation times, τe, τf, or τs, or the global correlation time parameter, local τm, heading towards infinity, the result being that the final parameter values are far from the true values. In a number of cases the minimised χ2 value of the failed model is significantly lower than that of all other models and, hence, it will be the model chosen by model selection techniques. If these models are not removed prior to model selection, the final model-free results could be far from the truth. By implementing a series of empirical rules involving inequalities, these models can be specifically isolated and removed. Model-free analysis should therefore consist of three distinct steps: model-free minimisation, model-free model elimination, and finally model-free model selection. Failure has also been identified to affect the individual Monte Carlo simulations used within error analysis. Each simulation involves an independent randomised relaxation data set and model-free minimisation, so the simulations suffer from exactly the same types of failure as model-free models. Therefore, to prevent these outliers from causing a significant overestimation of the errors, the failed Monte Carlo simulations need to be culled prior to calculating the parameter standard deviations.
Vesselinov, V. V.; Alexandrov, B.
2014-12-01
The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. Source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS), coupled with the k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, or the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem, where several microphones record the sounds in a ballroom (music, conversations, noise, etc.). Each microphone records a mixture of the sounds, and the goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similar to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site, where the identified sources are barometric-pressure and water-supply pumping effects, and we estimate their impacts. We also estimate the…
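The NMF-based unmixing idea described above can be sketched with a toy example. This is a simplification, not the authors' NMFk code or the LANL dataset: the two "source" transients, the random mixing matrix, and the use of scikit-learn's NMF are all invented stand-ins (the clustering step over repeated NMF runs is omitted).

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
# two hypothetical non-negative source transients (e.g. a pressure pulse
# and a periodic pumping signal)
S = np.vstack([np.exp(-0.5 * (t - 3) ** 2),
               0.5 * (1 + np.sin(2 * np.pi * t / 5))])
A = rng.uniform(0.2, 1.0, size=(6, 2))   # unknown mixing at m=6 points
V = A @ S                                 # observed mixed signals, m > r=2

model = NMF(n_components=2, init="random", random_state=0, max_iter=2000)
W = model.fit_transform(V)                # estimated mixing matrix
H = model.components_                     # estimated source transients

# match each true source to its best-correlated recovered component
corr = np.abs(np.corrcoef(np.vstack([S, H]))[:2, 2:])
```

Because the data are non-negative and the sources have partially disjoint support, the factorization recovers transients that correlate strongly with the true sources, up to scaling and permutation.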
Dopamine enhances model-based over model-free choice behavior.
Wunderlich, K; Smittenaar, P.; Dolan, R J
2012-01-01
Decision making is often considered to arise out of contributions from a model-free habitual system and a model-based goal-directed system. Here, we investigated the effect of a dopamine manipulation on the degree to which either system contributes to instrumental behavior in a two-stage Markov decision task, which has been shown to discriminate model-free from model-based control. We found increased dopamine levels promote model-based over model-free choice.
Simon Benjaminsson
2010-08-01
Non-parametric data-driven analysis techniques can be used to study datasets with few assumptions about the data and underlying experiment. Variations of Independent Component Analysis (ICA) have been the methods most used on fMRI data, e.g. in finding resting-state networks thought to reflect the connectivity of the brain. Here we present a novel data analysis technique and demonstrate it on resting-state fMRI data. It is a generic method with few underlying assumptions about the data. The results are built from the statistical relations between all input voxels, resulting in a whole-brain analysis at the voxel level. It has good scalability properties, and the parallel implementation is capable of handling large datasets and databases. From the mutual information between the activities of the voxels over time, a distance matrix is created for all voxels in the input space. Multidimensional scaling is used to place the voxels in a lower-dimensional space reflecting the dependency relations based on the distance matrix. By performing clustering in this space we can find the strong statistical regularities in the data, which for the resting-state data turn out to be the resting-state networks. The decomposition is performed in the last step of the algorithm and is computationally simple. This opens the way for rapid analysis and visualization of the data at different spatial levels, as well as automatically finding a suitable number of decomposition components.
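The pipeline above (mutual information → distance matrix → multidimensional scaling → clustering) can be sketched on small synthetic time courses. Everything here is an invented stand-in: two "networks" of noisy voxels sharing a driving signal, quartile discretization for the mutual-information estimate, and scikit-learn's MDS/KMeans in place of the authors' parallel implementation.

```python
import numpy as np
from sklearn.metrics import normalized_mutual_info_score
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
T = 400
# two hypothetical "networks": voxels within a group share a driving signal
drivers = rng.normal(size=(2, T))
X = np.vstack([drivers[g] + 0.4 * rng.normal(size=(10, T))
               for g in (0, 1)])                  # 20 voxels x T time points

# discretize each time course into quartile bins, then build a
# mutual-information distance matrix (1 - normalized MI)
bins = np.array([np.digitize(x, np.quantile(x, [0.25, 0.5, 0.75])) for x in X])
n = len(bins)
D = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        D[i, j] = 1.0 - normalized_mutual_info_score(bins[i], bins[j])

# embed the precomputed distances with MDS, then cluster in that space
emb = MDS(n_components=2, dissimilarity="precomputed",
          random_state=0).fit_transform(D)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(emb)
```

Voxels driven by the same signal have high mutual information (small distance), so they land close together in the embedding and are grouped into the same cluster.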
Yan, Qi-Long; Zeman, Svatopluk; Sánchez Jiménez, P.E.; Zhao, Feng-Qi; Pérez-Maqueda, L.A.; Málek, Jiří
2014-04-01
Highlights: • The nonisothermal decomposition kinetics of RDX and its PBXs have been investigated. • The kinetic models are determined by both master-plot and combined kinetic analysis methods. • Constant-rate temperature profiles and isothermal curves are predicted from the obtained kinetic triplets. • Storage safety parameters are simulated based on thermal explosion theory. - Abstract: In this paper, the decomposition reaction models and thermal hazard properties of 1,3,5-trinitro-1,3,5-triazinane (RDX) and its PBXs bonded by Formex P1, Semtex 1A, C4, Viton A and Fluorel polymer matrices have been investigated using isoconversional and combined kinetic analysis methods. The established kinetic triplets are used to predict the constant-decomposition-rate temperature profiles, the critical radius for thermal explosion and the isothermal behavior at 82 °C. The effect of the polymer matrices on the decomposition mechanism of RDX is significant, resulting in very different reaction models. Formex P1, Semtex and C4 make the decomposition of RDX follow a phase-boundary-controlled reaction mechanism, whereas Viton A and Fluorel shift its reaction model to a two-dimensional Avrami–Erofeev nucleation-and-growth model. According to isothermal simulations, the threshold cook-off time until loss of functionality at 82 °C is less than 500 days for RDX-C4 and RDX-FM, while it is more than 700 days for the others. When the charge properties and heat of decomposition are also considered, unlike in the simulated isothermal curves, RDX-FM and RDX-C4 are better than RDX-SE in storage safety at arbitrary surrounding temperatures.
Application of model-free kinetics to the study of dehydration of fly ash-based zeolite
In the present paper, the dehydration kinetics of zeolite Na-A synthesized from fly ash was investigated by means of thermogravimetric analysis. Na-A zeolite was formed from coal fly ash by fusion with sodium hydroxide and subsequent hydrothermal treatment at 100 °C after an induction period. The model-free kinetic method was applied to calculate the activation energy of the dehydration process of the fly ash-based zeolite as a function of conversion and temperature. The Vyazovkin model-free kinetic method also enabled the determination of the time necessary to remove water molecules from the zeolite structure at a given temperature.
The use of model selection in the model-free analysis of protein dynamics
Model-free analysis of NMR relaxation data, which is widely used for the study of protein dynamics, consists of the separation of the global rotational diffusion from internal motions relative to the diffusion frame and the description of these internal motions by amplitude and timescale. Five model-free models exist, each of which describes a different type of motion. Model-free analysis requires the selection of the model which best describes the dynamics of the NH bond. It will be demonstrated that the model selection technique currently used has two significant flaws: under-fitting, and failing to select a model when one ought to be selected. Under-fitting breaks the principle of parsimony, causing bias in the final model-free results, visible as an overestimation of S2 and an underestimation of τe and Rex. As a consequence the protein falsely appears to be more rigid than it actually is. Model selection has been extensively developed in other fields. The techniques known as Akaike's Information Criteria (AIC), small sample size corrected AIC (AICc), Bayesian Information Criteria (BIC), bootstrap methods, and cross-validation will be compared to the currently used technique. To analyse the variety of techniques, synthetic noisy data covering all model-free motions were created. The data consist of two types of three-dimensional grid: the Rex grids covering single motions with chemical exchange {S2,τe,Rex}, and the Double Motion grids covering two internal motions {Sf2,Ss2,τs}. The conclusion of the comparison is that for accurate model-free results, AIC model selection is essential. As the method neither under- nor over-fits, AIC is the best tool for applying Occam's razor and has the additional benefits of simplifying and speeding up model-free analysis.
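The AIC-based selection step advocated above can be sketched on a toy χ²-fitting problem. The two nested linear models and the noise level are invented stand-ins for the model-free models; only the selection rule (AIC = χ² + 2k for k fitted parameters, pick the minimum) is the point.

```python
import numpy as np

def chi2(y, y_fit, sigma):
    """Chi-squared statistic for data with known measurement error sigma."""
    return float(np.sum(((y - y_fit) / sigma) ** 2))

def aic(chi2_val, k):
    # AIC variant used with chi-squared fitting: chi2 + 2 * (number of params)
    return chi2_val + 2 * k

def select_model(x, y, sigma):
    """Fit 'slope only' vs 'slope + intercept' and pick the lower AIC."""
    # model 1: y = a*x (k=1), closed-form least squares for a
    a1 = np.sum(x * y) / np.sum(x * x)
    aic1 = aic(chi2(y, a1 * x, sigma), k=1)
    # model 2: y = a*x + b (k=2)
    a2, b2 = np.polyfit(x, y, 1)
    aic2 = aic(chi2(y, a2 * x + b2, sigma), k=2)
    return (1, aic1, aic2) if aic1 <= aic2 else (2, aic1, aic2)
```

The 2k penalty is what prevents over-fitting: the richer model is only chosen when it lowers χ² by more than 2 per extra parameter.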
Pitch control of wind turbines using model free adaptive control based on wind turbine code
Zhang, Yunqian; Chen, Zhe; Cheng, Ming;
2011-01-01
As the wind turbine is a nonlinear high-order system, to achieve good pitch control performance, a model free adaptive control (MFAC) approach, which doesn't need a mathematical model of the wind turbine, is adopted in the pitch control system in this paper. A pseudo gradient vector whose estimation value is only based on I/O data of the wind turbine is identified, and then the wind turbine system is replaced by a dynamic linear time-varying model. In order to verify the correctness and robustness of the proposed model free adaptive pitch controller, the wind turbine code FAST, which can predict the…
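The pseudo-gradient mechanism described above can be illustrated with a compact-form MFAC sketch. The scalar plant, gains, and reset rule below are assumptions for illustration only; this is not the FAST wind-turbine model or the paper's controller.

```python
import numpy as np

def mfac_track(r=1.0, steps=300, eta=0.5, mu=1.0, rho=0.6, lam=1.0):
    """Compact-form MFAC sketch: the pseudo gradient phi is estimated from
    I/O data only, then used in the control update. The plant below is an
    invented toy nonlinear system, not a wind turbine."""
    y = np.zeros(steps + 1)
    u = np.zeros(steps + 1)
    phi = 1.0                                   # pseudo-gradient estimate
    for k in range(1, steps):
        du = u[k - 1] - u[k - 2] if k >= 2 else 0.0
        dy = y[k] - y[k - 1]
        # projection-type pseudo-gradient update (dynamic linearization)
        phi += eta * du / (mu + du * du) * (dy - phi * du)
        if phi < 1e-4:                          # reset keeps the estimate usable
            phi = 1.0
        # control law: move toward the setpoint r using only phi and I/O data
        u[k] = u[k - 1] + rho * phi / (lam + phi * phi) * (r - y[k])
        y[k + 1] = 0.7 * y[k] + u[k] + 0.2 * np.sin(u[k])   # "unknown" plant
    return y
```

The controller never sees the plant equation: it only uses the measured increments Δy and Δu, which is the defining property of the model-free adaptive scheme.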
Konovalov, Arkady; Krajbich, Ian
2016-01-01
Organisms appear to learn and make decisions using different strategies known as model-free and model-based learning; the former is mere reinforcement of previously rewarded actions and the latter is a forward-looking strategy that involves evaluation of action-state transition probabilities. Prior work has used neural data to argue that both model-based and model-free learners implement a value comparison process at trial onset, but model-based learners assign more weight to forward-looking computations. Here using eye-tracking, we report evidence for a different interpretation of prior results: model-based subjects make their choices prior to trial onset. In contrast, model-free subjects tend to ignore model-based aspects of the task and instead seem to treat the decision problem as a simple comparison process between two differentially valued items, consistent with previous work on sequential-sampling models of decision making. These findings illustrate a problem with assuming that experimental subjects make their decisions at the same prescribed time. PMID:27511383
Skatova, Anya; Chan, Patricia A.; Daw, Nathaniel D.
2013-09-01
Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, versus another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans’ scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes.
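The "balance between model-based and model-free strategies" in tasks like the one above is commonly formalized as a weighted mixture of the two systems' action values fed into a softmax. The sketch below shows only that mixing step, with an assumed weighting form in the style of Daw-type two-step analyses; the value arrays, weight, and inverse temperature are hypothetical.

```python
import numpy as np

def hybrid_choice_probs(q_mb, q_mf, w, beta=3.0):
    """Softmax choice probabilities over Q = w*Q_MB + (1-w)*Q_MF.

    w=1 gives a purely model-based chooser, w=0 a purely model-free one;
    beta is the softmax inverse temperature (assumed value)."""
    q = w * np.asarray(q_mb, float) + (1 - w) * np.asarray(q_mf, float)
    e = np.exp(beta * (q - q.max()))          # subtract max for stability
    return e / e.sum()
```

Fitting w per participant is what yields the individual "balance" measure that studies like this one correlate with traits such as extraversion.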
Deborah K Hill
Real-time detection of the rates of metabolic flux, or exchange rates of endogenous enzymatic reactions, is now feasible in biological systems using Dynamic Nuclear Polarization Magnetic Resonance. Derivation of reaction rate kinetics from this technique typically requires multi-compartmental modeling of dynamic data, and results are therefore model-dependent and prone to misinterpretation. We present a model-free formalism based on the ratio of total areas under the curve (AUC) of the injected and product metabolites, for example pyruvate and lactate. A theoretical framework to support this novel analysis approach is described, and demonstrates that the AUC ratio is proportional to the forward rate constant k. We show that the model-free approach strongly correlates with k for whole-cell in vitro experiments across a range of cancer cell lines, and detects response in cells treated with the pan-class I PI3K inhibitor GDC-0941 with comparable or greater sensitivity. The same result is seen in vivo with tumor xenograft-bearing mice, in control tumors and following drug treatment with dichloroacetate. An important finding is that the area under the curve is independent of both the input function and of any other metabolic pathways arising from the injected metabolite. This model-free approach provides a robust and clinically relevant alternative to kinetic model-based rate measurements in the clinical translation of hyperpolarized ¹³C metabolic imaging in humans, where measurement of the input function can be problematic.
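The proportionality between the AUC ratio and the forward rate constant k can be checked numerically on a simplified two-pool model. The shared decay rate ρ standing in for T1/flip-angle losses and all parameter values are assumptions, not the paper's model; for this simplified system the ratio is exactly k/ρ.

```python
import numpy as np

def auc_ratio(k, rho=0.05, p0=1.0, dt=0.01, t_end=400.0):
    """Two-pool exchange sketch: pyruvate P converts to lactate L at rate k,
    and both pools decay at rate rho (assumed simplification):
        dP/dt = -(k + rho) * P
        dL/dt =  k * P - rho * L
    Returns AUC(L)/AUC(P), which analytically equals k/rho here."""
    n = int(t_end / dt)
    P, L = p0, 0.0
    auc_p = auc_l = 0.0
    for _ in range(n):                  # forward-Euler integration
        dP = -(k + rho) * P
        dL = k * P - rho * L
        auc_p += P * dt
        auc_l += L * dt
        P += dP * dt
        L += dL * dt
    return auc_l / auc_p
```

Doubling k doubles the ratio, and the injected amount p0 cancels, which is the input-function independence the abstract highlights.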
Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems
Zhe Dong
2016-01-01
The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers a simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to its small size, MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the multi-modular operation scheme, the inherent safety feature of the MHTGRs can be applicable to large nuclear plants of any desired power rating. The MHTGR-based nuclear steam supply system (NSSS) is constituted by an MHTGR, a side-by-side arranged helical-coil once-through steam generator (OTSG) and some connecting pipes. Due to the side-by-side arrangement, there is a tight coupling effect between the MHTGR and OTSG. Moreover, parameter perturbations always exist in the NSSSs. Thus, it is meaningful to study the model-free coordinated control of MHTGR-based NSSSs for safe, stable, robust and efficient operation. In this paper, a new model-free coordinated control strategy that regulates the nuclear power, MHTGR outlet helium temperature and OTSG outlet overheated steam temperature by properly adjusting the control rod position, helium flowrate and feed-water flowrate is established for the MHTGR-based NSSSs. Sufficient conditions for global asymptotic closed-loop stability are given. Finally, numerical simulation results in the cases of large-range power decrease and increase illustrate the satisfactory performance of this newly-developed model-free coordinated NSSS control law.
Model-based and model-free “plug-and-play” building energy efficient control
Highlights: • “Plug-and-play” Building Optimization and Control (BOC) driven by building data. • Ability to handle the large-scale and complex nature of the BOC problem. • Adaptation to learn the optimal BOC policy when no building model is available. • Comparisons with rule-based and advanced BOC strategies. • Simulation and real-life experiments in a ten-office building. - Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need for qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a ten-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution.
Model-free prediction and regression a transformation-based approach to inference
Politis, Dimitris N
2015-01-01
The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...
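One simple instance of the model-free, frequentist prediction-interval idea described above can be sketched for an i.i.d. sample: resample the data and draw a "future" observation from each resample, then read off empirical quantiles. This is an illustrative simplification under an i.i.d. assumption, not the monograph's general Model-Free Bootstrap for regression or time series.

```python
import numpy as np

def model_free_pi(data, alpha=0.1, n_boot=2000, seed=0):
    """Bootstrap prediction interval for the next observation of an i.i.d.
    sample, with no parametric model assumed: each bootstrap resample
    yields one predictive draw, and quantiles of those draws form the
    (1 - alpha) interval."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, float)
    future = np.empty(n_boot)
    for b in range(n_boot):
        boot = rng.choice(data, size=len(data), replace=True)
        future[b] = rng.choice(boot)      # predictive draw from the resample
    return np.quantile(future, [alpha / 2, 1 - alpha / 2])
```

No normality is assumed anywhere: the interval adapts to whatever shape the empirical distribution has, which is the point of staying with observable quantities.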
Hayward, S.; Kitao, A.; Berendsen, H.J.C.
1997-01-01
Model-free methods are introduced to determine quantities pertaining to protein domain motions from normal mode analyses and molecular dynamics simulations. For the normal mode analysis, the methods are based on the assumption that in low frequency modes, domain motions can be well approximated by m…
A Model-free Approach to Fault Detection of Continuous-time Systems Based on Time Domain Data
Ping Zhang; Steven X. Ding
2007-01-01
In this paper, a model-free approach is presented to design an observer-based fault detection system for linear continuous-time systems based on input and output data in the time domain. The core of the approach is to directly identify the parameters of the observer-based residual generator from a numerically reliable data equation obtained by filtering and sampling the input and output signals.
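The idea of identifying a residual generator directly from I/O data can be illustrated with a discrete-time, noise-free analogue: find a parity vector in the left null space of a stacked fault-free data matrix via SVD, then use it to generate residuals on fresh data. The first-order plant, fault type, and window length are assumptions; the paper itself treats continuous-time systems with filtered signals, so this is a simplified stand-in.

```python
import numpy as np

def simulate(n, fault_at=None, bias=0.5, seed=0):
    """Toy SISO plant y(k) = 0.8*y(k-1) + 0.5*u(k-1); only I/O is recorded.
    An optional actuator bias fault starts at step fault_at."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=n)
    y = np.zeros(n)
    for k in range(1, n):
        ue = u[k - 1] + (bias if fault_at is not None and k >= fault_at else 0.0)
        y[k] = 0.8 * y[k - 1] + 0.5 * ue
    return u, y

# --- identification step: fault-free I/O data only, no plant model used ---
u, y = simulate(500)
Z = np.column_stack([y[1:], y[:-1], u[:-1]])   # rows: [y(k), y(k-1), u(k-1)]
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
v = Vt[-1]                                     # parity vector: left null direction

# --- residual generation on fresh data containing an actuator fault ---
u2, y2 = simulate(400, fault_at=200, seed=1)
Z2 = np.column_stack([y2[1:], y2[:-1], u2[:-1]])
r = Z2 @ v                                     # residual signal
```

On fault-free data the residual is zero up to numerical precision, because the data rows satisfy the plant's (unknown) linear recursion; the actuator bias violates that recursion and shows up directly in r.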
Landmark-based model-free 3D face shape reconstruction from video sequences
van Dam, Chris; Veldhuis, Raymond; Spreeuwers, Luuk; Broemme, A.; Busch, C.
2013-01-01
In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence potentially useful video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm b…
Model-free functional MRI analysis using improved fuzzy cluster analysis techniques
Lange, Oliver; Meyer-Baese, Anke; Wismueller, Axel; Hurdal, Monica; Sumners, DeWitt; Auer, Dorothee
2004-04-01
Conventional model-based or statistical analysis methods for functional MRI (fMRI) are easy to implement, and are effective in analyzing data with simple paradigms. However, they are not applicable in situations in which patterns of neural response are complicated and when the fMRI response is unknown. In this paper the Gath-Geva algorithm is adapted and rigorously studied for analyzing fMRI data. The algorithm supports spatial connectivity, aiding in the identification of activation sites in functional brain imaging. A comparison of this new method with the fuzzy n-means algorithm, Kohonen's self-organizing map, the fuzzy n-means algorithm with unsupervised initialization, the minimal free energy vector quantizer and the "neural gas" network is done in a systematic fMRI study with comparative quantitative evaluations. The most important findings in the paper are: (1) for a large number of codebook vectors, the Gath-Geva algorithm outperforms all other clustering methods in terms of detecting small activation areas, and (2) for a smaller number of codebook vectors, the fuzzy n-means algorithm with unsupervised initialization outperforms all other techniques. The applicability of the new algorithm is demonstrated on experimental data.
Salehi, Mehdi; Clemens, Frank; Graule, Thomas; Grobéty, Bernard
2012-01-01
Polymeric binder burnout during thermoplastic processing of yttria-stabilized zirconia (YSZ) ceramics was analyzed using thermogravimetric analysis (TGA). The debinding kinetics of the stearic acid/polystyrene binder have been described using model-free methods and compared with the decomposition rate of the pure polymers. The apparent activation energy Eα as a function of debinding progress α was calculated in two atmospheres (argon and air) by three different methods: Ozawa–Flynn–Wall (OFW…
田宜水; 王茹
2016-01-01
To investigate the thermal decomposition kinetics of typical biomass, determine the reaction mechanism, and obtain the kinetic rate parameters of the reaction, thermogravimetric analysis was used to study the pyrolysis characteristics of corn straw, wheat straw, cotton stalk, pine sawdust, peanut shell and sweet sorghum residue under a nitrogen atmosphere at different heating rates (5, 10, 20 and 30 °C/min, from room temperature to 600 °C, with initial sample masses of 3-4 mg). Activation energies were calculated by the Friedman and Flynn-Wall-Ozawa methods, the most probable mechanism function was determined by the Malek method, a thermokinetic model of the biomass was established, and the differences among the biomass types were discussed. The results show that biomass pyrolysis comprises three main stages: drying and preheating, devolatilization, and charring. The activation energy of typical biomass increases with conversion; in the devolatilization stage the pyrolysis activation energy lies between 144.61 and 167.34 kJ/mol. The reaction kinetics of all samples conform to the Avrami-Erofeev mechanism function, although the reaction orders differ to some extent, and the pre-exponential factors lie between 26.66 and 33.97 s⁻¹. These results provide a theoretical basis for optimizing the process conditions and scaling up the thermochemical conversion of biomass.
Kinetics of the Thermal Degradation of Granulated Scrap Tyres: a Model-free Analysis
Félix A. López
2013-12-01
Pyrolysis is a technology with a promising future in the recycling of scrap tyres. This paper determines the thermal decomposition behaviour and kinetics of granulated scrap tyres (GST) by examining the thermogravimetric/derivative thermogravimetric (TGA/DTG) data obtained during their pyrolysis in an inert atmosphere at different heating rates. The model-free methods of Friedman, Flynn-Wall-Ozawa and Coats-Redfern were used to determine the reaction kinetics from the DTG data. The apparent activation energy and pre-exponential factor for the degradation of GST were calculated. A comparison with the results obtained by other authors was made. DOI: http://dx.doi.org/10.5755/j01.ms.19.4.2947
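The isoconversional (model-free) logic used in several of the kinetics entries above can be demonstrated end-to-end on synthetic data: simulate conversion curves at several heating rates from an assumed first-order reaction, then recover the activation energy with the Ozawa-Flynn-Wall relation ln β ≈ const − 1.052·Ea/(R·Tα) at fixed conversion. The kinetic parameters (Ea = 150 kJ/mol, A = 10¹² s⁻¹) are invented, not values from any of the papers.

```python
import numpy as np

R = 8.314                          # gas constant, J/(mol*K)
Ea_true, A = 150e3, 1e12           # assumed synthetic first-order kinetics
dT = 0.05
T = np.arange(300.0, 1000.0, dT)   # temperature grid, K
betas = np.array([5.0, 10.0, 20.0]) / 60.0   # heating rates, K/s

T_alpha = []
for beta in betas:
    # alpha(T) for first-order kinetics under linear heating:
    # dalpha/dT = (A/beta) * (1 - alpha) * exp(-Ea/(R*T))
    integ = np.cumsum(np.exp(-Ea_true / (R * T))) * dT   # temperature integral
    alpha = 1.0 - np.exp(-(A / beta) * integ)
    T_alpha.append(np.interp(0.5, alpha, T))             # T at alpha = 0.5

# Ozawa-Flynn-Wall: slope of ln(beta) vs 1/T_alpha gives -1.052*Ea/R
slope = np.polyfit(1.0 / np.array(T_alpha), np.log(betas), 1)[0]
Ea_est = -slope * R / 1.052
```

The recovered Ea agrees with the assumed value to within the few-percent error of the Doyle approximation underlying OFW, and no reaction model f(α) was needed, which is what "model-free" means in this context.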
Model-free analysis of quadruply imaged gravitationally lensed systems and substructured galaxies
Woldesenbet, Addishiwot Girma
2015-01-01
Multiple-image gravitational lens systems, and especially quads, are invaluable in determining the amount and distribution of mass in galaxies. This is usually done by mass modeling using parametric or free-form methods. An alternative way of extracting information about the lens mass distribution is to use lensing degeneracies and invariants. Where applicable, they allow one to draw conclusions about whole classes of lenses without model fitting. Here, we use approximate, but observationally useful, invariants formed by the three relative polar angles of quad images around the lens center to show that many smooth elliptical+shear lenses can reproduce the same set of quad image angles within observational error. This result allows us to show in a model-free way what the general class of smooth elliptical+shear lenses looks like in the three-dimensional (3D) space of image relative angles, and that this distribution does not match that of the observed quads. We conclude that, even though smooth elliptical+shear lens...
Ricardo Pérez-Alcocer
2016-01-01
This paper presents a vision-based navigation system for an autonomous underwater vehicle in semistructured environments with poor visibility. In terrestrial and aerial applications, the use of visual systems mounted on robotic platforms as control sensor feedback is commonplace. However, robotic vision-based tasks for underwater applications are still not widely considered, as the images captured in this type of environment tend to be blurred and/or color-depleted. To tackle this problem, we have adapted the lαβ color space to identify features of interest in underwater images even in extreme visibility conditions. To guarantee the stability of the vehicle at all times, a model-free robust control is used. We have validated the performance of our visual navigation system in real environments, showing the feasibility of our approach.
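The lαβ decorrelated color space the authors adapt is, in its common form (the Ruderman et al. transform, with the matrices popularized by Reinhard et al.'s color-transfer work), a log-LMS rotation. A minimal sketch using those standard matrices, not any project-specific variant:

```python
import math

def rgb_to_lalphabeta(r, g, b):
    """RGB -> l-alpha-beta (Ruderman et al.): RGB to LMS cone space,
    logarithmic compression, then a rotation that decorrelates the
    channels (l: achromatic; alpha: yellow-blue; beta: red-green)."""
    L = 0.3811 * r + 0.5783 * g + 0.0402 * b
    M = 0.1967 * r + 0.7244 * g + 0.0782 * b
    S = 0.0241 * r + 0.1288 * g + 0.8444 * b
    # log compression; clamp to avoid log of zero
    L, M, S = (math.log10(max(v, 1e-6)) for v in (L, M, S))
    l = (L + M + S) / math.sqrt(3.0)
    alpha = (L + M - 2.0 * S) / math.sqrt(6.0)
    beta = (L - M) / math.sqrt(2.0)
    return l, alpha, beta

# A gray pixel should carry (almost) no chromatic information
l, alpha, beta = rgb_to_lalphabeta(0.5, 0.5, 0.5)
```

Feature detection in turbid water then operates on the decorrelated chromatic channels rather than on raw RGB.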
Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems
Zhe Dong
2016-01-01
The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers a simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to their small size, MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the mult...
Twellmann, Thorsten; Meyer-Baese, Anke; Lange, Oliver; Foo, Simon; Nattkemper, Tim W
2008-03-01
Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) has become an important tool in breast cancer diagnosis, but evaluation of multitemporal 3D image data holds new challenges for human observers. To aid the image analysis process, we apply supervised and unsupervised pattern recognition techniques for computing enhanced visualizations of suspicious lesions in breast MRI data. These techniques represent an important component of future sophisticated computer-aided diagnosis (CAD) systems and support the visual exploration of spatial and temporal features of DCE-MRI data stemming from patients with confirmed lesion diagnosis. By taking into account the heterogeneity of cancerous tissue, these techniques reveal signals with malignant, benign and normal kinetics. They also provide a regional subclassification of pathological breast tissue, which is the basis for pseudo-color presentations of the image data. Intelligent medical systems are expected to have substantial implications in healthcare policy by contributing to the diagnosis of indeterminate breast lesions by non-invasive imaging. PMID:19255616
Fu, Jie; Li, Peidong; Wang, Yuan; Liao, Guanyao; Yu, Miao
2016-03-01
This paper addresses the problem of micro-vibration control of a precision vibration isolation system with a magnetorheological elastomer (MRE) isolator and fuzzy control strategy. Firstly, a polyurethane matrix MRE isolator working in the shear-compression mixed mode is introduced. The dynamic characteristic is experimentally tested, and the range of the frequency shift and the model parameters of the MRE isolator are obtained from experimental results. Secondly, a new semi-active control law is proposed, which uses isolation structure displacement and relative displacement between the isolation structure and base as the inputs. Considering the nonlinearity of the MRE isolator and the excitation uncertainty of an isolation system, the designed semi-active fuzzy logic controller (FLC) is independent of a system model and is robust. Finally, the numerical simulations and experiments are conducted to evaluate the performance of the FLC with single-frequency and multiple-frequency excitation, respectively, and the experimental results show that the acceleration transmissibility is reduced by 54.04% at most, which verifies the effectiveness of the designed semi-active FLC. Moreover, the advantages of the approach are demonstrated in comparison to the passive control and ON-OFF control.
Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian
2016-01-01
Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model, e.g. a Gaussian or other bell-shaped curve, to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus
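One model-free tuning-curve feature of the kind described, the preferred direction, can be read off the data as a response-weighted circular mean instead of the peak of a fitted curve. This is a generic sketch with invented firing rates, not the authors' exact estimator:

```python
import math

def preferred_direction(angles_deg, responses):
    """Model-free preferred-direction estimate: the circular (vector)
    mean of the stimulus angles, weighted by the measured firing rates,
    with no Gaussian/von Mises fit involved."""
    x = sum(r * math.cos(math.radians(a)) for a, r in zip(angles_deg, responses))
    y = sum(r * math.sin(math.radians(a)) for a, r in zip(angles_deg, responses))
    return math.degrees(math.atan2(y, x)) % 360.0

# Noisy tuning curve peaking near 90 degrees (invented rates)
angles = [0, 45, 90, 135, 180, 225, 270, 315]
rates = [2.0, 8.0, 20.0, 9.0, 2.5, 1.0, 0.5, 1.0]
pref = preferred_direction(angles, rates)
```

Unlike a fit, this estimate is defined for any response profile, including the irregular tuning-curve shapes the abstract mentions.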
Gill, Michelle L; Byrd, R Andrew; Palmer III, Arthur G
2016-02-17
Intrinsically disordered proteins (IDPs) and proteins with intrinsically disordered regions (IDRs) are known to play important roles in regulatory and signaling pathways. A critical aspect of these functions is the ability of IDPs/IDRs to form highly specific complexes with target molecules. However, elucidation of the contributions of conformational dynamics to function has been limited by challenges associated with the structural heterogeneity of IDPs/IDRs. Using NMR spin relaxation parameters (15N R1, 15N R2, and {1H}-15N heteronuclear NOE) collected at four static magnetic fields ranging from 14.1 to 21.1 T, we have analyzed the backbone dynamics of the basic leucine-zipper (bZip) domain of the Saccharomyces cerevisiae transcription factor GCN4, whose DNA-binding domain is intrinsically disordered in the absence of DNA substrate. We demonstrate that the extended model-free analysis can be applied to proteins with IDRs such as apo GCN4 and that these results significantly extend previous NMR studies of GCN4 dynamics performed at a single static magnetic field of 11.74 T [Bracken, et al., J. Mol. Biol., 1999, 285, 2133-2146] and correlate well with molecular dynamics simulations [Robustelli, et al., J. Chem. Theory Comput., 2013, 9, 5190-5200]. In contrast to the earlier work, data at multiple static fields allow the time scales of internal dynamics of GCN4 to be reliably quantified. Large-amplitude dynamic fluctuations in the DNA-binding region have correlation times (τs ≈ 1.4-2.5 ns) consistent with a two-step mechanism in which partially ordered bZip conformations of GCN4 form initial encounter complexes with DNA and then rapidly rearrange to the high-affinity state with fully formed basic-region recognition helices. PMID:26661739
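The model-free formalism used here rests on the Lipari-Szabo spectral density, from which the relaxation rates are computed. A minimal sketch for the simple (non-extended) case with isotropic global tumbling; the parameter values are invented for illustration:

```python
import math

def j_model_free(omega, s2, tau_m, tau_e):
    """Lipari-Szabo model-free spectral density:
    J(w) = (2/5) * [ S2*tau_m/(1+(w*tau_m)^2) + (1-S2)*tau/(1+(w*tau)^2) ]
    with 1/tau = 1/tau_m + 1/tau_e.  S2 is the generalized order
    parameter, tau_m the global and tau_e the internal correlation time."""
    tau = 1.0 / (1.0 / tau_m + 1.0 / tau_e)
    return 0.4 * (s2 * tau_m / (1.0 + (omega * tau_m) ** 2)
                  + (1.0 - s2) * tau / (1.0 + (omega * tau) ** 2))

tau_m = 8e-9                    # s, a plausible global correlation time
omega_n = 2 * math.pi * 60.8e6  # 15N angular frequency at 14.1 T, rad/s
# With S2 = 1 (fully restricted internal motion), J reduces to pure tumbling
j_rigid = j_model_free(omega_n, 1.0, tau_m, 50e-12)
```

Collecting R1, R2 and NOE at several fields samples J(ω) at more frequencies, which is what lets the internal time scales be quantified reliably.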
Re-evaluation of the model-free analysis of fast internal motion in proteins using NMR relaxation.
Frederick, Kendra King; Sharp, Kim A; Warischalk, Nicholas; Wand, A Joshua
2008-09-25
NMR spin relaxation retains a central role in the characterization of the fast internal motion of proteins and their complexes. Knowledge of the distribution and amplitude of the motion of amino acid side chains is critical for the interpretation of the dynamical proxy for the residual conformational entropy of proteins, which can contribute significantly to the entropy of protein function. A popular treatment of NMR relaxation phenomena in macromolecules dissolved in liquids is the so-called model-free approach of Lipari and Szabo. The robustness of the model-free approach has recently been strongly criticized, and the remarkable range and structural context of the internal motion of proteins characterized by such NMR relaxation techniques attributed to artifacts arising from the model-free treatment, particularly with respect to the symmetry of the underlying motion. We develop an objective quantification of both the spatial and temporal asymmetry of motion and re-examine the foundation of the model-free treatment. Concerns regarding the robustness of the model-free approach to asymmetric motion appear to be generally unwarranted. The generalized order parameter is robustly recovered. The sensitivity of the model-free treatment to asymmetric motion is restricted to the effective correlation time, which is by definition a normalized quantity and not a true time constant, and therefore of much less interest in this context. With renewed confidence in the model-free approach, we then examine the microscopic distribution of side-chain motion in the complex between calcium-saturated calmodulin and the calmodulin-binding domain of endothelial nitric oxide synthase. Deuterium relaxation is used to characterize the motion of methyl groups in the complex. A remarkable range of Lipari-Szabo model-free generalized order parameters is seen, with little correlation with basic structural parameters such as depth of burial. These results are contrasted with the
Renaudo, Erwan; Girard, Benoît; Chatila, Raja; Khamassi, Mehdi
2015-01-01
Combining model-based and model-free reinforcement learning systems in robotic cognitive architectures appears to be a promising direction for endowing artificial agents with flexibility and decisional autonomy close to that of mammals. In particular, it could enable robots to build an internal model of the environment, plan within it in response to detected environmental changes, and avoid the cost and time of planning when the stability of the environment is recognized as enablin...
In general, analysis of quasi-elastic neutron scattering (QENS) spectra needs some mathematical model in its process, and hence the obtained result is model dependent. Model-dependent analysis may lead to misunderstandings caused by inappropriate initial models, or may miss an unexpected relaxation phenomenon. We have developed an analysis method for processing QENS data without a specific model, which we call mode-distribution analysis. In this method, we suppose that all modes can be described as combinations of relaxations that obey the exponential law. With this method, we can obtain a distribution function B(Q,Γ), which we call the mode-distribution function, representing the number of relaxation modes and the distributions of the relaxation times in the modes. We report the first application to experimental data of liquid water. In addition to the two known modes, the existence of a relaxation mode of water molecules with an intermediate time scale has been discovered. (author)
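The ansatz of mode-distribution analysis, every mode an exponential relaxation, can be written as a forward model in a few lines; the actual method inverts this relation to recover B(Q,Γ) from measured spectra. The weights and rates below are invented:

```python
import math

def intermediate_scattering(t, modes):
    """Forward model underlying mode-distribution analysis:
    I(Q,t) = sum_i B_i * exp(-Gamma_i * t), where each (B_i, Gamma_i)
    pair is one relaxation mode at this Q (Gamma in 1/ps, t in ps)."""
    return sum(B * math.exp(-G * t) for B, G in modes)

# A fast and a slow mode, loosely mimicking the two known water relaxations
modes = [(0.7, 0.5), (0.3, 0.02)]
```

In the analysis direction, B(Q,Γ) is estimated as the (regularized) distribution of such weights over a grid of rates Γ that best reproduces the measured data.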
Connectivity concordance mapping: a new tool for model-free analysis of fMRI data of the human brain
Gabriele Lohmann
2012-03-01
Functional magnetic resonance data acquired in a task-absent condition ("resting state") require new data analysis techniques that do not depend on an activation model. Here, we propose a new analysis method called Connectivity Concordance Mapping (CCM). The main idea is to assign a label to each voxel based on the reproducibility of its whole-brain pattern of connectivity. Specifically, we compute the correlations across measurements of each voxel's correlation-based functional connectivity map, resulting in a voxelwise map of concordance values. Regions of high interscan concordance can be assumed to be functionally consistent, and may thus be of specific interest for further analysis. Here we present two fMRI studies to test the algorithm. The first is an eyes-open/eyes-closed paradigm designed to highlight the potential of the method in a relatively simple state-dependent domain. The second study is a longitudinal repeated measurement of a patient following stroke. Longitudinal clinical studies such as this may represent the most interesting domain of application for this algorithm, as it provides an exploratory means of identifying changes in connectivity, such as those during post-stroke recovery.
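The CCM recipe as the abstract describes it (correlate each voxel's connectivity map across scans) can be sketched on toy data. A real implementation would operate on 4D volumes with numpy; the toy time series below are ours:

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def concordance_map(sessions):
    """For each voxel: build its whole-brain connectivity map (the
    correlations with every other voxel) in each session, then correlate
    those maps across the two sessions -> one concordance value per voxel."""
    n_vox = len(sessions[0])
    conc = []
    for v in range(n_vox):
        maps = [[pearson(sess[v], sess[w]) for w in range(n_vox) if w != v]
                for sess in sessions]
        conc.append(pearson(maps[0], maps[1]))
    return conc

def make_session(shift):  # 4 toy voxels; voxels 0 and 1 are coupled
    return [[math.sin(0.3 * t + shift) for t in range(40)],
            [math.sin(0.3 * t + shift + 0.2) for t in range(40)],
            [math.sin(1.1 * t + shift) for t in range(40)],
            [math.cos(1.9 * t + shift) for t in range(40)]]

conc = concordance_map([make_session(0.0), make_session(0.05)])
```

Voxels whose connectivity structure is preserved across the two sessions receive concordance values near 1.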
Model-free distributed learning
Dembo, Amir; Kailath, Thomas
1990-01-01
Model-free learning for synchronous and asynchronous quasi-static networks is presented. The network weights are continuously perturbed, while the time-varying performance index is measured and correlated with the perturbation signals; the correlation output determines the changes in the weights. The perturbation may be either via noise sources or orthogonal signals. The invariance to detailed network structure mitigates large variability between supposedly identical networks as well as implementation defects. This local, regular, and completely distributed mechanism requires no central control and involves only a few global signals. Thus it allows for integrated on-chip learning in large analog and optical networks.
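The mechanism described, perturb the weights, measure the performance index, correlate the change with the perturbation, can be sketched as follows; the quadratic index and all constants are our own, and a real network would inject the perturbations in hardware:

```python
import random

def perturbation_step(w, loss, sigma=1e-3, lr=0.05):
    """One step of model-free weight-perturbation learning: inject small
    random perturbations, measure the change of the performance index,
    and correlate that change with each perturbation signal to update
    the corresponding weight.  No gradient of the network model is used."""
    base = loss(w)
    delta = [random.gauss(0.0, sigma) for _ in w]
    change = loss([wi + di for wi, di in zip(w, delta)]) - base
    corr = change / sigma ** 2          # the "correlation output"
    return [wi - lr * corr * di for wi, di in zip(w, delta)]

random.seed(1)
index = lambda w: (w[0] - 2.0) ** 2 + (w[1] + 1.0) ** 2  # toy index
w = [0.0, 0.0]
for _ in range(2000):
    w = perturbation_step(w, index)
```

In expectation the correlated update equals gradient descent, which is why the scheme is invariant to the detailed network structure.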
Can model-free reinforcement learning explain deontological moral judgments?
Ayars, Alisabeth
2016-05-01
Dual-systems frameworks propose that moral judgments are derived from both an immediate emotional response and controlled/rational cognition. Recently Cushman (2013) proposed a new dual-system theory based on model-free and model-based reinforcement learning. Model-free learning attaches values to actions based on their history of reward and punishment, and explains some deontological, non-utilitarian judgments. Model-based learning involves the construction of a causal model of the world and allows for far-sighted planning; this form of learning fits well with utilitarian considerations that seek to maximize certain kinds of outcomes. I present three concerns regarding the use of model-free reinforcement learning to explain deontological moral judgment. First, many actions that humans find aversive from model-free learning are not judged to be morally wrong. Moral judgment must require something in addition to model-free learning. Second, there is a dearth of evidence for central predictions of the reinforcement account, e.g. that people with different reinforcement histories will, all else equal, make different moral judgments. Finally, to account for the effect of intention within the framework requires certain assumptions which lack support. These challenges are reasonable foci for future empirical/theoretical work on the model-free/model-based framework. PMID:26918742
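The model-free learner this account appeals to reduces to a cached-value update; a hypothetical sketch (action names, learning rate and rewards are ours, not from the paper):

```python
def update_action_value(q_values, action, reward, alpha=0.1):
    """Model-free value update: the action's cached value moves toward
    the reward just received; no causal model of outcomes is consulted."""
    q_values[action] += alpha * (reward - q_values[action])
    return q_values

# An action punished repeatedly acquires a negative cached value,
# regardless of whether a causal model would judge its outcome bad.
q = {"harm": 0.0, "help": 0.0}
for _ in range(100):
    update_action_value(q, "harm", -1.0)
    update_action_value(q, "help", +1.0)
```

The paper's second concern maps directly onto this sketch: agents with different reward histories should end up with different cached values, a prediction said to lack evidence.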
Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)
2014-05-15
To investigate correlations between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and to evaluate correlations between perfusion parameters and histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analyzing the time-signal intensity curve, without a complicated acquisition process for the model-based parameters. (orig.)
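The model-free parameters mentioned, such as time to peak and washout slope, come straight from the time-signal intensity curve. A minimal sketch with an invented curve; the paper's exact definitions may differ:

```python
def curve_features(times_min, signal):
    """Model-free DCE features from a time-signal intensity curve:
    time to peak (TTP), initial enhancement slope, and washout slope."""
    peak = max(range(len(signal)), key=lambda i: signal[i])
    ttp = times_min[peak]
    initial = (signal[peak] - signal[0]) / (times_min[peak] - times_min[0])
    washout = (signal[-1] - signal[peak]) / (times_min[-1] - times_min[peak])
    return ttp, initial, washout

# Toy curve: rapid initial enhancement, then a delayed washout pattern
t = [0, 1, 2, 3, 4, 5, 6]                  # minutes
s = [100, 180, 240, 230, 215, 205, 200]    # signal, arbitrary units
ttp, initial, washout = curve_features(t, s)  # -> (2, 70.0, -10.0)
```

No pharmacokinetic model or arterial input function is required, which is the practical point the abstract makes.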
Internal motions at specific locations throughout yeast phenylalanine tRNA were measured by using nucleic acid biosynthetically enriched in 13C at modified-base methyl groups. Carbon NMR spectra of isotopically enriched tRNA(Phe) reveal 12 individual peaks for 13 of the 14 methyl groups known to be present. The two methyls of N2,N2-dimethylguanosine (m2,2G-26) have indistinguishable resonances, whereas the fourteenth methyl, bound to ring carbon-11 of the hypermodified nucleoside 3' adjacent to the anticodon, wyosine (Y-37), does not come from the [methyl-13C]methionine substrate. Assignments to individual nucleosides within the tRNA were made on the basis of chemical shifts of the mononucleosides and correlation of 13C resonances with proton NMR chemical shifts via two-dimensional heteronuclear proton-carbon correlation spectroscopy. Values of 13C longitudinal relaxation (T1) and the nuclear Overhauser enhancements (NOE) were determined at 22.5, 75.5, and 118 MHz for tRNA(Phe) in a physiological buffer solution with 10 mM MgCl2, at 22°C. These data were used to extract two physical parameters that define the system with regard to fast internal motion: the generalized order parameters (S2) and effective correlation times (τe) for internal motion of the C-H internuclear vectors. For all methyl groups the generalized order parameter varied from 0.057 to 0.108, compared with the value of 0.111 predicted for a rapidly spinning methyl group rigidly mounted on a spherical macromolecule. Values of τe ranged from 4 to 16 ps, generally shorter times than measured in other work for amino acid methyl groups in several proteins. Somewhat surprising was the finding that the two methyl esters terminating the Y-37 side chain have order parameters similar to those of other methyls in tRNA and only 25% less than that for a methyl directly bonded to the base
Model-free learning from demonstration
Billing, Erik; Hellström, Thomas; Janlert, Lars Erik
2010-01-01
A novel robot learning algorithm called Predictive Sequence Learning (PSL) is presented and evaluated. PSL is a model-free prediction algorithm inspired by the dynamic temporal difference algorithm S-Learning. While S-Learning has previously been applied as a reinforcement learning algorithm for robots, PSL is here applied to a Learning from Demonstration problem. The proposed algorithm is evaluated on four tasks using a Khepera II robot. PSL builds a model from demonstrated data which is use...
Model-Free Adaptive Control Algorithm with Data Dropout Compensation
Xuhui Bu
2012-01-01
The convergence of the model-free adaptive control (MFAC) algorithm can be guaranteed when the system is subject to measurement data dropout, but the output convergence speed gets slower as the dropout rate increases. This paper proposes an MFAC algorithm with data compensation. The missing data are first estimated using the dynamical linearization method, and then the estimated values are introduced to update the control input. A convergence analysis of the proposed MFAC algorithm is given, and its effectiveness is also validated by simulations. It is shown that the proposed algorithm can compensate for the effect of the data dropout, and better output performance can be obtained.
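A compact-form MFAC loop with this style of dropout compensation can be sketched on a toy first-order plant; the plant, gains and dropout pattern below are our own choices, not the paper's:

```python
def run_mfac(drop_every=0, steps=200):
    """Compact-form MFAC: estimate the pseudo-partial-derivative phi
    from I/O data and control with u += rho*phi/(lam+phi^2)*(y* - y).
    When a measurement is dropped, the missing output is replaced by
    the dynamical-linearization estimate y_prev + phi*du, as proposed."""
    rho, lam, eta, mu, y_star = 0.8, 1.0, 0.5, 1.0, 5.0
    y_true, u, u_prev, phi, y_prev = 0.0, 0.0, 0.0, 1.0, 0.0
    for k in range(1, steps + 1):
        y_true = 0.6 * y_true + 0.4 * u           # unknown plant
        du = u - u_prev
        dropped = drop_every and k % drop_every == 0
        y_k = y_prev + phi * du if dropped else y_true  # compensation
        if abs(du) > 1e-8:                        # update phi estimate
            phi += eta * du / (mu + du ** 2) * (y_k - y_prev - phi * du)
        u_prev = u
        u += rho * phi / (lam + phi ** 2) * (y_star - y_k)
        y_prev = y_k
    return y_true

y_no_drop = run_mfac(0)       # every measurement arrives
y_with_drop = run_mfac(3)     # every third measurement is dropped
```

On a dropped step the estimated output carries no new information, so the phi update is effectively frozen; only the control input keeps advancing on the model estimate.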
A Survey on Applications of Model-Free Strategy Learning in Cognitive Wireless Networks
Wang, Wenbo; Kwasinski, Andres; Niyato, Dusit; Han, Zhu
2015-01-01
Model-free learning has been considered an efficient tool for designing control mechanisms when the model of the system environment or of the interaction between the decision-making entities is not available as a priori knowledge. With model-free learning, the decision-making entities adapt their behaviors based on the reinforcement from their interaction with the environment and are able to (implicitly) build an understanding of the system through trial-and-error mechanisms. Such characteri...
Radac, Mircea-Bogdan; Precup, Radu-Emil
2016-05-01
This paper presents the design and experimental validation of a new model-free data-driven iterative reference input tuning (IRIT) algorithm that solves a reference trajectory tracking problem as an optimization problem with control signal saturation constraints and control signal rate constraints. The IRIT algorithm design employs an experiment-based stochastic search algorithm to exploit the advantages of iterative learning control. The experimental results validate the IRIT algorithm as applied to a nonlinear aerodynamic position control system. The results prove that the IRIT algorithm offers significant control system performance improvement with few iterations and experiments conducted on the real-world process, together with model-free parameter tuning.
Highlights: • Model-free integral kinetics methods and analytical TGA–FTIR were applied to the pyrolysis of PKS. • The pyrolysis mechanism of PKS was elaborated. • Thermal stability was established: lignin > cellulose > xylan. • Detailed compositions of the volatiles from PKS pyrolysis were determined. • The interaction of the three biomass components led to the fluctuation of activation energy in PKS pyrolysis. - Abstract: Palm kernel shell (PKS) from palm oil production is a potential biomass source for bio-energy production. A fundamental understanding of PKS pyrolysis behavior and kinetics is essential to its efficient thermochemical conversion. The thermal degradation profile from derivative thermogravimetry (DTG) analysis showed two significant mass-loss peaks, mainly related to the decomposition of hemicellulose and cellulose respectively. This characteristic differentiates PKS from other biomass types (e.g. wheat straw and corn stover), which present just one peak or a peak accompanied by an extra "shoulder" peak (e.g. wheat straw). According to the Fourier transform infrared spectrometry (FTIR) analysis, the prominent volatile components generated by the pyrolysis of PKS were CO2 (2400–2250 cm−1 and 586–726 cm−1), aldehydes, ketones, organic acids (1900–1650 cm−1), and alkanes, phenols (1475–1000 cm−1). The activation energy as a function of conversion was estimated by two model-free integral methods, the Flynn–Wall–Ozawa (FWO) and the Kissinger–Akahira–Sunose (KAS) method, at different heating rates. The fluctuation of the activation energy can be interpreted as the result of interactive reactions, related to cellulose, hemicellulose and lignin degradation, occurring during pyrolysis. Based on the TGA–FTIR analysis and the model-free integral kinetics methods, the pyrolysis mechanism of PKS is elaborated in this paper
Model-free 3D face shape reconstruction from video sequences
Dam, van Chris; Veldhuis, Raymond; Spreeuwers, Luuk
2013-01-01
In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence much video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm based on 2D lan
Totally Model-Free Learned Skillful Coping
Dreyfus, Stuart E.
2004-01-01
The author proposes a neural-network-based explanation of how a brain might acquire intuitive expertise. The explanation is intended merely to be suggestive and lacks many complexities found in even lower animal brains. Yet significantly, even this simplified brain model is capable of explaining the acquisition of simple skills without developing…
Trajectory Based Traffic Analysis
Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin;
2013-01-01
We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most of...
A novel model-free approach for reconstruction of time-delayed gene regulatory networks
JIANG Wei; LI Xia; GUO Zheng; LI Chuanxing; WANG Lihong
2006-01-01
Reconstruction of genetic networks is one of the key scientific challenges in functional genomics. This paper describes a novel approach for addressing the regulatory dependencies between genes whose activities can be delayed by multiple units of time. The aim of the proposed approach, termed TdGRN (time-delayed gene regulatory networking), is to reverse-engineer the dynamic mechanisms of gene regulation, which is realized by identifying time-delayed gene regulations through supervised decision-tree analysis of the newly designed time-delayed gene expression matrix, derived from the original time-series microarray data. A permutation technique is used to determine the statistical classification threshold of a tree, from which a gene regulatory rule(s) is extracted. The proposed TdGRN is a model-free approach that attempts to learn the underlying regulatory rules without relying on any model assumptions. Compared with model-based approaches, it has several significant advantages: it requires neither an arbitrary threshold for discretization of gene transcriptional values nor the definition of the number of regulators (k). We have applied this novel method to the publicly available data for budding yeast cell cycling. The numerical results demonstrate that most of the identified time-delayed gene regulations are supported by current biological knowledge.
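The "time-delayed gene expression matrix" idea can be illustrated by pairing candidate-regulator expression at time t with the target gene's expression at t + d for each delay d; this is our simplified, hypothetical version of the construction, not the paper's exact design:

```python
import numpy as np

def time_delayed_matrix(expr, target_idx, max_delay):
    """Build a time-delayed design matrix: candidate-regulator expression at
    time t is paired with the target gene's expression at t + d for each
    delay d = 1..max_delay. A simplified, hypothetical version of the matrix;
    TdGRN then grows a decision tree on the discretized class labels."""
    rows, labels = [], []
    n_time = expr.shape[1]
    for d in range(1, max_delay + 1):
        for t in range(n_time - d):
            rows.append(np.append(expr[:, t], d))    # regulators at t, plus delay tag
            labels.append(expr[target_idx, t + d])   # target response at t + d
    return np.array(rows), np.array(labels)

expr = np.random.rand(5, 12)   # 5 genes x 12 time points, synthetic
X, y = time_delayed_matrix(expr, target_idx=0, max_delay=3)
print(X.shape)  # (11 + 10 + 9 rows, 5 genes + 1 delay column) = (30, 6)
```

A supervised decision tree trained on rows of `X` against discretized `y` would then expose which regulator/delay combinations predict the target.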
Model-free adaptive control of advanced power plants
Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang
2015-08-18
A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.
A Model-Free Method for Structural Change Detection in Multivariate Nonlinear Time Series
孙青华; 张世英; 梁雄健
2003-01-01
In this paper, we apply the recursive genetic programming (RGP) approach to the cognition of a system, and then proceed to the detection procedure for structural changes in the system whose components are of long memory. This approach is adaptive and model-free, and can simulate the individual activities of the system's participants; therefore, it has a strong ability to recognize the operating mechanism of the system. Based on the previous cognition about the system, a testing statistic is developed for the detection of structural changes in the system. Furthermore, an example is presented to illustrate the validity and practical value of the proposed approach.
Policy improvement by a model-free Dyna architecture.
Hwang, Kao-Shing; Lo, Chia-Yue
2013-05-01
The objective of this paper is to accelerate the process of policy improvement in reinforcement learning. The proposed Dyna-style system combines two learning schemes, one of which utilizes a temporal difference method for direct learning; the other uses relative values for indirect learning in planning between two successive direct learning cycles. Instead of establishing a complicated world model, the approach introduces a simple predictor of average rewards to the actor-critic architecture in the simulation (planning) mode. The relative value of a state, defined as the accumulated differences between immediate reward and average reward, is used to steer the improvement process in the right direction. The proposed learning scheme is applied to control a pendulum system for tracking a desired trajectory to demonstrate its adaptability and robustness. Through reinforcement signals from the environment, the system takes the appropriate action to drive an unknown dynamic system to track desired outputs in a few learning cycles. Comparisons are made between the proposed model-free method, a connectionist adaptive heuristic critic, and an advanced method of Dyna-Q learning in experiments of labyrinth exploration. The proposed method outperforms its counterparts in terms of elapsed time and convergence rate. PMID:24808427
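The "relative value" idea (accumulated differences between immediate and average reward) can be sketched with an average-reward TD update; the update form and constants below are our simplification for illustration, not the paper's exact equations:

```python
import random

def run_relative_value_updates(transitions, alpha=0.1, beta=0.05):
    """Average-reward TD sketch of the 'relative value' idea: a state's value
    accumulates differences between immediate reward and the running average
    reward. Update form and constants are our simplification, not the paper's."""
    V, rho = {}, 0.0          # relative state values, average-reward estimate
    for s, r, s_next in transitions:
        V.setdefault(s, 0.0)
        V.setdefault(s_next, 0.0)
        td = r - rho + V[s_next] - V[s]   # average-reward TD error
        V[s] += alpha * td
        rho += beta * (r - rho)           # track the average reward
    return V, rho

random.seed(0)
transitions = [(random.randint(0, 3), random.random(), random.randint(0, 3))
               for _ in range(500)]
V, rho = run_relative_value_updates(transitions)
print(round(rho, 2))  # close to the mean reward of ~0.5
```

States whose relative value exceeds their neighbors' then attract the planning updates between direct learning cycles.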
Elastic fixed window scans of incoherent neutron scattering are an established and frequently employed method to study dynamical changes, usually over a broad temperature range or during a process such as a conformational change in the sample. In particular, the apparent mean-squared displacement can be extracted via a model-free analysis based on a solid physical interpretation as an effective amplitude of molecular motions. Here, we provide a new account of elastic and inelastic fixed window scans, defining a generalized mean-squared displacement for all fixed energy transfers. We show that this generalized mean-squared displacement in principle contains all information on the real mean-squared displacement accessible in the instrumental time window. The derived formula provides a clear understanding of the effects of instrumental resolution on the apparent mean-squared displacement. Finally, we show that the generalized mean-squared displacement can be used as a model-free indicator of confinement effects within the instrumental time window. (authors)
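The model-free extraction of an apparent mean-squared displacement mentioned here is commonly done via the Gaussian approximation; a minimal sketch with synthetic data (the 1/3 convention is one common choice, and all numbers are our own):

```python
import numpy as np

def apparent_msd(q, elastic_intensity):
    """Apparent mean-squared displacement from an elastic fixed window scan
    via the Gaussian approximation  I_el(Q) ~ I0 * exp(-<u^2> Q^2 / 3).
    (Conventions differ by constant factors; the /3 form is one common choice.)"""
    slope, _ = np.polyfit(np.asarray(q) ** 2, np.log(elastic_intensity), 1)
    return -3.0 * slope

q = np.linspace(0.3, 1.8, 10)          # momentum transfers in 1/Angstrom
msd_true = 0.5                         # Angstrom^2, synthetic
intensity = np.exp(-msd_true * q ** 2 / 3.0)
print(apparent_msd(q, intensity))      # recovers ~0.5
```

The paper's generalized mean-squared displacement extends this elastic-only fit to every fixed energy transfer.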
The key to obtaining the model-free description of the dynamics of a macromolecule is the optimisation of the model-free and Brownian rotational diffusion parameters using the collected R1, R2 and steady-state NOE relaxation data. The problem of optimising the chi-squared value is often assumed to be trivial; however, the long chain of dependencies required for its calculation complicates the model-free chi-squared space. Convolutions are induced by the Lorentzian form of the spectral density functions, the linear recombinations of certain spectral density values to obtain the relaxation rates, the calculation of the NOE using the ratio of two of these rates, and finally the quadratic form of the chi-squared equation itself. Two major topological features of the model-free space complicate optimisation. The first is a long, shallow valley which commences at infinite correlation times and gradually approaches the minimum. The most severe convolution occurs for motions on two timescales in which the minimum is often located at the end of a long, deep, curved tunnel or multidimensional valley through the space. A large number of optimisation algorithms will be investigated and their performance compared to determine which techniques are suitable for use in model-free analysis. Local optimisation algorithms will be shown to be sufficient for minimisation not only within the model-free space but also for the minimisation of the Brownian rotational diffusion tensor. In addition, the performance of the programs Modelfree and Dasha is investigated. A number of model-free optimisation failures were identified: the inability to slide along the limits, the singular matrix failure of the Levenberg-Marquardt minimisation algorithm, the low precision of both programs, and a bug in Modelfree. Significantly, the singular matrix failure of the Levenberg-Marquardt algorithm occurs when internal correlation times are undefined and is greatly amplified in model-free analysis by both
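The chain from Lorentzian spectral densities to the quadratic chi-squared target can be written compactly; this sketch uses the single-timescale model-free spectral density and, for brevity, a chi-squared over spectral density values rather than the full R1/R2/NOE rates (the fixed tm and all numbers are our choices):

```python
import numpy as np

def j_lipari_szabo(omega, s2, tm, te):
    """Model-free spectral density: order parameter S^2, global correlation
    time tm, internal correlation time te (single-timescale form)."""
    t = 1.0 / (1.0 / tm + 1.0 / te)            # effective internal time
    return 0.4 * (s2 * tm / (1.0 + (omega * tm) ** 2)
                  + (1.0 - s2) * t / (1.0 + (omega * t) ** 2))

def chi2(params, omegas, j_exp, j_err):
    """The quadratic chi-squared target whose convoluted landscape the text
    describes; written here over spectral density values for brevity."""
    s2, te = params
    j_calc = j_lipari_szabo(omegas, s2, 1e-8, te)   # tm fixed at 10 ns
    return float(np.sum(((j_exp - j_calc) / j_err) ** 2))

omegas = 2 * np.pi * np.array([0.0, 60.8e6, 600e6])  # e.g. 0, omega_N, omega_H
j_true = j_lipari_szabo(omegas, 0.8, 1e-8, 50e-12)   # S^2 = 0.8, te = 50 ps
print(chi2((0.8, 50e-12), omegas, j_true, j_true * 0.05))  # 0.0 at the truth
```

Plotting `chi2` over a grid of (S², te) values exhibits exactly the shallow-valley topology the text describes, with near-degenerate contours as te grows.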
Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)
2015-01-01
Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.
A model-free, fully automated baseline-removal method for Raman spectra.
Schulze, H Georg; Foist, Rod B; Okuda, Kadek; Ivanov, André; Turner, Robin F B
2011-01-01
We present here a fully automated spectral baseline-removal procedure. The method uses a large-window moving average to estimate the baseline; thus, it is a model-free approach with a peak-stripping method to remove spectral peaks. After processing, the baseline-corrected spectrum should yield a flat baseline and this endpoint can be verified with the χ²-statistic. The approach provides for multiple passes or iterations, based on a given χ²-statistic for convergence. If the baseline is acceptably flat given the χ²-statistic after the first pass at correction, the problem is solved. If not, the non-flat baseline (i.e., after the first effort or first pass at correction) should provide an indication of where the first pass caused too much or too little baseline to be subtracted. The second pass thus permits one to compensate for the errors incurred on the first pass. Thus, one can use a very large window so as to avoid affecting spectral peaks--even if the window is so large that the baseline is inaccurately removed--because baseline-correction errors can be assessed and compensated for on subsequent passes. We start with the largest possible window and gradually reduce it until acceptable baseline correction based on the χ² statistic is achieved. Results, obtained on both simulated and measured Raman data, are presented and discussed. PMID:21211157
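A minimal sketch of the core moving-average idea, without the published procedure's χ²-checked multiple passes and window shrinking (synthetic data, all parameters our own):

```python
import numpy as np

def strip_baseline(spectrum, window, n_iter=50):
    """Estimate the baseline by iteratively clipping the spectrum to a
    large-window moving average: peaks are stripped away while the slowly
    varying baseline survives. A minimal sketch of the moving-average idea;
    the published procedure adds chi-squared-checked multiple passes with a
    gradually shrinking window."""
    baseline = np.asarray(spectrum, dtype=float).copy()
    kernel = np.ones(window) / window
    pad = window // 2
    for _ in range(n_iter):
        padded = np.pad(baseline, pad, mode="edge")   # avoid edge droop
        smooth = np.convolve(padded, kernel, mode="valid")
        baseline = np.minimum(baseline, smooth)       # peak stripping
    return baseline

x = np.linspace(0.0, 100.0, 1000)
spectrum = 0.02 * x + 5.0 * np.exp(-((x - 40.0) ** 2) / 2.0)  # ramp + one peak
corrected = spectrum - strip_baseline(spectrum, window=101)
print(float(np.max(corrected)))  # most of the 5-unit peak survives
```

Because the window (101 samples) is much wider than the peak, the minimum-clipping loop erodes the peak out of the baseline estimate while the linear ramp is reproduced exactly.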
Certification-Based Process Analysis
Knight, Russell L.
2013-01-01
Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, any individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended, and contains no flaws.
High-Frequency and Model-Free Volatility Estimators
Robert Ślepaczuk; Grzegorz Zakrzewski
2009-01-01
This paper focuses on volatility of financial markets, which is one of the most important issues in finance, especially with regard to modeling high-frequency data. Risk management, asset pricing and option valuation techniques are the areas where the concept of volatility estimators (consistent, unbiased and the most efficient) is of crucial concern. Our intention was to find the best estimator of true volatility taking into account the latest investigations in finance literature. Based on ...
Isothermal Kinetics of the Pentlandite Exsolution from mss/Pyrrhotite Using Model-Free Method
WANG Haipeng
2006-01-01
The pentlandite exsolution from monosulfide solid solution (mss)/pyrrhotite is a complex multi-step process, including nucleation, new phase growth and atomic diffusion, and lamellae coarsening. Some of these steps occur in sequence, others simultaneously. These make its kinetic analysis difficult, as the mechanisms cannot be elucidated in detail. In mineral reactions of this type, the true functional form of the reaction model is almost never known, and the Arrhenius parameters determined by the classic Avrami method are skewed to compensate for errors in the model. Model-free kinetics allows a universal determination of the activation energy. A kinetic study of pentlandite exsolution from mss/pyrrhotite was performed over the temperature range 200 to 300℃. For mss/pyrrhotite with bulk composition (Fe0.77Ni0.19)S, the activation energy varies with the extent of reaction during the course of the solid-state reaction. The surrounding environment of the reactant atoms affects the atoms' activity and more or less accounts for the changes in activation energy Ea.
Model-Free Adaptive Fuzzy Sliding Mode Controller Optimized by Particle Swarm for Robot Manipulator
Amin Jalali
2013-05-01
The main purpose of this paper is to design a suitable control scheme that confronts the uncertainties in a robot. The sliding mode controller (SMC) is one of the most important and powerful nonlinear robust controllers and has been applied to many nonlinear systems. However, this controller has some intrinsic drawbacks, namely the chattering phenomenon, the equivalent dynamic formulation, and sensitivity to noise. This paper focuses on applying artificial intelligence integrated with sliding mode control theory. The proposed adaptive fuzzy sliding mode controller optimized by the particle swarm algorithm (AFSMC-PSO) is a Mamdani error-based fuzzy logic controller (FLC) with 7 rules, integrated with the sliding mode framework to provide adaptation, in order to eliminate the high-frequency oscillation (chattering) and adjust the linear sliding surface slope in the presence of many different disturbances; the best coefficients for the sliding surface were found by offline tuning with Particle Swarm Optimization (PSO). A second fuzzy logic controller replaces the equivalent dynamics term, the main goal being a model-free controller that compensates for unknown system dynamics parameters and obtains the desired control performance without exact information about the mathematical formulation of the model.
ANALYSIS-BASED SPARSE RECONSTRUCTION WITH SYNTHESIS-BASED SOLVERS
Cleju, Nicolae; Jafari, Maria,; Plumbley, Mark D.
2012-01-01
Analysis based reconstruction has recently been introduced as an alternative to the well-known synthesis sparsity model used in a variety of signal processing areas. In this paper we convert the analysis exact-sparse reconstruction problem to an equivalent synthesis recovery problem with a set of additional constraints. We are therefore able to use existing synthesis-based algorithms for analysis-based exact-sparse recovery. We call this the Analysis-By-Synthesis (ABS) approach. We evaluate o...
A generic model-free approach for lithium-ion battery health management
Highlights: • A new ANN based battery model is developed and integrated with the Kalman filtering technique for battery health management. • The developed ANN based model can be updated along with the Kalman filtering process at the battery operating stage. • The developed model is adaptive and eliminates the dependency of expensive empirical battery models. • The developed approach enables accurate estimations of both short term SoC and long term capacity. • Experimental results demonstrated the efficacy of the developed battery health state estimation approach. - Abstract: Accurate estimation of the state-of-charge (SoC) and state-of-health (SoH) for an operating battery system, as a critical task for battery health management, greatly depends on the validity and generalizability of battery models. Due to the variability and uncertainties involved in battery design, manufacturing and operation, developing a generally applicable battery model remains as a grand challenge for battery health management. To eliminate the dependency of SoC and SoH estimation on battery physical models, this paper presents a generic data-driven approach that integrates an artificial neural network with a dual extended Kalman filter (DEKF) algorithm for lithium-ion battery health management. The artificial neural network is first trained offline to model the battery terminal voltages and the DEKF algorithm can then be employed online for SoC and SoH estimation, where voltage outputs from the trained artificial neural network model are used in DEKF state–space equations to replace the required battery models. The trained neural network model can be adaptively updated to account for the battery to battery variability, thus ensuring good SoC and SoH estimation accuracy. Experimental results are used to demonstrate the effectiveness of the developed model-free approach for battery health management
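A toy sketch of the voltage-feedback estimation loop described above: an extended Kalman filter tracks SoC with coulomb counting as the process model, while a linear OCV curve stands in for the paper's trained neural network voltage model. All parameter values are hypothetical:

```python
def ekf_soc_step(soc, P, current, dt, v_meas, capacity_As, q=1e-7, r=1e-3):
    """One extended-Kalman-filter step for state-of-charge. The linear OCV
    curve below is a toy stand-in for the trained neural network voltage
    model; in the proposed approach v_model and its slope H would come from
    the ANN. All numbers here are hypothetical."""
    soc_pred = soc - current * dt / capacity_As   # predict: coulomb counting
    P_pred = P + q
    v_model = 3.0 + 1.2 * soc_pred                # toy OCV(SoC) model
    H = 1.2                                       # d(v_model)/d(SoC)
    K = P_pred * H / (H * P_pred * H + r)         # Kalman gain
    soc_new = soc_pred + K * (v_meas - v_model)   # correct with voltage
    P_new = (1.0 - K * H) * P_pred
    return soc_new, P_new

# A 1 Ah cell discharged at 1 A; the filter starts with a wrong SoC guess
# (0.5 vs true 0.9) and is pulled to the truth by the voltage feedback.
cap = 3600.0                                      # 1 Ah in ampere-seconds
soc_true, soc_est, P = 0.9, 0.5, 1.0
for _ in range(200):
    soc_true -= 1.0 / cap                         # 1 A for 1 s
    v = 3.0 + 1.2 * soc_true                      # noiseless measurement
    soc_est, P = ekf_soc_step(soc_est, P, 1.0, 1.0, v, cap)
print(round(soc_est, 3))  # converges to the true SoC (~0.844)
```

The dual EKF of the paper runs a second filter of the same shape over slowly varying capacity to obtain SoH; the sketch shows only the SoC half.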
Probabilistic Model-Based Safety Analysis
Güdemann, Matthias; 10.4204/EPTCS.28.8
2010-01-01
Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e. software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to analysis of specific failure propagation models, limited types of failure modes or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow for overcoming this problem. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...
Feature-based sentiment analysis with ontologies
Taner, Berk
2011-01-01
Sentiment analysis is a topic that many researchers work on. In recent years, new research directions under sentiment analysis appeared. Feature-based sentiment analysis is one such topic that deals not only with finding sentiment in a sentence but providing a more detailed analysis on a given domain. In the beginning researchers focused on commercial products and manually generated list of features for a product. Then they tried to generate a feature-based approach to attach sentiments to th...
Model-free adaptive control optimization using a chaotic particle swarm approach
It is well known that conventional control theories are widely suited for applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying the conventional controller design methodologies. Apart from the difficulty of achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts and an optimization procedure by a Particle Swarm Optimization (PSO) approach using a constriction coefficient and Henon chaotic sequences (CPSOH) is presented in this paper. PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking. PSO models the exploration of a problem space by a population of particles. Each particle in PSO has a randomized velocity associated to it, which moves through the space of the problem. Since chaotic mapping enjoys certainty, ergodicity and the stochastic property, the proposed CPSOH incorporates chaotic mapping, which adds some flexibility to particle movements in each iteration. The chaotic sequences also allow exploration at early stages and exploitation at later stages of the search procedure of CPSOH. The motivation for applying the CPSOH approach is to overcome the limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant has different gains over the operational range when designed by trial-and-error by the user. Numerical results of the MFLAC with CPSOH
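The CPSOH ingredients (constriction-coefficient PSO driven by Hénon chaotic sequences) can be sketched as follows; the target here is a generic sphere test function rather than an MFLAC, and all constants apart from Clerc's standard constriction setup are our own choices:

```python
import numpy as np

def henon_sequence(n, a=1.4, b=0.3, x0=0.1, y0=0.1):
    """Henon chaotic map, min-max scaled to [0, 1]; used in place of the
    uniform random numbers of standard PSO (the CPSOH idea, simplified)."""
    xs, x, y = [], x0, y0
    for _ in range(n):
        x, y = 1.0 - a * x * x + y, b * x
        xs.append(x)
    xs = np.array(xs)
    return (xs - xs.min()) / (xs.max() - xs.min())

def cpso_minimize(f, dim, n_particles=20, iters=100, seed=1):
    """Constriction-coefficient PSO driven by Henon sequences: a sketch of
    CPSOH, here minimizing a generic test function instead of tuning an MFLAC."""
    chi, c1, c2 = 0.7298, 2.05, 2.05          # Clerc's constriction constants
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    gbest_val = pbest_val.min()
    chaos = henon_sequence(2 * iters * n_particles * dim + 10)
    k = 0
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = chaos[k:k + dim], chaos[k + dim:k + 2 * dim]
            k += 2 * dim
            vel[i] = chi * (vel[i] + c1 * r1 * (pbest[i] - pos[i])
                                   + c2 * r2 * (gbest - pos[i]))
            pos[i] += vel[i]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i].copy(), val
                if val < gbest_val:
                    gbest, gbest_val = pos[i].copy(), val
    return gbest, gbest_val

best, val = cpso_minimize(lambda p: float(np.sum(p ** 2)), dim=2)
print(val)  # near zero on this convex test function
```

In the paper's setting, `f` would score closed-loop tracking performance of the MFLAC gains rather than a benchmark function.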
Andary, Sébastien; Chemori, Ahmed; Benoit, Michel; Sallantin, Jean
2012-01-01
This paper deals with a new method allowing recent model-free control technique to deal with underactuated mechanical systems for stable limit cycles generation. A model-free controller is designed in order to track some parametrized reference trajectories. A second model-free controller is then designed using trajectories' parameters as control inputs in order to stabilize the internal dynamics. The proposed method is applied to a real underactuated mechanical system: the inertia wheel inver...
Pareto analysis based on records
Doostparast, M
2012-01-01
Estimation of the parameters of an exponential distribution based on record data has been treated by Samaniego and Whitaker (1986) and Doostparast (2009). Recently, Doostparast and Balakrishnan (2011) obtained optimal confidence intervals as well as uniformly most powerful tests for one- and two-sided hypotheses concerning location and scale parameters based on record data from a two-parameter exponential model. In this paper, we derive optimal statistical procedures including point and interval estimation as well as most powerful tests based on record data from a two-parameter Pareto model. For illustrative purposes, a data set on annual wages of a sample of production-line workers in a large industrial firm is analyzed using the proposed procedures.
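For intuition, maximum-likelihood point estimates for a two-parameter Pareto from upper record values take a simple closed form; this is our own derivation from the standard record likelihood, offered as a sketch only (the cited papers develop exact optimal intervals and tests beyond it):

```python
import math
import random

def pareto_mle_from_records(records):
    """ML estimates of a two-parameter Pareto(alpha, beta) from upper record
    values r_1 < ... < r_n, using the record likelihood f(r_n) * prod h(r_i)
    (h = hazard rate). Our illustrative derivation: beta_hat = r_1 since
    beta <= r_1, and alpha_hat = n / ln(r_n / beta_hat)."""
    n = len(records)
    beta_hat = records[0]
    alpha_hat = n / math.log(records[-1] / beta_hat)
    return alpha_hat, beta_hat

# Synthetic check: simulate Pareto(alpha=3, beta=1) data, keep upper records.
random.seed(42)
sample = [(1.0 - random.random()) ** (-1.0 / 3.0) for _ in range(100000)]
records, cur = [], 0.0
for x in sample:
    if x > cur:
        records.append(x)
        cur = x
print(pareto_mle_from_records(records))  # alpha_hat of the right order, beta_hat >= 1
```

Record samples are short (roughly ln of the sample size), which is why the paper's exact small-sample intervals matter more than asymptotics here.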
TEXTURE ANALYSIS BASED IRIS RECOGNITION
GÜRKAN, Güray; AKAN, Aydın
2012-01-01
In this paper, we present a new method for personal identification based on iris patterns. The method is composed of iris image acquisition, image preprocessing, feature extraction and finally decision stages. Normalized iris images are vertically log-sampled and filtered by circular symmetric Gabor filters. The outputs of the filters are windowed and the mean absolute deviation of pixels in the window is calculated as the feature vector. The proposed method has the desired properties of an iris reco...
ROAn, a ROOT based Analysis Framework
Lauf, Thomas
2013-01-01
The ROOT based Offline and Online Analysis (ROAn) framework was developed to perform data analysis on data from Depleted P-channel Field Effect Transistor (DePFET) detectors, a type of active pixel sensors developed at the MPI Halbleiterlabor (HLL). ROAn is highly flexible and extensible, thanks to ROOT's features like run-time type information and reflection. ROAn provides an analysis program which allows configurable step-by-step analysis on arbitrary data, an associated suite of algorithms focused on DePFET data analysis, and a viewer program for displaying and processing online or offline detector data streams. The analysis program encapsulates the applied algorithms in objects called steps which produce analysis results. The dependency between results and thus the order of calculation is resolved automatically by the program. To optimize algorithms for studying detector effects, analysis parameters are often changed. Such changes of input parameters are detected in subsequent analysis runs and...
Anari, Ali
2012-01-01
"The trend is your friend" is a practical principle often used by business managers, who seek to forecast future sales, expenditures, and profitability in order to make production and other operational decisions. The problem is how best to identify and discover business trends and utilize trend information for attaining the objectives of firms. This book contains an Excel-based solution to this problem, applying principles of the authors' "profit system model" of the firm that enables forecasts of trends in sales, expenditures, profits and other business variables. The program,
Statistical analysis of life history calendar data
Eerola, Mervi; Helske, Satu
2016-01-01
The life history calendar is a data-collection tool for obtaining reliable retrospective data about life events. To illustrate the analysis of such data, we compare the model-based probabilistic event history analysis and the model-free data mining method, sequence analysis. In event history analysis, we estimate, instead of transition hazards, the cumulative prediction probabilities of life events in the entire trajectory. In sequence analysis, we compare several dissimilarity metrics and cont...
JAVA based LCD Reconstruction and Analysis Tools
We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities
Reliability analysis of software based safety functions
The methods applicable in the reliability analysis of software based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. The checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that the recent software reliability models do not directly produce the estimates needed in PSA. As a result of the study, some recommendations and conclusions are drawn. The need for formal methods in the analysis and development of software based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software based systems and their analyses in the regulatory guides more precise should be mentioned. (orig.). (46 refs., 13 figs., 1 tab.)
Curvelet Based Offline Analysis of SEM Images
Shirazi, Syed Hamad; Haq, Nuhman ul; Hayat, Khizar; Naz, Saeeda; Haque, Ihsan ul
2014-01-01
Manual offline analysis, of a scanning electron microscopy (SEM) image, is a time consuming process and requires continuous human intervention and efforts. This paper presents an image processing based method for automated offline analyses of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step, aimed at the noise removal, in order to avoid false edges. For texture analysis, the proposed method ...
Analysis of a Chaotic Memristor Based Oscillator
F. Setoudeh
2014-01-01
A chaotic oscillator based on the memristor is analyzed from a chaos theory viewpoint. Sensitivity to initial conditions is studied by considering a nonlinear model of the system, and also a new chaos analysis methodology based on the energy distribution is presented using the Discrete Wavelet Transform (DWT). Then, using Advance Design System (ADS) software, implementation of a chaotic oscillator based on the memristor is considered. Simulation results are provided to show the main points of the paper.
A Model-Free No-arbitrage Price Bound for Variance Options
Bonnans, J. Frederic, E-mail: frederic.bonnans@inria.fr [Ecole Polytechnique, INRIA-Saclay (France); Tan Xiaolu, E-mail: xiaolu.tan@polytechnique.edu [Ecole Polytechnique, CMAP (France)
2013-08-01
We suggest a numerical approximation for an optimization problem, motivated by its applications in finance to find the model-free no-arbitrage bound of variance options given the marginal distributions of the underlying asset. A first approximation restricts the computation to a bounded domain. Then we propose a gradient projection algorithm together with the finite difference scheme to solve the optimization problem. We prove the general convergence, and derive some convergence rate estimates. Finally, we give some numerical examples to test the efficiency of the algorithm.
Improved Frechet bounds and model-free pricing of multi-asset options
Tankov, Peter
2010-01-01
We compute the improved bounds on the copula of a bivariate random vector when partial information is available, such as the values of the copula on a subset of $[0,1]^2$, or the value of a functional of the copula, monotone with respect to the concordance order. These results are then used to compute model-free bounds on the prices of two-asset options which make use of extra information about the dependence structure, such as the price of another two-asset option.
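The starting point such results improve on is the classical Fréchet-Hoeffding bounds, which hold for every bivariate copula; a quick sketch:

```python
def frechet_bounds(u, v):
    """Classical Frechet-Hoeffding bounds, valid for every bivariate copula:
    max(u + v - 1, 0) <= C(u, v) <= min(u, v). The paper tightens these when
    partial dependence information (e.g. another option price) is available."""
    return max(u + v - 1.0, 0.0), min(u, v)

def independence_copula(u, v):
    return u * v

for (u, v) in [(0.2, 0.7), (0.5, 0.5), (0.9, 0.4)]:
    lo, hi = frechet_bounds(u, v)
    assert lo <= independence_copula(u, v) <= hi   # any copula lies between
    print(u, v, lo, hi)
```

Tighter copula bounds translate directly into tighter model-free price bounds, since the two-asset option price is monotone in the copula under the concordance order.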
Epoch-based analysis of speech signals
B Yegnanarayana; Suryakanth V Gangashetty
2011-10-01
Speech analysis is traditionally performed using short-time analysis to extract features in time and frequency domains. The window size for the analysis is fixed somewhat arbitrarily, mainly to account for the time varying vocal tract system during production. However, speech in its primary mode of excitation is produced due to impulse-like excitation in each glottal cycle. Anchoring the speech analysis around the glottal closure instants (epochs) yields significant benefits for speech analysis. Epoch-based analysis of speech helps not only to segment the speech signals based on speech production characteristics, but also helps in accurate analysis of speech. It enables extraction of important acoustic-phonetic features such as glottal vibrations, formants, instantaneous fundamental frequency, etc. Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, and methods to extract the epoch information are reviewed. Applications of epoch extraction for some speech applications are demonstrated.
Texture-based analysis of COPD
Sørensen, Lauge Emil Borch Laurs; Nielsen, Mads; Lo, Pechin Chien Pau;
2012-01-01
This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based on...... subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density. The...
Cloud Based Development Issues: A Methodical Analysis
Sukhpal Singh
2012-11-01
Cloud based development is a challenging task for various software engineering projects, especially for those which demand extraordinary quality, reusability and security along with general architecture. In this paper we present a report on a methodical analysis of cloud based development problems published in major computer science and software engineering journals and conferences organized by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems addressed by them. The majority of the research papers focused on quality (24 papers) associated with cloud based development, and 16 papers focused on analysis and design. By considering the areas focused on by existing authors and the gaps they leave, untouched areas of cloud based development can be discovered for future research work.
Polyphase Order Analysis Based on Convolutional Approach
M. Drutarovsky
1999-06-01
The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational speed. The paper presents an optimised algorithm for polyphase order analysis based on a non-power-of-two DFT efficiently implemented by the chirp FFT algorithm. The proposed algorithm decreases the complexity of the digital resampling algorithm, which is the most complex part of the complete spectral order algorithm.
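The resampling step at the heart of order analysis, converting a uniformly time-sampled vibration signal into a uniformly angle-sampled one, can be sketched as below. This is a generic linear-interpolation sketch, not the paper's optimised chirp-FFT implementation, and it assumes the instantaneous shaft angle has been measured at every time sample:

```python
import bisect

def angular_resample(signal, angles, n_out):
    """Resample a signal at n_out uniform increments of shaft angle
    instead of uniform time; `angles` must be the monotonically
    increasing shaft angle (rad) at each sample.  After resampling,
    components locked to the rotation become fixed-frequency tones,
    which removes the spectral smearing described above."""
    lo, hi = angles[0], angles[-1]
    step = (hi - lo) / (n_out - 1)
    out = []
    for k in range(n_out):
        a = lo + k * step
        # locate the bracketing pair of measured angles
        i = min(max(bisect.bisect_right(angles, a), 1), len(angles) - 1)
        frac = (a - angles[i - 1]) / (angles[i] - angles[i - 1])
        out.append(signal[i - 1] + frac * (signal[i] - signal[i - 1]))
    return out
```

Spectral analysis (e.g. a DFT) of the resampled sequence then yields an order spectrum directly, even under variable rotational speed.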
Security Analysis of Discrete Logarithm Based Cryptosystems
WANG Yuzhu; LIAO Xiaofeng
2006-01-01
Discrete logarithm based cryptosystems have subtle problems that make the schemes vulnerable. This paper gives a comprehensive listing of security issues in the systems and analyzes three classes of attacks, based respectively on the mathematical structure of the group used in the schemes, the disclosed information of the subgroup, and implementation details. The analysis will, in turn, allow us to motivate protocol design and implementation decisions.
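As a concrete illustration of an attack exploiting the mathematical structure of the group, the textbook baby-step giant-step algorithm recovers x from g^x mod p in O(sqrt(p)) steps, which is why the group order must be large and of (near-)prime order. A toy sketch on small parameters, not an attack from the paper:

```python
import math

def dlog_bsgs(g, h, p):
    """Baby-step giant-step: find x with pow(g, x, p) == h, using
    O(sqrt(p)) time and memory; p is assumed prime here."""
    m = math.isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps g^j
    c = pow(g, (p - 2) * m, p)                   # g^(-m) via Fermat inverse
    gamma = h % p
    for i in range(m):                           # giant steps h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = gamma * c % p
    return None
```

The same square-root trade-off underlies Pollard's rho, so any subgroup of smooth or small order leaks the exponent modulo that order.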
Social Network Analysis Based on Network Motifs
Xu Hong-lin; Yan Han-bing; Gao Cui-fang; Zhu Ping
2014-01-01
Based on the community structure characteristics, theory, and methods of frequent subgraph mining, network motif findings are first introduced into social network analysis; a tendentiousness evaluation function and an importance evaluation function are proposed for effectiveness assessment. Compared with the traditional approach based on node centrality degree, the new approach can be used to analyze the properties of a social network more fully and judge the roles of the nodes effectively. I...
Swarm Intelligence Based Algorithms: A Critical Analysis
Yang, Xin-She
2014-01-01
Many optimization algorithms have been developed by drawing inspiration from swarm intelligence (SI). These SI-based algorithms can have some advantages over traditional algorithms. In this paper, we carry out a critical analysis of these SI-based algorithms by analyzing their ways to mimic evolutionary operators. We also analyze the ways of achieving exploration and exploitation in algorithms by using mutation, crossover and selection. In addition, we also look at algorithms using dynamic sy...
What Now? Some Brief Reflections on Model-Free Data Analysis
Richard Berk
2009-01-01
David Freedman’s critique of causal modeling in the social and biomedical sciences was fundamental. In his view, the enterprise was misguided, and there was no technical fix. Far too often, there was a disconnect between what the statistical methods required and the substantive information that could be brought to bear. In this paper, I briefly consider some alternatives to causal modeling assuming that David Freedman’s perspective on modeling is correct. In addition to randomized experiments...
Network Analysis of the Shanghai Stock Exchange Based on Partial Mutual Information
Tao You
2015-06-01
Analyzing social systems, particularly financial markets, using a complex network approach has become one of the most popular fields within econophysics. A similar trend is currently appearing within the econometrics and finance communities, as well. In this study, we present a state-of-the-art method for analyzing the structure and risk within stock markets, treating them as complex networks using model-free, nonlinear dependency measures based on information theory. This study is the first network analysis of the stock market in Shanghai using a nonlinear network methodology. Further, it is often assumed that markets outside the United States and Western Europe are inherently riskier. We find that the Chinese stock market is not structurally risky, contradicting this popular opinion. We use partial mutual information to create filtered networks representing the Shanghai stock exchange, comparing them to networks based on Pearson's correlation. Consequently, we discuss the structure and characteristics of both the presented methods and the Shanghai stock exchange. This paper provides an insight into cutting-edge methodology designed for analyzing complex financial networks, as well as analyzing the structure of the market in Shanghai, and, as such, is of interest to both researchers and financial analysts.
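The information-theoretic dependency measure underlying such networks can be illustrated with a plug-in mutual-information estimate between two series. This simple binned estimator stands in for the partial mutual information used in the paper, and the bin count is an arbitrary illustrative choice:

```python
from collections import Counter
from math import log

def mutual_information(x, y, bins=8):
    """Plug-in mutual information I(X;Y) in nats between two series,
    via equal-width binning; a minimal stand-in for the model-free,
    nonlinear dependency measures described above."""
    def discretize(v):
        lo, hi = min(v), max(v)
        w = (hi - lo) / bins or 1.0          # guard constant series
        return [min(int((s - lo) / w), bins - 1) for s in v]
    xs, ys = discretize(x), discretize(y)
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    # sum over joint cells: p(a,b) * log(p(a,b) / (p(a) p(b)))
    return sum(c / n * log(c * n / (px[a] * py[b]))
               for (a, b), c in pxy.items())
```

An edge between two stocks would then be weighted by the (partial) mutual information of their return series, with weak dependencies filtered out to obtain the network.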
Abstraction based Analysis and Arbiter Synthesis
Ernits, Juhan-Peep; Yi, Wang
2004-01-01
The work focuses on the analysis of an example of synchronous systems containing FIFO buffers, registers and memory interconnected by several private and shared busses. The example used in this work is based on a Terma radar system memory interface case study from the IST AMETIST project....
Node-based analysis of species distributions
Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon;
2014-01-01
with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...
Modeling Mass Spectrometry Based Protein Analysis
Eriksson, Jan; Fenyö, David
2011-01-01
The success of mass spectrometry based proteomics depends on efficient methods for data analysis. These methods require a detailed understanding of the information value of the data. Here, we describe how the information value can be elucidated by performing simulations using synthetic data.
Zhang, Huaguang; Zhou, Jianguo; Sun, Qiuye;
2016-01-01
terminal voltage and ac frequency. Moreover, the design of the controller is only based on input/output (I/O) measurement data but not the model any more, and the system stability can be guaranteed by the Lyapunov method. The detailed system architecture and proposed control strategies are presented in......: the primary outer-loop dual-droop control method along with secondary control; the inner-loop data-driven model-free adaptive voltage control. Using the proposed scheme, the interlinking converter, just like the hierarchical controlled DG units, will have the ability to regulate and restore the dc...
Canonical analysis based on mutual information
Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack
2015-01-01
Canonical correlation analysis (CCA) is an established multi-variate statistical method for finding similarities between linear combinations of (normally two) sets of multivariate observations. In this contribution we replace (linear) correlation as the measure of association between the linear...... combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...... analysis of variables with different genesis and therefore different statistical distributions and different modalities. As a proof of concept we give a toy example. We also give an example with one (weather radar based) variable in the one set and eight spectral bands of optical satellite data in the...
Network-based analysis of proteomic profiles
Wong, Limsoon
2016-01-26
Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and the study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels are known to correlate poorly. On the other hand, MS-based proteomics tends to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomic profiles. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.
TEST COVERAGE ANALYSIS BASED ON PROGRAM SLICING
Chen Zhenqiang; Xu Baowen; Guanjie
2003-01-01
Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept, coverage about variables, based on program slicing. By assigning weights according to their importance, users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained in the letter is higher than that obtained by a traditional measure, because the coverage about a variable takes only the related code into account.
Structure-based analysis of Web sites
Yen, B
2004-01-01
The performance of information retrieval on the Web is heavily influenced by the organization of Web pages, user navigation patterns, and guidance-related functions. Having observed the lack of measures to reflect this factor, this paper focuses on an approach based on both structure properties and navigation data to analyze and improve the performance of a Web site. Two types of indices are defined for the two major factors of analysis and improvement: "accessibility" reflects the structure property...
Quantum entanglement analysis based on abstract interpretation
Perdrix, Simon
2008-01-01
Entanglement is a non local property of quantum states which has no classical counterpart and plays a decisive role in quantum information theory. Several protocols, like the teleportation, are based on quantum entangled states. Moreover, any quantum algorithm which does not create entanglement can be efficiently simulated on a classical computer. The exact role of the entanglement is nevertheless not well understood. Since an exact analysis of entanglement evolution induces an exponential sl...
Particle Pollution Estimation Based on Image Analysis
Liu, Chenbin; Tsow, Francis; Zou, Yi; Tao, Nongjian
2016-01-01
Exposure to fine particles can cause various diseases, and an easily accessible method to monitor the particles can help raise public awareness and reduce harmful exposures. Here we report a method to estimate PM air pollution based on analysis of a large number of outdoor images available for Beijing, Shanghai (China) and Phoenix (US). Six image features were extracted from the images, which were used, together with other relevant data, such as the position of the sun, date, time, geographic...
XML-based analysis interface for particle physics data analysis
The letter focuses on an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks in data analysis: event selection, kinematic fitting, particle identification, etc., and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML-interface file, setting parameters at run-time and running dynamically. Coding an analysis in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESIII offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are either implemented with standard modules or inherited from the modules in BOSS. The interface and its framework have been tested to perform physics analysis. (authors)
Chapter 11. Community analysis-based methods
Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.
2010-05-01
Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
Key Point Based Data Analysis Technique
Yang, Su; Zhang, Yong
In this paper, a new framework for data analysis based on the "key points" in data distribution is proposed. Here, the key points contain three types of data points: bridge points, border points, and skeleton points, where our main contribution is the bridge points. For each type of key points, we have developed the corresponding detection algorithm and tested its effectiveness with several synthetic data sets. Meanwhile, we further developed a new hierarchical clustering algorithm SPHC (Skeleton Point based Hierarchical Clustering) to demonstrate the possible applications of the key points acquired. Based on some real-world data sets, we experimentally show that SPHC performs better compared with several classical clustering algorithms including Complete-Link Hierarchical Clustering, Single-Link Hierarchical Clustering, KMeans, Ncut, and DBSCAN.
Chip based electroanalytical systems for cell analysis
Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.;
2008-01-01
This review with 239 references has as its aim to give the reader an introduction to the kinds of methods used for developing microchip based electrode systems, as well as to cover the existing literature on electroanalytical systems where microchips play a crucial role for 'nondestructive' measurements of processes related to living cells, i.e., systems without lysing the cells. The focus is on chip based amperometric and impedimetric cell analysis systems; measurements utilizing solely carbon fiber microelectrodes (CFME) and other nonchip electrode formats, such as CFME for exocytosis studies and scanning electrochemical microscopy (SECM) studies of living cells, have been omitted. Included is also a discussion about some future and emerging nano tools and considerations that might have an impact on the future of 'nondestructive' chip based electroanalysis of living cells.
Fault-based analysis of flexible ciphers
V.I.Korjik
2002-07-01
We consider the security of some flexible ciphers against differential fault analysis (DFA). We present a description of the fault-based attack on two kinds of flexible ciphers. The first kind is represented by a fast software-oriented cipher based on data-dependent subkey selection (DDSS), in which flexibility corresponds to the use of key-dependent operations. The second kind is represented by GOST, a DES-like cryptosystem with secret S-boxes. In general, the use of secret operations and procedures contributes to the security of a cryptosystem; however, the degree of this contribution depends significantly on the structure of the encryption mechanism. It is shown how to attack the DDSS-based flexible cipher using DFA even though this cipher is secure against standard variants of differential and linear cryptanalysis. We also give an outline of the ciphers RC5 and GOST, showing that they are also insecure against DFA-based attacks. We suggest a modification of the DDSS mechanism and a variant of an advanced DDSS-based flexible cipher that is secure against attacks based on random hardware faults.
System based practice: a concept analysis
YAZDANI, SHAHRAM; HOSSEINI, FAKHROLSADAT; AHMADY, SOLEIMAN
2016-01-01
Introduction: Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high quality care, and also the most challenging of them in performance, training, and evaluation of medical students. This concept analysis clarifies the concept of SBP by identifying its components to make it possible to differentiate it from other similar concepts. For proper training of SBP and to ensure these competencies in physicians, it is necessary to have an operational definition, and SBP's components must be precisely defined in order to provide valid and reliable assessment tools. Methods: Walker & Avant's approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results: Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients' needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion: The identification of SBP attributes in this study contributes to the body of knowledge on SBP and reduces the ambiguity of this concept, making it possible to apply it in training of different medical specialties. Also, it would be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis. PMID:27104198
Dependent failure analysis of NPP data bases
A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)
Gait correlation analysis based human identification.
Chen, Jinyan
2014-01-01
Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint or iris based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance. PMID:24592144
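The correlate-then-DFT feature extraction described above can be sketched in a reduced form, shifting only along the horizontal axis rather than along x, y and t. The function and its parameters are illustrative, not the authors' implementation:

```python
import cmath

def correlation_features(silhouette, shifts, n_coef):
    """Gait feature sketch: correlate a binary silhouette (a set of
    (x, y) pixels) with horizontally shifted copies of itself, then
    keep the first n_coef DFT magnitudes of the correlation sequence,
    normalized to reduce sensitivity to noise and scale."""
    pixels = set(silhouette)
    # overlap fraction between the silhouette and its shifted copy
    corr = [len(pixels & {(x + s, y) for (x, y) in pixels}) / len(pixels)
            for s in shifts]
    n = len(corr)
    dft = [abs(sum(corr[k] * cmath.exp(-2j * cmath.pi * j * k / n)
                   for k in range(n)))
           for j in range(n_coef)]
    top = max(dft) or 1.0
    return [v / top for v in dft]
```

A dimensionality-reduction step (PCA in the paper) would then be applied to these per-frame feature vectors before classification.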
Multifractal Time Series Analysis Based on Detrended Fluctuation Analysis
Kantelhardt, Jan; Stanley, H. Eugene; Zschiegner, Stephan; Bunde, Armin; Koscielny-Bunde, Eva; Havlin, Shlomo
2002-03-01
In order to develop an easily applicable method for the multifractal characterization of non-stationary time series, we generalize the detrended fluctuation analysis (DFA), which is a well-established method for the determination of monofractal scaling properties and the detection of long-range correlations. We relate the new multifractal DFA method to the standard partition function-based multifractal formalism, and compare it to the wavelet transform modulus maxima (WTMM) method, which is a well-established but more difficult procedure for this purpose. We employ the multifractal DFA method to determine whether the heart rhythm during different sleep stages is characterized by different multifractal properties.
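The monofractal DFA that the paper generalizes can be sketched as follows; the multifractal variant replaces the plain RMS below with q-th order moments of the per-window fluctuations. A minimal sketch, not the authors' code:

```python
import math

def dfa_fluctuation(x, s):
    """Detrended fluctuation F(s) for window size s: integrate the
    mean-removed series into a profile, split it into windows of
    length s, subtract a least-squares line in each window, and take
    the RMS of the residuals.  The DFA scaling exponent is the slope
    of log F(s) versus log s."""
    mean = sum(x) / len(x)
    profile, acc = [], 0.0
    for v in x:
        acc += v - mean
        profile.append(acc)
    n_win, sq = len(profile) // s, 0.0
    t = list(range(s))
    tb = (s - 1) / 2.0
    tvar = sum((ti - tb) ** 2 for ti in t)
    for w in range(n_win):
        seg = profile[w * s:(w + 1) * s]
        yb = sum(seg) / s
        # least-squares line alpha + beta * t over this window
        beta = sum((ti - tb) * (yi - yb) for ti, yi in zip(t, seg)) / tvar
        alpha = yb - beta * tb
        sq += sum((yi - alpha - beta * ti) ** 2 for ti, yi in zip(t, seg))
    return math.sqrt(sq / (n_win * s))
```

For white noise the log-log slope of F(s) versus s is close to 0.5; long-range correlated series give larger exponents.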
Rweb:Web-based Statistical Analysis
Jeff Banfield
1999-03-01
Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.
Electric Equipment Diagnosis based on Wavelet Analysis
Stavitsky Sergey A.
2016-01-01
As electric equipment develops and grows more complex, precise and intensive diagnosis becomes necessary. Nowadays there are two basic ways of diagnosis: analog signal processing and digital signal processing. The latter is preferable. Alongside the basic digital methods (the Fourier transform and the fast Fourier transform), one of the modern methods is based on the wavelet transform. This research is dedicated to analyzing the characteristic features and advantages of the wavelet transform. The article shows the ways of using wavelet analysis and the process of converting a test signal. To carry out this analysis, the Mathcad software was used and a 2D wavelet spectrum for a complex function was created.
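A single analysis/synthesis level of the simplest wavelet, the Haar wavelet, illustrates the transform underlying such diagnosis; a minimal stdlib sketch, not the Mathcad analysis from the article:

```python
import math

def haar_step(signal):
    """One analysis level of the Haar DWT: normalized pairwise sums
    give the approximation coefficients, pairwise differences the
    detail coefficients.  len(signal) is assumed even."""
    r2 = math.sqrt(2.0)
    half = len(signal) // 2
    a = [(signal[2 * i] + signal[2 * i + 1]) / r2 for i in range(half)]
    d = [(signal[2 * i] - signal[2 * i + 1]) / r2 for i in range(half)]
    return a, d

def haar_inverse(a, d):
    """Exact reconstruction from one analysis level."""
    r2 = math.sqrt(2.0)
    out = []
    for ai, di in zip(a, d):
        out += [(ai + di) / r2, (ai - di) / r2]
    return out
```

Because the transform preserves signal energy while localizing it in both time and scale, detail coefficients that spike at one instant flag transient faults that a global Fourier spectrum would smear out.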
Arabic Interface Analysis Based on Cultural Markers
Khanum, Mohammadi Akheela; Chaurasia, Mousmi A
2012-01-01
This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from education, business, and media. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture, and that Hofstede's score for Arab countries is partially supported by the website design components examined in this study. Moderate support was also found for the long term orientation dimension, for which Hofstede has no score.
Similarity-based pattern analysis and recognition
Pelillo, Marcello
2013-01-01
This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg
Video semantic content analysis based on ontology
Bai, Liang; Lao, Songyang; Jones, Gareth J.F.; Smeaton, Alan F.
2007-01-01
The rapid increase in the available amount of video data is creating a growing demand for efficient methods for understanding and managing it at the semantic level. New multimedia standards, such as MPEG-4 and MPEG-7, provide the basic functionalities in order to manipulate and transmit objects and metadata. But importantly, most of the content of video data at a semantic level is out of the scope of the standards. In this paper, a video semantic content analysis framework based on ontology i...
Motion Analysis Based on Invertible Rapid Transform
J. Turan
1999-06-01
This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, where the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well known block matching methods (full search, 2D-log, and methods based on the conventional cross-correlation (CC) function or the phase correlation (PC) function) for the application of crowd motion estimation are also presented.
Kinetic modelling of RDF pyrolysis: Model-fitting and model-free approaches.
Çepelioğullar, Özge; Haykırı-Açma, Hanzade; Yaman, Serdar
2016-02-01
In this study, refuse derived fuel (RDF) was selected as the solid fuel and pyrolyzed in a thermal analyzer from room temperature to 900°C at heating rates of 5, 10, 20, and 50°C/min in an N2 atmosphere. The obtained thermal data were used to calculate the kinetic parameters using the Coats-Redfern, Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS) methods. In the Coats-Redfern model, the decomposition process was assumed to consist of four independent reactions with different reaction orders. On the other hand, the model-free methods demonstrated that the activation energy trend had similarities for the reaction progresses of 0.1, 0.2-0.7 and 0.8-0.9. The average activation energies were found to be between 73 and 161 kJ/mol, and the FWO and KAS models produced results closer to the average activation energies than the Friedman model did. Experimental studies showed that RDF may be a sustainable and promising feedstock for alternative processes in terms of waste management strategies. PMID:26613830
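The model-free (isoconversional) step can be sketched for the FWO method: at a fixed conversion, ln β plotted against 1/T is a straight line whose slope gives the activation energy through Doyle's approximation. The constants and synthetic data below are illustrative, not the RDF measurements of the study:

```python
import math

R_GAS = 8.314  # gas constant, J/(mol K)

def fwo_activation_energy(betas, temps):
    """Flynn-Wall-Ozawa estimate at one fixed conversion: fit
    ln(beta) against 1/T by least squares; by Doyle's approximation
    the slope is -1.052*Ea/R, so Ea = -slope*R/1.052.  betas are
    heating rates (K/min), temps the temperatures (K) at which the
    chosen conversion is reached at each heating rate."""
    xs = [1.0 / t for t in temps]
    ys = [math.log(b) for b in betas]
    n = len(xs)
    xb, yb = sum(xs) / n, sum(ys) / n
    slope = (sum((xi - xb) * (yi - yb) for xi, yi in zip(xs, ys))
             / sum((xi - xb) ** 2 for xi in xs))
    return -slope * R_GAS / 1.052
```

Repeating the fit at each conversion level (0.1, 0.2, ..., 0.9) yields the activation-energy trend discussed in the abstract; KAS differs only in fitting ln(β/T²) against 1/T.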
Stellar loci II. a model-free estimate of the binary fraction for field FGK stars
Yuan, Haibo; Xiang, Maosheng; Huang, Yang; Chen, Bingqiu
2014-01-01
We propose a Stellar Locus OuTlier (SLOT) method to determine the binary fraction of main-sequence stars statistically. The method is sensitive to neither the period nor mass-ratio distributions of binaries, and able to provide model-free estimates of binary fraction for large numbers of stars of different populations in large survey volumes. We have applied the SLOT method to two samples of stars from the SDSS Stripe 82, constructed by combining the re-calibrated SDSS photometric data with respectively the spectroscopic information from the SDSS and LAMOST surveys. For the SDSS spectroscopic sample, we find an average binary fraction for field FGK stars of $41%\\pm2%$. The fractions decrease toward late spectral types, and are respectively $44%\\pm5%$, $43%\\pm3%$, $35%\\pm5%$, and $28%\\pm6%$ for stars of $g-i$ colors between 0.3 -- 0.6, 0.6 -- 0.9, 0.9 -- 1.2, and 1.2 - 1.6\\,mag. A modest metallicity dependence is also found. The fraction decreases with increasing metallicity. For stars of [Fe/H] between $-0.5$...
A model-free method for annotating on vascular structure in volume rendered images
He, Wei; Li, Yanfang; Shi, Weili; Miao, Yu; He, Fei; Yan, Fei; Yang, Huamin; Zhang, Huimao; Mori, Kensaku; Jiang, Zhengang
2015-03-01
The precise annotation of vessels is desired in computer-assisted systems to help surgeons identify each vessel branch. A method has been reported that annotates vessels on volume rendered images by rendering their names on them using a two-pass rendering process. In the reported method, however, cylinder surface models of the vessels must be generated for writing vessel names. In fact, vessels are not true cylinders, so their surfaces cannot be simulated accurately by such models. This paper presents a model-free method for annotating vessels on volume rendered images by rendering their names on them using the two-pass rendering process: surface rendering and volume rendering. In the surface rendering process, docking points for vessel names are estimated by using such properties as centerlines, running directions, and vessel regions, which are obtained in preprocessing. Then the vessel names are pasted on the vessel surfaces at the docking points. In the volume rendering process, the volume image is rendered using a fast volume rendering algorithm with the depth buffer of the image rendered in the surface rendering process. Finally, the rendered images are blended into a single resulting image. To validate the proposed method, a visualization system for the automated annotation of abdominal arteries was implemented. The experimental results show that vessel names can be drawn correctly on the corresponding vessels in the volume rendered images. The proposed method has enormous potential for annotating other organs that cannot be modeled using regular geometrical surfaces.
Visibility Graph Based Time Series Analysis.
Mutua Stephen
Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
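The building block of the approach above is the mapping from a series segment to a visibility graph. A minimal sketch of the standard natural visibility criterion (Lacasa-style: two samples are connected if every intermediate sample lies strictly below their line of sight) is shown here; the abstract does not specify which visibility variant is used, so this is an illustrative assumption.

```python
def natural_visibility_graph(series):
    """Map a time series to its natural visibility graph: nodes are samples,
    and (i, j) is an edge iff every intermediate sample k (i < k < j) lies
    strictly below the line of sight between (i, y_i) and (j, y_j), i.e.
    y_k < y_j + (y_i - y_j) * (j - k) / (j - i)."""
    n = len(series)
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            visible = all(
                series[k] < series[j] + (series[i] - series[j]) * (j - k) / (j - i)
                for k in range(i + 1, j)
            )
            if visible:  # adjacent samples (j = i + 1) are always visible
                edges.add((i, j))
    return edges

# The dip at index 1 does not block the line of sight from 0 to 2.
print(sorted(natural_visibility_graph([3.0, 1.0, 2.0])))  # [(0, 1), (0, 2), (1, 2)]
```

Sliding this construction over successive segments, then linking successive segment graphs, gives the "network of networks" described in the abstract.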
Curvelet based offline analysis of SEM images.
Syed Hamad Shirazi
Manual offline analysis of a scanning electron microscopy (SEM) image is a time consuming process and requires continuous human intervention and effort. This paper presents an image processing based method for automated offline analysis of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step aimed at noise removal in order to avoid false edges. For texture analysis, the proposed method employs a state of the art Curvelet transform followed by segmentation through a combination of entropy filtering, thresholding and mathematical morphology (MM). The quantification is carried out by applying a box-counting algorithm for fractal dimension (FD) calculation, with the ultimate goal of measuring parameters such as surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known Watershed segmentation algorithm.
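The box-counting step used above for fractal dimension can be sketched in a few lines: cover the binary mask with boxes of growing side s, count occupied boxes N(s), and take the slope of log N(s) against log(1/s). This is a generic sketch, not the paper's implementation; the box sizes are an assumption.

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Fractal dimension of a binary image via box counting: count boxes of
    side s that contain foreground, then fit log N(s) against log(1/s)."""
    mask = np.asarray(mask, dtype=bool)
    counts = []
    for s in sizes:
        n = 0
        for i in range(0, mask.shape[0], s):
            for j in range(0, mask.shape[1], s):
                if mask[i:i + s, j:j + s].any():
                    n += 1
        counts.append(n)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

# Sanity check: a completely filled 64x64 region is 2-dimensional.
print(round(box_counting_dimension(np.ones((64, 64))), 3))  # 2.0
```

Applying the same counter to only the boundary boxes of a filled shape gives the indirect perimeter estimate mentioned in the abstract.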
Visual Similarity Based Document Layout Analysis
Di Wen; Xiao-Qing Ding
2006-01-01
In this paper, a visual similarity based document layout analysis (DLA) scheme is proposed which, by using a clustering strategy, can adaptively deal with documents in different languages, with different layout structures and skew angles. Aiming at a robust and adaptive DLA approach, the authors first find a set of representative filters and statistics to characterize typical texture patterns in document images through a visual similarity testing process. Texture features are then extracted from these filters and passed into a dynamic clustering procedure called visual similarity clustering. Finally, text contents are located from the clustered results. Benefiting from this scheme, the algorithm demonstrates strong robustness and adaptability in a wide variety of documents, which previous traditional DLA approaches do not possess.
Watermark Resistance Analysis Based On Linear Transformation
N.Karthika Devi
2012-06-01
Generally, a digital watermark can be embedded in any copyright image that is no smaller than the watermark itself. Watermarking schemes can be classified into two categories: spatial domain approaches and transform domain approaches. Previous work has shown that transform domain schemes are typically more robust to noise, common image processing, and compression than spatial domain schemes. Improvements in the performance of watermarking schemes can be obtained by exploiting the characteristics of the human visual system (HVS) in the watermarking process. We propose a linear transformation based watermarking algorithm. The watermark bits are embedded into a cover image to produce the watermarked image. The robustness of the watermark is checked using pre-defined attacks, and attack resistance analysis is done using bit error rate (BER) calculation. Finally, the quality of the watermarked image is evaluated.
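The BER measure used above for attack resistance is simply the fraction of watermark bits flipped between embedding and extraction. A minimal sketch (the bit sequences are invented examples, not taken from the paper):

```python
import numpy as np

def bit_error_rate(embedded_bits, extracted_bits):
    """Fraction of watermark bits flipped by an attack:
    BER = (# mismatched bits) / (# bits)."""
    embedded = np.asarray(embedded_bits)
    extracted = np.asarray(extracted_bits)
    return float(np.mean(embedded != extracted))

sent = [1, 0, 1, 1, 0, 0, 1, 0]
recv = [1, 0, 0, 1, 0, 1, 1, 0]  # two bits flipped by a simulated attack
print(bit_error_rate(sent, recv))  # 0.25
```

A BER of 0 after an attack means the watermark survived intact; values approaching 0.5 mean the extracted bits are no better than random.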
Voxel-Based LIDAR Analysis and Applications
Hagstrom, Shea T.
One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images that do not take advantage of their primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To
Cognitive fusion analysis based on context
Blasch, Erik P.; Plano, Susan
2004-04-01
The standard fusion model includes active and passive user interaction in level 5 - "User Refinement". User refinement is more than just details of passive automation partitioning - it is the active management of information. While a fusion system can explore many operational conditions over myopic changes, the user has the ability to reason about the hyperopic "big picture." Blasch and Plano developed cognitive-fusion models that address user constraints including intent, attention, trust, workload, and throughput to facilitate hyperopic analysis. To enhance user-fusion performance modeling (i.e., confidence, timeliness, and accuracy), we seek to explore the nature of context. Context, the interrelated conditions in which something exists, can be modeled in many ways, including geographic, sensor, object, and environmental conditioning. This paper highlights user refinement actions based on context to constrain the fusion analysis so that it accurately represents the trade space in the real world. As an example, we explore a target identification task in which contextual information from the user's cognitive model is imparted to a fusion belief filter.
Interactive analysis of geodata based intelligence
Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth
2016-05-01
When a spatiotemporal event happens, multi-source intelligence data is gathered to understand the problem, and strategies for solving the problem are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map can serve as the bridge to visualize the data and to provide the most understandable model for all stakeholders. For the analysis of geodata based intelligence data, software was developed as a working environment that combines geodata with optimized ergonomics. Interaction with the common operational picture (COP) is thus essentially facilitated. The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result, a COP is generated and an interaction suitable for the specific workspace is added. This allows users to work interactively with the COP, i.e., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management are organized.
Operating cost analysis of anaesthesia: Activity based costing (ABC analysis
Majstorović Branislava M.
2011-01-01
Introduction. Costs of anaesthesiology represent defined measures to determine a precise profile of expenditure estimation of surgical treatment, which is important for planning healthcare activities, prices and budget. Objective. In order to determine the actual value of anaesthesiological services, we performed an activity based costing (ABC) analysis. Methods. Retrospectively, in 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials and others: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure for each cost object (service or unit) of the Republican Health-care Insurance. Summary data of the Departments of Anaesthesia were documented in the database of the Clinical Centre of Serbia. Numerical data were estimated and analyzed by the computer programs Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Health-care Insurance. Results. Direct costs showed that 40% of costs were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Health-care Insurance. Conclusion. During surgery, the costs of anaesthesia increase the surgical treatment cost of patients by 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. Fixed elements of direct costs provide the possibility of rationalizing resources in anaesthesia.
Model-free reconstruction of excitatory neuronal connectivity from calcium imaging signals.
Olav Stetter
A systematic assessment of global neural network connectivity through direct electrophysiological assays has remained technically infeasible, even in simpler systems like dissociated neuronal cultures. We introduce an improved algorithmic approach based on Transfer Entropy to reconstruct structural connectivity from network activity monitored through calcium imaging. In this study we focus on the inference of excitatory synaptic links. Based on information theory, our method requires no prior assumptions on the statistics of neuronal firing and neuronal connections. The performance of our algorithm is benchmarked on surrogate time series of calcium fluorescence generated by the simulated dynamics of a network with known ground-truth topology. We find that the functional network topology revealed by Transfer Entropy depends qualitatively on the time-dependent dynamic state of the network (bursting or non-bursting). Thus, by conditioning with respect to the global mean activity, we improve the performance of our method. This allows us to focus the analysis on specific dynamical regimes of the network in which the inferred functional connectivity is shaped by monosynaptic excitatory connections, rather than by collective synchrony. Our method can discriminate between actual causal influences between neurons and spurious non-causal correlations due to light scattering artifacts, which inherently affect the quality of fluorescence imaging. Compared to other reconstruction strategies such as cross-correlation or Granger Causality methods, our method based on improved Transfer Entropy is remarkably more accurate. In particular, it provides a good estimation of the excitatory network clustering coefficient, allowing for discrimination between weakly and strongly clustered topologies. Finally, we demonstrate the applicability of our method to analyses of real recordings of in vitro disinhibited cortical cultures where we suggest that excitatory connections
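The core quantity above, Transfer Entropy, can be illustrated on discrete sequences. The sketch below is the plain history-length-1 estimator, TE(X→Y) = Σ p(y₁, y₀, x₀) log₂[p(y₁|y₀, x₀) / p(y₁|y₀)], not the paper's improved, calcium-imaging-specific variant; the binary test signals are invented.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Transfer entropy TE(X -> Y) in bits for discrete sequences,
    history length 1: how much knowing x_t improves prediction of y_{t+1}
    beyond knowing y_t alone."""
    x, y = list(x), list(y)
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))  # (y1, y0, x0) counts
    cond = Counter(zip(y[:-1], x[:-1]))            # (y0, x0) counts
    pairs = Counter(zip(y[1:], y[:-1]))            # (y1, y0) counts
    marg = Counter(y[:-1])                         # y0 counts
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond = c / cond[(y0, x0)]        # p(y1 | y0, x0)
        p_marg = pairs[(y1, y0)] / marg[y0]  # p(y1 | y0)
        te += p_joint * np.log2(p_cond / p_marg)
    return te

# y copies x with a one-step delay, so information flows x -> y only.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 1000).tolist()
y = [0] + x[:-1]
print(transfer_entropy(x, y) > transfer_entropy(y, x))  # True
```

The asymmetry of the measure is what lets it suggest causal direction, unlike symmetric cross-correlation.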
Automatic malware analysis an emulator based approach
Yin, Heng
2012-01-01
Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and has caused billions of dollars in damages each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive in striking against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regards to this new analysis framework, a series of analysis techniques for automatic malware analy
Polyphase Order Analysis Based on Convolutional Approach
M. Drutarovsky
1999-01-01
The condition of rotating machines can be determined by measuring periodic frequency components in the vibration signal which are directly related to the (typically changing) rotational speed. Classical spectrum analysis with a constant sampling frequency is not an appropriate analysis method because of spectral smearing. Spectral analysis of a vibration signal sampled synchronously with the angle of rotation, known as order analysis, suppresses spectral smearing even with variable rotational ...
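The angle-synchronous sampling described above can be sketched by resampling a time signal at uniform shaft-angle increments via interpolation. The accelerating-shaft profile and the order-3 component below are invented test data, not the paper's; this is a generic illustration of why order analysis avoids smearing.

```python
import numpy as np

def angular_resample(t, signal, shaft_angle, samples_per_rev=64):
    """Resample a vibration signal at uniform shaft-angle increments so that
    an FFT of the result resolves orders (multiples of the rotation rate)
    without spectral smearing. `shaft_angle` is the accumulated rotation
    angle (rad) at each time in `t` and must be monotonically increasing."""
    revs = int(shaft_angle[-1] / (2 * np.pi))       # whole revolutions covered
    n_out = revs * samples_per_rev
    uniform_angle = np.arange(n_out) * 2 * np.pi / samples_per_rev
    uniform_t = np.interp(uniform_angle, shaft_angle, t)  # angle -> time
    return np.interp(uniform_t, t, signal)                # time -> amplitude

# Accelerating shaft with a vibration component locked to order 3.
t = np.linspace(0.0, 10.0, 20000)
angle = 2 * np.pi * (1.0 * t + 0.2 * t**2)  # speed ramps from 1 to 5 rev/s
sig = np.sin(3 * angle)                     # third-order component

res = angular_resample(t, sig, angle)
revs = int(angle[-1] / (2 * np.pi))
order = np.argmax(np.abs(np.fft.rfft(res))[1:]) + 1  # dominant FFT bin
print(round(order / revs))  # 3: the component appears cleanly at order 3
```

A constant-rate FFT of `sig` would smear this component across a band of frequencies as the speed ramps; in the angle domain it collapses to a single order line.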
ANALYSIS OF CIRCUIT TOLERANCE BASED ON RANDOM SET THEORY
Anonymous
2008-01-01
Monte Carlo analysis has been an accepted method for circuit tolerance analysis, but its heavy computational complexity has always prevented its application. Based on random set theory, this paper presents a simple and flexible tolerance analysis method to estimate circuit yield. It is an alternative to Monte Carlo analysis, but reduces the number of calculations dramatically.
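For reference, the baseline Monte Carlo yield estimate that the paper seeks to replace can be sketched on a toy circuit. The resistive divider, tolerance, and spec below are invented illustration values; the method is simply "draw components within tolerance, count how often the spec is met."

```python
import numpy as np

def divider_yield(r1_nom, r2_nom, tol, vin, spec, n=100_000, seed=1):
    """Monte Carlo yield of a resistive divider: the fraction of random
    component draws (uniform within +/- tol of nominal) whose output
    voltage vout = vin * R2 / (R1 + R2) falls inside spec = (lo, hi)."""
    rng = np.random.default_rng(seed)
    r1 = r1_nom * rng.uniform(1 - tol, 1 + tol, n)
    r2 = r2_nom * rng.uniform(1 - tol, 1 + tol, n)
    vout = vin * r2 / (r1 + r2)
    lo, hi = spec
    return float(np.mean((vout >= lo) & (vout <= hi)))

# 5% resistors, 10 V input, output spec 5 V +/- 0.2 V
print(divider_yield(1e3, 1e3, 0.05, 10.0, (4.8, 5.2)))
```

The cost driver is clear from the sketch: every yield estimate needs on the order of 10^5 full circuit evaluations, which is exactly the burden the random-set method aims to avoid.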
Scope-Based Method Cache Analysis
Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin
The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analysis of caches which are hard...
A Requirements Analysis Model Based on QFD
TANG Zhi-wei; Nelson K.H.Tang
2004-01-01
The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.
Location-based Modeling and Analysis: Tropos-based Approach
Ali, Raian; Dalpiaz, Fabiano; Giorgini, Paolo
2008-01-01
The continuous growth of interest in mobile applications makes the concept of location essential to the design and development of software systems. Location-based software is supposed to be able to monitor the location and choose the most appropriate behavior accordingly. In this paper, we propose a novel conceptual framework to model and analyze location-based software. We mainly focus on the social facets of locations, adopting concepts such as social actor, resource, and location-based behavior. Our a...
Statistical Analysis of Nonlinear Processes Based on Penalty Factor
Zhang, Yingwei; Zhang, Chuanfang; Zhang, Wei
2014-01-01
A new process monitoring approach is proposed for handling the nonlinear monitoring problem in the electrofused magnesia furnace (EFMF). Compared to conventional methods, the contributions are as follows: (1) a new kernel principal component analysis based on a loss function in the feature space is proposed; (2) the kernel principal component analysis model is updated based on a forgetting factor; (3) a new iterative kernel principal component analysis algorithm is proposed based on penalty fac...
Web Based Distributed Coastal Image Analysis System Project
National Aeronautics and Space Administration — This project develops a Web based distributed image analysis system processing Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...
Pathway-Based Functional Analysis of Metagenomes
Bercovici, Sivan; Sharon, Itai; Pinter, Ron Y.; Shlomi, Tomer
Metagenomic data enables the study of microbes and viruses through their DNA as retrieved directly from the environment in which they live. Functional analysis of metagenomes explores the abundance of gene families, pathways, and systems, rather than their taxonomy. Through such analysis, researchers are able to identify those functional capabilities most important to organisms in the examined environment. Recently, a statistical framework for the functional analysis of metagenomes was described that focuses on gene families. Here we describe two pathway-level computational models for functional analysis that take into account important, yet previously unaddressed, issues such as pathway size, gene length and overlap in gene content among pathways. We test our models on carefully designed simulated data and propose novel approaches for performance evaluation. Our models significantly improve over the current approach with respect to pathway ranking and the computation of the relative abundance of pathways in environments.
Analysis of Task-based Syllabus
马进胜
2011-01-01
Task-based language teaching is very popular in modern English teaching. It is based on the Task-based Syllabus. The Task-based Syllabus focuses on the learners' communicative competence, which stresses learning by doing. From the theoretical assumptions and definitions of the task, the paper analyzes the components of the task, then points out the merits and demerits of the syllabus. By this means the paper may give some tips to teachers and students when they use task-based language teaching.
Market Based Analysis of Power System Interconnections
Obuševs, A; Turcik, M; Oļeiņikova, I; Junghāns, G
2011-01-01
The analysis in this article is focused on the usage of the transmission grid under a liberalized market with an implicit transmission capacity allocation method, e.g. the Nordic market. Attention is paid to fundamental changes in transmission utilization and its economically effective operation. For interconnection and power flow analysis and loss calculation, a model of the Nordic grid was developed and a transmission loss calculation method was created. The given approach will improve the economic efficiency of system oper...
Free-burning arcs where the work piece acts as an anode were numerically analyzed using a computational domain including the arc itself and its anode region, based on the local thermodynamic equilibrium model. Because the major arc parameters, such as temperature, axial velocity, electric potential difference and pressure-rise from ambient atmospheric pressure, are strongly dependent on the working current, our investigation was concerned with developing a capability to model free-burning argon arcs and considering the energy flux going into the anode at various values of the electrical current (I = 50, 100 and 200 A) by computational fluid dynamics analysis. Predicted temperatures along the z-axis between the electrodes were in fair agreement with existing experimental results. In particular, reasonable relationships between the maximum velocity or temperature and the applied current were predicted, which matched well with other theoretical results. In addition, some discrepancies with other predictions were shown in the results for electric potential and pressure-rise. These are likely related to the omission of the space-charge effect near the electrodes for a simplified unified model and the application of a turbulence model for the steep temperature gradient at the arc edges. - Highlights: • Free-burning argon arcs were investigated at various working currents numerically. • The relationships between the current, the velocity and the temperature were found. • Some discrepancies were shown in the results of pressure and electric potential. • Those should be supplemented by the non-equilibrium situation near electrodes
Description-based and experience-based decisions: individual analysis
Andrey Kudryavtsev
2012-05-01
We analyze behavior in two basic classes of decision tasks: description-based and experience-based. In particular, we compare the predictive power of a number of decision learning models in both kinds of tasks. Unlike most previous studies, we focus on individual, rather than aggregate, behavioral characteristics. We carry out an experiment involving a battery of both description- and experience-based choices between two mixed binary prospects made by each of the participants, and employ a number of formal models for explaining and predicting participants' choices: Prospect theory (PT) (Kahneman and Tversky, 1979), the Expectancy-Valence model (EVL) (Busemeyer and Stout, 2002), and three combinations of these well-established models. We document that the PT and EVL models are best for predicting people's decisions in description- and experience-based tasks, respectively, which is not surprising as these two models were designed specially for these kinds of tasks. Furthermore, we find that models involving linear weighting of gains and losses perform better in both kinds of tasks from the point of view of generalizability and individual parameter consistency. We therefore conclude that, overall, when both prospects are mixed, the assumption of diminishing sensitivity does not improve the models' predictive power for individual decision-makers. Finally, for some of the models' parameters, we document consistency at the individual level between description- and experience-based tasks.
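The PT evaluation of a mixed prospect referred to above can be sketched with the standard Kahneman-Tversky value and weighting functions. The parameter values (α = 0.88, λ = 2.25, γ = 0.61) are the commonly cited 1992 calibration, used here only for illustration; the paper fits its own parameters per individual.

```python
def pt_value(x, alpha=0.88, lam=2.25):
    """Kahneman-Tversky value function: concave for gains, convex and
    steeper for losses (loss aversion, lambda > 1)."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

def pt_weight(p, gamma=0.61):
    """Inverse-S probability weighting: overweights small probabilities,
    underweights moderate-to-large ones."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

def prospect_utility(outcomes, probs):
    """PT evaluation of a simple prospect (separable weighting sketch)."""
    return sum(pt_weight(p) * pt_value(x) for x, p in zip(outcomes, probs))

# A mixed prospect: win 100 or lose 100 with equal chance.
# Loss aversion makes its PT utility negative even though EV = 0.
print(prospect_utility([100.0, -100.0], [0.5, 0.5]) < 0)  # True
```

Setting α = 1 in `pt_value` gives the linear-weighting variant that the abstract reports performing better at the individual level.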
Iris recognition based on subspace analysis
Pravin S.Patil
2014-10-01
Biometrics deals with the uniqueness of an individual arising from their physiological or behavioral characteristics for the purpose of personal identification. Among many biometric techniques, iris recognition is one of the most promising approaches. This paper presents a traditional subspace analysis method for iris recognition. Initially, the eye images are localized in circular form by using Daugman's grid method and the circular Hough transform method. The algorithms for the subspace analysis methods, namely PCA and LDA, are implemented and experimental results are reported. The comparative performance of both algorithms has been observed in terms of recognition rate. Comprehensive experiments were completed on the UPOL and CASIA V1 iris databases.
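The PCA step of the subspace analysis above amounts to fitting principal directions to the training feature vectors and projecting each iris code into that low-dimensional subspace. A minimal SVD-based sketch follows; the sample sizes and random "feature vectors" are placeholders, not the paper's data.

```python
import numpy as np

def pca_fit(X, k):
    """Fit a k-dimensional PCA subspace to row-vector samples via SVD
    of the mean-centered data; rows of Vt are principal directions."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def pca_project(X, mu, components):
    """Project samples into the subspace (the feature-extraction step);
    recognition then compares these low-dimensional codes."""
    return (X - mu) @ components.T

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))   # stand-in for 50 iris feature vectors
mu, comps = pca_fit(X, 3)
codes = pca_project(X, mu, comps)
print(codes.shape)  # (50, 3)
```

Matching a probe image then reduces to a nearest-neighbor search among the projected codes; LDA differs only in choosing directions that separate classes rather than maximize variance.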
Performance Analysis Based on Timing Simulation
Nielsen, Christian Dalsgaard; Kishinevsky, Michael
Determining the cycle time and a critical cycle is a fundamental problem in the analysis of concurrent systems. We solve this problem using timing simulation of an underlying Signal Graph (an extension of Marked Graphs). For a Signal Graph with n vertices and m arcs our algorithm has the polynomial time complexity O(b²m), where b is the number of vertices with initially marked in-arcs (typically b≪n). The algorithm has a clear semantics and a low descriptive complexity. We illustrate the use of the algorithm by applying it to performance analysis of asynchronous circuits.
Environmentally based Cost-Benefit Analysis
The fundamentals of the basic elements of a new comprehensive economic assessment, MILA, developed in Sweden with inspiration from the Total Cost Assessment model, are presented. The core of the MILA approach is an expanded cost and benefit inventory. But MILA also includes a complementary internal waste stream analysis, a tool for evaluating environmental conflicts in monetary terms, an extended time horizon and direct allocation of costs and revenues to products and processes. However, MILA does not ensure profitability for environmentally sound projects. Essentially, MILA is an approach to refining the investment and profitability analysis of a project, investment or product. 109 refs., 38 figs
Gender-Based Analysis On-Line Dialogue. Final Report.
2001
An online dialogue on gender-based analysis (GBA) was held from February 15 to March 7, 2001. Invitations and a background paper titled "Why Gender-Based Analysis?" were sent to 350 women's organizations and individuals throughout Canada. Efforts were made to ensure that aboriginal and Metis women, visible minority women, and women with special…
Science Based Governance? EU Food Regulation Submitted to Risk Analysis
Szajkowska, A.; Meulen, van der B.M.J.
2014-01-01
Anna Szajkowska and Bernd van der Meulen analyse in their contribution, Science Based Governance? EU Food Regulation Submitted to Risk Analysis, the scope of application of risk analysis and the precautionary principle in EU food safety regulation. To what extent does this technocratic, science-base
SSI Analysis for Base-Isolated Nuclear Power Plants
Much higher safety is required of NPPs than of other structures. An earthquake is one of the most important external events governing the safety of NPPs. Application of a base isolation system to NPPs can reduce the risk from earthquakes. At present, a soil-structure interaction (SSI) analysis is essential in the seismic design of NPPs in consideration of ground-structure interaction. In the seismic analysis of a base-isolated NPP, it is difficult to consider the nonlinear properties of seismic isolation bearings because SSI analysis programs such as SASSI are linear. Thus, in this study, SSI analyses are performed using an iterative approach that accounts for the material nonlinearity of the isolators. By performing the SSI analysis with this iterative approach, the nonlinear properties of the isolators can be considered. The results of the SSI analysis show that the horizontal response of the NPP with base isolation systems is significantly reduced
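The iterative idea above — run a linear analysis with a trial stiffness, read off the response, update the isolator's secant stiffness from its nonlinear backbone curve, repeat until convergence — can be sketched on a single bilinear isolator under a static load. This is a generic equivalent-linearization illustration with invented numbers, not the study's SASSI workflow.

```python
def secant_stiffness_iteration(load, k1, k2, d_yield, tol=1e-8, max_iter=200):
    """Equivalent-linear treatment of a bilinear isolator inside a linear
    solve: trial stiffness -> displacement -> secant stiffness from the
    nonlinear backbone -> repeat until the displacement converges."""
    def backbone(d):  # bilinear force-displacement curve
        if d <= d_yield:
            return k1 * d
        return k1 * d_yield + k2 * (d - d_yield)

    d = load / k1                    # initial "linear analysis"
    for _ in range(max_iter):
        k_eff = backbone(d) / d      # secant stiffness at current response
        d_new = load / k_eff         # repeat the linear solve
        if abs(d_new - d) < tol:
            return d_new
        d = d_new
    return d

# Isolator: k1 = 100, k2 = 10, yields at d = 1 (force 100).
# A load of 150 must be carried at d = 1 + (150 - 100)/10 = 6.
print(round(secant_stiffness_iteration(150.0, 100.0, 10.0, 1.0), 4))  # 6.0
```

In the actual SSI setting the "linear solve" is the full frequency-domain analysis and the response is the isolator displacement history, but the fixed-point structure of the iteration is the same.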
Analysis of Financial Position Based on the Balance Sheet
Spineanu-Georgescu Luciana
2011-01-01
Analysis of financial position based on the balance sheet is mainly aimed at assessing the extent to which the financial structure chosen by the firm, namely its financial resources, covers the needs reflected in the balance sheet. This is done through an analysis of balance sheet financial imbalances known as horizontal analysis.
Transportation Mode Choice Analysis Based on Classification Methods
Zeņina, N; Borisovs, A
2011-01-01
Mode choice analysis has received the most attention among discrete choice problems in travel behavior literature. Most traditional mode choice models are based on the principle of random utility maximization derived from econometric theory. This paper investigates performance of mode choice analysis with classification methods - decision trees, discriminant analysis and multinomial logit. Experimental results have demonstrated satisfactory quality of classification.
Encounter-based worms: Analysis and Defense
Tanachaiwiwat, Sapon
2007-01-01
An encounter-based network is a frequently-disconnected wireless ad-hoc network requiring immediate neighbors to store and forward aggregated data for information dissemination. Using traditional approaches such as gateways or firewalls to deter worm propagation in encounter-based networks is inappropriate. We propose a worm interaction approach that relies upon automated beneficial worm generation to alleviate the problems of worm propagation in such networks. To understand the dynamics of worm interactions and their performance, we mathematically model worm interactions based on major worm interaction factors, including worm interaction types, network characteristics, and node characteristics, using ordinary differential equations, and analyze their effects on our proposed metrics. We validate our proposed model using extensive synthetic and trace-driven simulations. We find that all worm interaction factors significantly affect the pattern of worm propagation. For example, immunization linearly decrea...
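The ODE-based modeling described above can be illustrated with a toy epidemic-style system in which a beneficial worm both claims susceptible nodes and patches infected ones. The rate structure and parameter values here are invented for illustration and are a simplification, not the paper's actual equations.

```python
def worm_interaction(beta, alpha, kappa, s0, i0, b0, dt=0.01, steps=5000):
    """Euler integration of a toy worm-interaction model (node fractions):
      s' = -beta*s*i - alpha*s*b   (malicious infection, beneficial spread)
      i' =  beta*s*i - kappa*i*b   (beneficial worm patches infected nodes)
      b' =  alpha*s*b + kappa*i*b  (beneficial worm population grows)
    Returns the final infected fraction."""
    s, i, b = s0, i0, b0
    for _ in range(steps):
        infect = beta * s * i
        immunize = alpha * s * b
        patch = kappa * i * b
        s += (-infect - immunize) * dt
        i += (infect - patch) * dt
        b += (immunize + patch) * dt
    return i

# Without a beneficial worm nearly every node ends up infected;
# releasing one suppresses the outbreak.
no_defense = worm_interaction(0.5, 0.3, 0.4, s0=0.99, i0=0.01, b0=0.0)
defended = worm_interaction(0.5, 0.3, 0.4, s0=0.89, i0=0.01, b0=0.1)
print(defended < no_defense)  # True
```

Sweeping the interaction rates (alpha, kappa) in such a model is how the effect of each worm interaction factor on the propagation pattern can be analyzed.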
Accelerator based techniques for aerosol analysis
At the 3 MV Tandetron accelerator of the LABEC laboratory of INFN (Florence, Italy), an external beam facility is fully dedicated to PIXE-PIGE measurements of the elemental composition of atmospheric aerosols. Examples from recent monitoring campaigns, performed in urban and remote areas, both on a daily basis and with high time resolution, as well as with size selection, will be presented. It will be shown how PIXE can provide unique information in aerosol studies or play a complementary role to traditional chemical analysis. Finally, a short presentation of 14C analysis of atmospheric aerosol by Accelerator Mass Spectrometry (AMS), for evaluating the contributions from either fossil fuel combustion or modern sources (wood burning, biogenic activity), will be given. (author)
Structural Analysis of Plate Based Tensegrity Structures
Hald, Frederik; Kirkegaard, Poul Henning; Damkilde, Lars
2013-01-01
Plate tensegrity structures combine tension cables with a cross-laminated timber plate and can then form e.g. a roof structure. The topology of plate tensegrity structures is investigated through a parametric investigation. Plate tensegrity structures are investigated, and a method for... determination of the structure's pre-stresses is used. A parametric investigation is performed to determine a more optimized form of the plate-based tensegrity structure. Conclusions on the use of plate-based tensegrity in civil engineering and further research areas are discussed...
Symbolic Analysis of OTRAs-Based Circuits
C. Sánchez-López
2011-04-01
A new nullor-based model to describe the behavior of Operational Transresistance Amplifiers (OTRAs) is introduced. The new model is composed of four nullors and three grounded resistors. As a consequence, standard nodal analysis can be applied to compute fully-symbolic small-signal characteristics of OTRA-based analog circuits, and the nullor-based OTRA model can be used in CAD tools. In this manner, the fully-symbolic transfer functions of several application circuits, such as filters and oscillators, can easily be approximated.
Iris recognition based on subspace analysis
Pravin S.Patil
2014-01-01
Biometrics deals with the uniqueness of an individual arising from their physiological or behavioral characteristics for the purpose of personal identification. Among many biometric techniques, iris recognition is one of the most promising approaches. This paper presents a traditional subspace analysis method for iris recognition. Initially, the eye images have been localized in circular form by using Daugman's grid method and the circular Hough transform method. The algorithms for subspace analy...
Texton-based analysis of paintings
van der Maaten, Laurens J. P.; Postma, Eric O.
2010-08-01
The visual examination of paintings is traditionally performed by skilled art historians using their eyes. Recent advances in intelligent systems may support art historians in determining the authenticity or date of creation of paintings. In this paper, we propose a technique for the examination of brushstroke structure that views the wildly overlapping brushstrokes as texture. The analysis of the painting texture is performed with the help of a texton codebook, i.e., a codebook of small prototypical textural patches. The texton codebook can be learned from a collection of paintings. Our textural analysis technique represents paintings in terms of histograms that measure the frequency with which the textons in the codebook occur in the painting (so-called texton histograms). We present experiments that show the validity and effectiveness of our technique for textural analysis on a collection of digitized high-resolution reproductions of paintings by Van Gogh and his contemporaries. As texton histograms cannot easily be interpreted by art experts, the paper proposes two approaches to visualize the results of the textural analysis. The first approach visualizes the similarities between the histogram representations of paintings by employing a recently proposed dimensionality reduction technique, called t-SNE. We show that t-SNE reveals a clear separation of paintings created by Van Gogh and those created by other painters. In addition, the period of creation is faithfully reflected in the t-SNE visualizations. The second approach visualizes the similarities and differences between paintings by highlighting regions in a painting in which the textural structure of the painting is unusual. We illustrate the validity of this approach by means of an experiment in which we highlight regions in a painting by Monet that are not very "Van Gogh-like". Taken together, we believe the tools developed in this study are well capable of assisting art historians in support of
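The texton-histogram representation above can be sketched in a few lines: tile the image into patches, assign each patch to its nearest codebook texton, and count frequencies. The codebook and "image" below are random placeholders; the paper learns its codebook from actual paintings.

```python
import numpy as np

# Texton-histogram sketch with a hypothetical (random) codebook.
rng = np.random.default_rng(0)
codebook = rng.normal(size=(8, 5 * 5))   # 8 textons, each a 5x5 patch
image = rng.normal(size=(64, 64))        # stand-in for a painting scan

def texton_histogram(img, codebook, patch=5):
    counts = np.zeros(len(codebook))
    for i in range(0, img.shape[0] - patch + 1, patch):
        for j in range(0, img.shape[1] - patch + 1, patch):
            p = img[i:i + patch, j:j + patch].ravel()
            # nearest texton by squared Euclidean distance
            counts[np.argmin(((codebook - p) ** 2).sum(axis=1))] += 1
    return counts / counts.sum()          # normalized frequency histogram

h = texton_histogram(image, codebook)
```

Two paintings can then be compared by any histogram distance, which is what feeds the t-SNE visualization step.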
CLUSTERING-BASED ANALYSIS OF TEXT SIMILARITY
Bovcon , Borja
2013-01-01
The focus of this thesis is the comparison of text-document similarity analysis using clustering algorithms. We begin by defining the main problem and then proceed to describe the two most used text-document representation techniques, where we present word filtering methods and their importance, Porter's algorithm and the tf-idf term weighting algorithm. We then proceed to apply all previously described algorithms to selected data sets, which vary in size and compactness. Following this, we ...
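The tf-idf weighting and similarity computation described above can be sketched as follows (tokenization and Porter stemming are omitted for brevity; the documents are invented examples):

```python
import math
from collections import Counter

# Minimal tf-idf + cosine-similarity pipeline on toy documents.
docs = ["the cat sat on the mat",
        "the dog sat on the log",
        "stocks fell on monday"]

tokenized = [d.split() for d in docs]
N = len(tokenized)
df = Counter(w for doc in tokenized for w in set(doc))   # document frequency
idf = {w: math.log(N / df[w]) for w in df}

def tfidf(doc):
    tf = Counter(doc)
    return {w: tf[w] / len(doc) * idf[w] for w in tf}

def cosine(a, b):
    dot = sum(a[w] * b.get(w, 0.0) for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

vecs = [tfidf(d) for d in tokenized]
```

The resulting pairwise similarities are exactly what a clustering algorithm (e.g. k-means or hierarchical clustering) consumes in the comparison the thesis performs.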
Wavelet Based Fractal Analysis of Airborne Pollen
Degaudenzi, M. E.; Arizmendi, C. M.
1998-01-01
The most abundant biological particles in the atmosphere are pollen grains and spores. Self-protection from pollen allergy is possible through information on future pollen contents in the air. In spite of the importance of airborne pollen concentration forecasting, it has not been possible to predict the pollen concentrations with great accuracy, and about 25% of the daily pollen forecasts have resulted in failures. Previous analysis of the dynamic characteristics of atmospheric pollen time series indicates that the system can be described by a low-dimensional chaotic map.
Thanatophoric dysplasia: case-based bioethical analysis
Edgar Abarca López; Alejandra Rodríguez Torres; Donovan Casas Patiño; Esteban Espíndola Benítez
2014-04-01
This paper presents a case report of thanatophoric dysplasia diagnosed in the prenatal period using ultrasound standards. The course of the pregnancy, birth process, and postnatal period is described. This report invites bioethical analysis using its principles, appealing to human dignity, diversity and otherness, particularly in the mother-child dyad and their family. An early diagnosis allows parental support as they face the course of this condition and its potentially fatal outcome.
Movement Pattern Analysis Based on Sequence Signatures
Seyed Hossein Chavoshi
2015-09-01
Increased affordability and deployment of advanced tracking technologies have led researchers from various domains to analyze the resulting spatio-temporal movement data sets for the purpose of knowledge discovery. Two different approaches can be considered in the analysis of moving objects: quantitative analysis and qualitative analysis. This research focuses on the latter and uses the qualitative trajectory calculus (QTC), a type of calculus that represents qualitative data on moving point objects (MPOs), and establishes a framework to analyze the relative movement of multiple MPOs. A visualization technique called sequence signature (SESI) is used, which enables mapping QTC patterns in a 2D indexed rasterized space in order to evaluate the similarity of relative movement patterns of multiple MPOs. The applicability of the proposed methodology is illustrated by means of two practical examples of interacting MPOs: cars on a highway and body parts of a samba dancer. The results show that the proposed method can be effectively used to analyze interactions of multiple MPOs in different domains.
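The core QTC relation can be sketched for two moving point objects: for each object, is it moving towards (-), away from (+), or staying stable (0) with respect to the other? The function below is a simplified reading of QTC Basic, comparing each object's distance to the other object's previous position; the sample points are invented.

```python
import math

# Simplified QTC_Basic relation between two moving point objects (MPOs).
def qtc_b(p1_prev, p1_now, p2_prev, p2_now, eps=1e-9):
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # each object's movement is judged against the OTHER object's old position
    d1_before = dist(p1_prev, p2_prev)
    d1_after = dist(p1_now, p2_prev)
    d2_before = dist(p2_prev, p1_prev)
    d2_after = dist(p2_now, p1_prev)
    sign = lambda d: '-' if d < -eps else ('+' if d > eps else '0')
    return sign(d1_after - d1_before), sign(d2_after - d2_before)

# Two cars approaching each other head-on yield the relation ('-', '-'):
rel = qtc_b((0, 0), (1, 0), (10, 0), (9, 0))
```

A sequence of such symbols over time is what the SESI technique rasterizes into its 2D indexed space.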
Remote sensing based on hyperspectral data analysis
Sharifahmadian, Ershad
In remote sensing, accurate identification of far objects, especially concealed objects, is difficult. In this study, to improve object detection from a distance, hyperspectral imaging and wideband technology are employed, with the emphasis on wideband radar. As the wideband data include a broad range of frequencies, they can reveal information about both the surface of the object and its content. Two main contributions are made in this study: 1) Developing the concept of return loss for target detection: Unlike typical radar detection methods, which use the radar cross section to detect an object, it is possible to enhance the process of detection and identification of concealed targets using wideband radar based on the electromagnetic characteristics --conductivity, permeability, permittivity, and return loss-- of materials. During the identification process, collected wideband data are evaluated against information from a wideband signature library which has already been built. In fact, several classes (e.g. metal, wood, etc.) and subclasses (e.g. metals with high conductivity) have been defined based on their electromagnetic characteristics. Materials in a scene are then classified based on these classes. As an example, materials with high electrical conductivity can be conveniently detected. In fact, increasing relative conductivity leads to a reduction in the return loss. Therefore, metals with high conductivity (e.g. copper) show stronger radar reflections compared with metals with low conductivity (e.g. stainless steel). Thus, it is possible to appropriately discriminate copper from stainless steel. 2) Target recognition techniques: To detect and identify targets, several techniques have been proposed, in particular the Multi-Spectral Wideband Radar Image (MSWRI), which is able to localize and identify concealed targets. The MSWRI is based on the theory of the robust Capon beamformer. During the identification process, information from the wideband signature library is utilized.
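The conductivity/return-loss relationship invoked above can be sketched with standard plane-wave formulas for an air/conductor interface. The frequency and material values are illustrative assumptions, and the good-conductor magnitude approximation is used rather than the full complex reflection coefficient.

```python
import math

# Return loss at an air/conductor interface (good-conductor approximation).
ETA0 = 376.73            # intrinsic impedance of free space, ohms
MU0 = 4e-7 * math.pi     # vacuum permeability, H/m

def return_loss_db(sigma, freq_hz=10e9):
    omega = 2 * math.pi * freq_hz
    eta_mag = math.sqrt(omega * MU0 / sigma)     # |eta| of a good conductor
    gamma = (ETA0 - eta_mag) / (ETA0 + eta_mag)  # reflection magnitude (approx.)
    return -20 * math.log10(gamma)               # return loss in dB

rl_copper = return_loss_db(5.8e7)   # copper: higher conductivity
rl_steel = return_loss_db(1.4e6)    # stainless steel: lower conductivity
# higher conductivity -> |gamma| closer to 1 -> lower (smaller) return loss
```

This reproduces the qualitative claim in the abstract: copper reflects more strongly than stainless steel, so its return loss is smaller.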
Google glass based immunochromatographic diagnostic test analysis
Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan
2015-03-01
Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.
Knowledge-based analysis of phenotypes
Hoendorf, Robert
2016-01-27
Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will present a formal framework for describing phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.
A Goal based methodology for HAZOP analysis
Rossing, Netta Liin; Lind, Morten; Jensen, Niels;
2010-01-01
directly for implementation into a computer aided reasoning tool for HAZOP studies to perform root cause and consequence analysis. Such a tool will facilitate finding causes far away from the site of the deviation. A Functional HAZOP Assistant is proposed and investigated in a HAZOP study of an industrial... to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of...
Communication Error Analysis Method based on CREAM
Communication error has been considered as a primary reason of many incidents and accidents in nuclear industry. In order to prevent these accidents, an analysis method of communication errors is proposed. This study presents a qualitative method to analyze communication errors. The qualitative method focuses on finding a root cause of the communication error and predicting the type of communication error which could happen in nuclear power plants. We develop context conditions and antecedent-consequent links of influential factors related to communication error. A case study has been conducted to validate the applicability of the proposed methods
Web Application Comprehension Based on Dependence Analysis
WU Jun-hua; XU Bao-wen; JIANG Ji-xiang
2004-01-01
Much research indicates that a lot of money and time are spent on maintaining and modifying delivered programs, so policies to support program comprehension are very important. Program comprehension is a crucial and difficult task. Insufficient design, illogical code structure, and short documentation increase the difficulty of comprehension. Developing a Web application is usually a process of quick implementation and delivery. In addition, a Web application is generally coded by combining mark-up language statements with embedded applets. Such a programming mode adversely affects the comprehension of Web applications. This paper proposes a method to improve the understanding of Web applications through dependence analysis and slicing technology.
Analysis and Protection of SIP based Services
Ferdous, Raihana
2014-01-01
Multimedia communications over IP are booming as they offer higher flexibility and more features than traditional voice and video services. IP telephony, known as Voice over IP (VoIP), is one of the commercially most important emerging trends in multimedia communications over IP. Due to its flexibility and descriptive power, the Session Initiation Protocol (SIP) is becoming the root of many session-based applications such as VoIP and media streaming that are used by a growing number of use...
Analysis of Hashrate-Based Double Spending
Rosenfeld, Meni
2014-01-01
Bitcoin is the world's first decentralized digital currency. Its main technical innovation is the use of a blockchain and hash-based proof of work to synchronize transactions and prevent double-spending the currency. While the qualitative nature of this system is well understood, there is widespread confusion about its quantitative aspects and how they relate to attack vectors and their countermeasures. In this paper we take a look at the stochastic processes underlying typical attacks and th...
Interest Based Financial Intermediation: Analysis and Solutions
Shaikh, Salman
2012-01-01
Interest is prohibited in all monotheist religions. Apart from religion, interest is also regarded as an unjust price of money capital by pioneering secular philosophers as well as some renowned economists. However, it is argued by some economists that the modern-day, market-driven interest rate in a competitive financial market is different from usury and that interest-based financial intermediation has served a useful purpose in the allocation of resources as well as in the allocation of risk, given the ...
Value-Based Analysis of Mobile Tagging
Oguzhan Aygoren; Kaan Varnali
2011-01-01
Innovative use of the mobile medium in delivering customer value presents unprecedented opportunities for marketers. Various types of mobile applications have evolved to provide ubiquitous and instant customer service to capitalize on this opportunity. One application is mobile tagging, a mobile-based innovative tool for convergence marketing. The accumulated academic knowledge on mobile marketing lacks consumer-centric information about this phenomenon. This paper addresses this issue and co...
Confidence-Based Learning in Investment Analysis
Serradell-Lopez, Enric; Lara-Navarra, Pablo; Castillo-Merino, David; González-González, Inés
The aim of this study is to determine the effectiveness of using multiple-choice tests in subjects related to administration and business management. To this end we used a multiple-choice test with specific questions to verify the extent of knowledge gained and the confidence and trust in the answers. The tests were performed on a group of 200 students in the bachelor's degree in Business Administration and Management. The analyses have been implemented in one subject within the scope of investment analysis and measured the level of knowledge gained and the degree of trust and security in the responses at two different times of the course. The measurements took into account different levels of difficulty in the questions asked and the time spent by students to complete the test. The results confirm that students are generally able to obtain more knowledge along the way and show increases in the degree of trust and confidence in their answers. It is confirmed that the difficulty levels of the questions, set a priori by the heads of the subjects, are related to the levels of security and confidence in the answers. It is estimated that the improvement in the skills learned is viewed favourably by businesses and is especially important for the job placement of students.
Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging
Kiuru Aaro; Kormano Martti; Svedström Erkki; Liang Jianming; Järvi Timo
2003-01-01
The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion ana...
Engineering Analysis Using a Web-based Protocol
Schoeffler, James D.; Claus, Russell W.
2002-01-01
This paper reviews the development of a web-based framework for engineering analysis. A one-dimensional, high-speed analysis code called LAPIN was used in this study, but the approach can be generalized to any engineering analysis tool. The web-based framework enables users to store, retrieve, and execute an engineering analysis from a standard web-browser. We review the encapsulation of the engineering data into the eXtensible Markup Language (XML) and various design considerations in the storage and retrieval of application data.
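The XML encapsulation of analysis data described above can be sketched with the standard library; the element and attribute names below are invented for illustration, not those of the LAPIN framework.

```python
import xml.etree.ElementTree as ET

# Encapsulate engineering-analysis inputs as XML (hypothetical schema).
case = ET.Element("analysis_case", name="inlet_test")
inputs = ET.SubElement(case, "inputs")
ET.SubElement(inputs, "parameter", name="mach", value="2.5")
ET.SubElement(inputs, "parameter", name="altitude_m", value="12000")

xml_text = ET.tostring(case, encoding="unicode")

# ...and recover the parameters on the server side before executing the run:
parsed = ET.fromstring(xml_text)
params = {p.get("name"): float(p.get("value")) for p in parsed.iter("parameter")}
```

Storing runs as XML in this way is what lets a browser client store, retrieve, and re-execute an analysis without knowing the tool's internal file formats.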
Transfer entropy--a model-free measure of effective connectivity for the neurosciences.
Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon
2011-02-01
Understanding causal relationships, or effective connectivity, between parts of the brain is of utmost importance because a large part of the brain's activity is thought to be internally generated and, hence, quantifying stimulus response relationships alone does not fully describe brain dynamics. Past efforts to determine effective connectivity mostly relied on model-based approaches such as Granger causality or dynamic causal modeling. Transfer entropy (TE) is an alternative measure of effective connectivity based on information theory. TE does not require a model of the interaction and is inherently non-linear. We investigated the applicability of TE as a metric in a test for effective connectivity to electrophysiological data based on simulations and magnetoencephalography (MEG) recordings in a simple motor task. In particular, we demonstrate that TE improved the detectability of effective connectivity for non-linear interactions, and for sensor-level MEG signals where linear methods are hampered by signal cross-talk due to volume conduction. PMID:20706781
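For discrete data, transfer entropy can be estimated with a simple plug-in (histogram) estimator, sketched below for binary series with history length 1. Real analyses, including the one above, need careful embedding, binning, and bias correction; the data here are synthetic.

```python
import random
from collections import Counter
from math import log2

# Plug-in transfer-entropy estimator, TE(x -> y) in bits, history length 1:
# TE = sum p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1}|y_t,x_t) / p(y_{t+1}|y_t) ]
def transfer_entropy(x, y):
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))
    pairs_yx = Counter(zip(y[:-1], x[:-1]))
    pairs_yy = Counter(zip(y[1:], y[:-1]))
    singles = Counter(y[:-1])
    n = len(x) - 1
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_self = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * log2(p_cond_full / p_cond_self)
    return te

random.seed(1)
x = [random.randint(0, 1) for _ in range(5000)]
y = [0] + x[:-1]     # y copies x with one step of lag: x drives y, not vice versa
```

On this toy system TE(x→y) is close to one bit while TE(y→x) is near zero, i.e. the measure recovers the direction of the interaction without any model of it.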
Network Anomaly Detection Based on Wavelet Analysis
Lu, Wei; Ghorbani, Ali A.
2008-11-01
Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network where five attack types are successfully detected from over 30 million flows.
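The wavelet approximation at the heart of this approach can be illustrated with a one-level Haar decomposition: the approximation coefficients give the smoothed traffic signal, while the detail coefficients localize abrupt changes of the kind anomaly detectors threshold on. This is illustrative only, not the paper's full pipeline, and the signal is a toy example.

```python
import math

# One-level Haar wavelet decomposition and its exact inverse.
def haar_step(signal):
    approx = [(a + b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    detail = [(a - b) / math.sqrt(2) for a, b in zip(signal[::2], signal[1::2])]
    return approx, detail

def haar_inverse(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out += [(a + d) / math.sqrt(2), (a - d) / math.sqrt(2)]
    return out

sig = [4.0, 4.0, 4.0, 4.0, 4.0, 20.0, 4.0, 4.0]   # traffic with one spike
approx, detail = haar_step(sig)
# the largest |detail| coefficient marks the location of the anomaly
```

Applying the step recursively to the approximation coefficients yields the multi-resolution view the paper combines with system identification.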
Sandia National Laboratories analysis code data base
Peterson, C.W.
1994-11-01
Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.
Mental EEG Analysis Based on Infomax Algorithm
WU Xiao-pei; GUO Xiao-jing; ZANG Dao-xin; SHEN Qian
2004-01-01
The patterns of EEG change with the mental tasks performed by the subject. In the field of EEG signal analysis and application, studying the patterns of mental EEG and then using them to classify mental tasks has significant scientific meaning and great application value. But because of the various artifacts present in EEG, pattern detection of EEG under normal mental states is a very difficult problem. In this paper, Independent Component Analysis is applied to EEG signals collected during the performance of different mental tasks. The experimental results show that when one subject performs a single mental task in different trials, the independent components of the EEG are very similar. This means that the independent components can be used as mental EEG patterns to classify the different mental tasks.
Wavelet Based Fractal Analysis of Airborne Pollen
Degaudenzi, M E
1999-01-01
The most abundant biological particles in the atmosphere are pollen grains and spores. Self-protection from pollen allergy is possible through information on future pollen contents in the air. In spite of the importance of airborne pollen concentration forecasting, it has not been possible to predict the pollen concentrations with great accuracy, and about 25% of the daily pollen forecasts have resulted in failures. Previous analysis of the dynamic characteristics of atmospheric pollen time series indicates that the system can be described by a low-dimensional chaotic map. We apply the wavelet transform to study the multifractal characteristics of an airborne pollen time series. We find the persistence behaviour associated with low pollen concentration values and with the rarest events of highest pollen concentration values. The information and correlation dimensions correspond to a chaotic system showing loss of information with time evolution.
Computational based functional analysis of Bacillus phytases.
Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti
2016-02-01
Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate, digesting the otherwise indigestible phytate present in seeds and grains and therefore providing digestible phosphorus, calcium and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. The Bacillus phytase is very suitable for use in animal feed because of its optimum pH and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens, exploring their physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917
Face Recognition Based on Principal Component Analysis
Ali Javed
2013-02-01
The purpose of the proposed research work is to develop a computer system that can recognize a person by comparing the characteristics of the face to those of known individuals. The main focus is on frontal two-dimensional images that are taken in a controlled environment, i.e. the illumination and the background are constant. All the other methods of identification and verification, like iris scan or fingerprint scan, require high-quality and costly equipment, but in face recognition we only require a normal camera giving us a 2-D frontal image of the person that will be used for the process of recognition. The Principal Component Analysis technique has been used in the proposed face recognition system. The purpose is to compare the results of the technique under different conditions and to find the most efficient approach for developing a facial recognition system.
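The PCA step can be sketched as follows: center the training images, take the top eigenvectors of the covariance matrix, and compare probes by distance in the projected subspace. The "images" below are synthetic random vectors standing in for aligned frontal photographs.

```python
import numpy as np

# PCA ("eigenfaces") sketch on synthetic flattened images.
rng = np.random.default_rng(0)
faces = rng.normal(size=(20, 64))          # 20 training images, 64 pixels each

mean = faces.mean(axis=0)
centered = faces - mean
cov = centered.T @ centered / len(faces)
eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
components = eigvecs[:, ::-1][:, :10]      # keep the top-10 components

def project(img):
    return (img - mean) @ components       # low-dimensional feature vector

# Recognition: a slightly perturbed copy of face 3 should match face 3.
probe = faces[3] + rng.normal(scale=0.01, size=64)
dists = np.linalg.norm(project(faces) - project(probe), axis=1)
```

The index of the smallest distance is the recognized identity; varying illumination or pose then degrades this distance, which is what the comparison "under different conditions" measures.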
Building Extraction from LIDAR Based Semantic Analysis
YU Jie; YANG Haiquan; TAN Ming; ZHANG Guoning
2006-01-01
Extraction of buildings from LIDAR data has been an active research field in recent years. A scheme for building detection and reconstruction from LIDAR data is presented with an object-oriented method which is based on the buildings' semantic rules. Two key steps are discussed: how to group the discrete LIDAR points into single objects and how to establish the buildings' semantic rules. In the end, the buildings are reconstructed in 3D form and three common parametric building models (flat, gabled, hipped) are implemented.
Trajectory Based Behavior Analysis for User Verification
Pao, Hsing-Kuo; Lin, Hong-Yi; Chen, Kuan-Ta; Fadlil, Junaidillah
Many of our activities on computers need a verification step for authorized access. The goal of verification is to distinguish the true account owner from intruders. We propose a general approach for user verification based on user trajectory inputs. The approach is labor-free for users and is likely to avoid possible copying or simulation by other non-authorized users or even automatic programs like bots. Our study focuses on finding the hidden patterns embedded in the trajectories produced by account users. We employ a Markov chain model with Gaussian distributions in its transitions to describe the behavior in the trajectory. To distinguish between two trajectories, we propose a novel dissimilarity measure combined with a manifold-learnt tuning for catching the pairwise relationship. Based on the pairwise relationship, we plug in any effective classification or clustering method for the detection of unauthorized access. The method can also be applied to the task of recognition, predicting the trajectory type without a pre-defined identity. Given a trajectory input, the results show that the proposed method can accurately verify the user identity, or suggest who owns the trajectory if the input identity is not provided.
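The trajectory-scoring idea can be sketched with a Gaussian model of step vectors, a simplification of the paper's Markov chain with Gaussian transitions. The drifts, noise level, and trajectory lengths below are invented for illustration.

```python
import math
import random

# Fit an isotropic Gaussian to a trajectory's step vectors.
def fit_steps(traj):
    steps = [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in zip(traj, traj[1:])]
    mx = sum(s[0] for s in steps) / len(steps)
    my = sum(s[1] for s in steps) / len(steps)
    var = sum((s[0] - mx) ** 2 + (s[1] - my) ** 2 for s in steps) / len(steps)
    return mx, my, max(var, 1e-9)

# Score how well a trajectory matches the fitted step model.
def log_likelihood(traj, model):
    mx, my, var = model
    ll = 0.0
    for (x1, y1), (x2, y2) in zip(traj, traj[1:]):
        dx, dy = x2 - x1 - mx, y2 - y1 - my
        ll += -(dx * dx + dy * dy) / (2 * var) - math.log(2 * math.pi * var)
    return ll

def make_traj(drift, n=200, noise=0.1):
    x = y = 0.0
    pts = [(0.0, 0.0)]
    for _ in range(n):
        x += drift[0] + random.gauss(0, noise)
        y += drift[1] + random.gauss(0, noise)
        pts.append((x, y))
    return pts

random.seed(0)
model = fit_steps(make_traj((1.0, 0.0)))          # enroll the account owner
owner_score = log_likelihood(make_traj((1.0, 0.0)), model)
intruder_score = log_likelihood(make_traj((0.0, 1.0)), model)
```

A verification system then thresholds the score: the owner's fresh trajectory scores far higher than the intruder's under the enrolled model.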
IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES
纳瑟; 刘重庆
2002-01-01
A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by MRF to obtain an image that contains different intensity regions. The gradient values are calculated and then the watershed technique is used. A DIS calculation is made for each pixel to define all the edges (weak or strong) in the image, and the DIS map is obtained. This serves as prior knowledge of the possible region segmentation for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
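The K-means step used for the initial segmentation can be sketched in one dimension: cluster pixel gray levels into K intensity classes. The pixel values below are a toy bimodal example; a real run would use the full image and the DIS map as described.

```python
# 1-D K-means over pixel gray levels (plain Python, deterministic init).
def kmeans_1d(values, k=2, iters=20):
    lo, hi = min(values), max(values)
    # spread initial centers evenly over the intensity range
    centers = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            clusters[min(range(k), key=lambda i: abs(v - centers[i]))].append(v)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

pixels = [10, 12, 11, 9, 200, 198, 202, 199, 10, 201]
centers = kmeans_1d(pixels)   # dark and bright intensity classes
```

Each pixel is then labeled with its nearest center, giving the initial region map that the MRF stage refines.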
The Route Analysis Based On Flight Plan
Feriyanto, Nur; Saleh, Chairul; Fauzi, Achmad; Rachman Dzakiyullah, Nur; Riza Iwaputra, Kahfi
2016-02-01
Economic development affects the use of air transportation, since business activity in every sector has increased. Many people these days prefer to travel by airplane because it saves time and money. This situation also affects flight routes: many airlines offer new routes to deal with competition. Managing flight routes is one of the problems that must be faced in order to find efficient and effective routes. This paper investigates the best routes based on flight performance by determining the amount of block fuel for the Jakarta-Denpasar flight route. Moreover, this work compares two kinds of aircraft and two tracks by calculating flight distance, flight time and block fuel. The result shows that for Jakarta-Denpasar, Track II has the most effective and efficient block fuel, achieved by the Airbus 320-200 aircraft. This study can contribute to practice by supporting effective decisions, especially helping the executive management of a company to select the appropriate aircraft and track in the flight plan based on block fuel consumption for business operation.
Network reliability analysis based on percolation theory
In this paper, we propose a new way of looking at the reliability of a network using percolation theory. In this new view, a network failure can be regarded as a percolation process and the critical threshold of percolation can be used as network failure criterion linked to the operational settings under control. To demonstrate our approach, we consider both random network models and real networks with different nodes and/or edges lifetime distributions. We study numerically and theoretically the network reliability and find that the network reliability can be solved as a voting system with threshold given by percolation theory. Then we find that the average lifetime of random network increases linearly with the average lifetime of its nodes with uniform life distributions. Furthermore, the average lifetime of the network becomes saturated when system size is increased. Finally, we demonstrate our method on the transmission network system of IEEE 14 bus. - Highlights: • Based on percolation theory, we address questions of practical interest such as “how many failed nodes/edges will break down the whole network?” • The percolation threshold naturally gives a network failure criterion. • The approach based on percolation theory is suited for calculations of large-scale networks
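The question "how many failed nodes will break down the whole network?" can be illustrated numerically: build a random graph, remove nodes in random order, and record the fraction removed when the giant component collapses. This is a generic percolation sketch, not the paper's voting-system formulation; the graph model, failure criterion (giant component below a fraction of n) and all parameter values are assumptions.

```python
import random
from collections import deque

def largest_component(nodes, adj):
    """Size of the largest connected component among `nodes` (BFS)."""
    seen, best = set(), 0
    for s in nodes:
        if s in seen:
            continue
        q, comp = deque([s]), 0
        seen.add(s)
        while q:
            u = q.popleft()
            comp += 1
            for v in adj[u]:
                if v in nodes and v not in seen:
                    seen.add(v)
                    q.append(v)
        best = max(best, comp)
    return best

def failure_fraction(n, p, crit=0.1, seed=0):
    """Build an Erdos-Renyi graph G(n, p), remove nodes uniformly at
    random, and return the fraction removed when the giant component
    first drops below crit*n -- an empirical percolation failure point."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].add(j)
                adj[j].add(i)
    alive = set(range(n))
    order = list(range(n))
    rng.shuffle(order)
    for k, u in enumerate(order):
        alive.discard(u)
        if largest_component(alive, adj) < crit * n:
            return (k + 1) / n
    return 1.0
```

Averaging the returned fraction over many seeds estimates the network's tolerance to random node failure; denser graphs tolerate a larger removed fraction before collapse.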
Design Intelligent Model-free Hybrid Guidance Controller for Three Dimension Motor
Abdol Majid Mirshekaran
2014-10-01
Full Text Available A minimum-rule-base Proportional Integral Derivative (PID) fuzzy hybrid guidance controller for a three-dimensional spherical motor is presented in this research. A three-dimensional spherical motor is well equipped with conventional control techniques and, in particular, various PID controllers which demonstrate good performance and successfully solve different guidance problems. Guidance control in a three-dimensional spherical motor is performed by PID controllers producing the control signals which are applied to the system's torque. The necessary reference inputs for a PID controller are usually supplied by the system's sensors. The popularity of the PID fuzzy hybrid guidance controller can be attributed partly to its robust performance over a wide range of operating conditions and partly to its functional simplicity. The PID methodology has three inputs, and if each input is described with seven linguistic values and each rule has three conditions, 343 rules are needed; writing 343 rules is too much work. In this research the PID-like fuzzy controller is therefore constructed as a parallel structure of a PD-like fuzzy controller and a conventional PI controller, giving the minimum rule base. A linear-type PID controller is used to modify the PID fuzzy logic design into a hybrid guidance methodology. This research aims to reduce or eliminate the problems of fuzzy and conventional PID controllers based on a minimum-rule-base fuzzy logic theory modified by the PID method, to control the spherical motor system, and to test the quality of process control in the MATLAB/Simulink simulation environment.
SENTIMENT ANALYSIS OF DOCUMENT BASED ON ANNOTATION
Archana Shukla
2011-11-01
Full Text Available I present a tool which assesses the quality or usefulness of a document based on annotations. Annotations may include comments, notes, observations, highlights, underlines, explanations, questions, help requests, etc. Comments are used for evaluative purposes while the others are used for summarization or expansion. Further, these comments may be attached to another annotation; such annotations are referred to as meta-annotations. All annotations may not get equal weightage. My tool considers highlights and underlines as well as comments to infer the collective sentiment of annotators. Collective sentiments of annotators are classified as positive, negative, or objective. My tool computes the collective sentiment of annotations in two ways: it counts all the annotations present on the document, and it also computes sentiment scores of all annotations, including comments, to obtain the collective sentiment about the document and judge its quality. I demonstrate the use of the tool on a research paper.
Analysis of Vehicle-Based Security Operations
Carter, Jason M [ORNL; Paul, Nate R [ORNL
2015-01-01
Vehicle-to-vehicle (V2V) communications promises to increase roadway safety by providing each vehicle with 360 degree situational awareness of other vehicles in proximity, and by complementing onboard sensors such as radar or camera in detecting imminent crash scenarios. In the United States, approximately three hundred million automobiles could participate in a fully deployed V2V system if Dedicated Short-Range Communication (DSRC) device use becomes mandatory. The system's reliance on continuous communication, however, provides a potential means for unscrupulous persons to transmit false data in an attempt to cause crashes, create traffic congestion, or simply render the system useless. V2V communications must be highly scalable while retaining robust security and privacy preserving features to meet the intra-vehicle and vehicle-to-infrastructure communication requirements for a growing vehicle population. Oak Ridge National Laboratory is investigating a Vehicle-Based Security System (VBSS) to provide security and privacy for a fully deployed V2V and V2I system. In the VBSS an On-board Unit (OBU) generates short-term certificates and signs Basic Safety Messages (BSM) to preserve privacy and enhance security. This work outlines a potential VBSS structure and its operational concepts; it examines how a vehicle-based system might feasibly provide security and privacy, highlights remaining challenges, and explores potential mitigations to address those challenges. Certificate management alternatives that attempt to meet V2V security and privacy requirements have been examined previously by the research community, including privacy-preserving group certificates, shared certificates, and functional encryption. Due to real-world operational constraints, adopting one of these approaches for VBSS V2V communication is difficult. Timely misbehavior detection and revocation are still open problems for any V2V system. We explore the alternative approaches that may be
Simulation based analysis of laser beam brazing
Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael
2016-03-01
Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.
Structural Analysis Using Computer Based Methods
Dietz, Matthew R.
2013-01-01
The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were communicated to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety the designs were released for fabrication.
Mammogram-based discriminant fusion analysis for breast cancer diagnosis.
Li, Jun-Bao; Wang, Yun-Heng; Tang, Lin-Lin
2012-01-01
Mammogram-based classification is an important and effective way for computer-aided diagnosis (CAD)-based breast cancer diagnosis. In this paper, we present a novel discriminant fusion analysis (DFA)-based mammogram classification method for CAD-based breast cancer diagnosis. The discriminative breast tissue features are extracted and fused by DFA, which achieves the optimal fusion coefficients. The largest class discriminant in the fused feature space is achieved by DFA for classification. Besides the detailed theory derivation, extensive experimental evaluations are implemented on the Mammography Image Analysis Society mammogram database for breast cancer diagnosis. PMID:23153999
Bayesian analysis for EMP damaged function based on Weibull distribution
The Weibull distribution is one of the most commonly used statistical distributions in EMP vulnerability analysis. In this paper, the EMP damage function of solid state relays, based on a Weibull distribution, is solved by Bayesian computation using the Gibbs sampling algorithm. (authors)
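The flavor of this computation can be sketched in a simplified setting. The abstract's Gibbs sampler is replaced here by a conjugate shortcut: with the Weibull shape k treated as known, the transformed data t^k are exponential, a Gamma prior on the rate is conjugate, and posterior draws of the rate give the posterior mean of the damage function F(e) = 1 - exp(-theta*e^k). All priors, the fixed-shape assumption, and the numbers are illustrative, not the paper's model.

```python
import math
import random

def posterior_damage(samples, k, e, a=1.0, b=1.0, draws=2000, seed=0):
    """Posterior mean of the Weibull damage function
    F(e) = 1 - exp(-theta * e**k) with known shape k and a conjugate
    Gamma(a, b) prior on the rate theta (a simplified stand-in for a
    Gibbs sampler). `samples` are observed damage-threshold levels."""
    rng = random.Random(seed)
    n = len(samples)
    s = sum(x ** k for x in samples)
    post_a, post_b = a + n, b + s          # Gamma posterior parameters
    acc = 0.0
    for _ in range(draws):
        # random.gammavariate takes (shape, scale); scale = 1/rate.
        theta = rng.gammavariate(post_a, 1.0 / post_b)
        acc += 1.0 - math.exp(-theta * e ** k)
    return acc / draws
```

With thresholds observed around a level of 10, the posterior damage probability is near 1 well above that level and near 0 well below it, tracing out the damage function.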
Indoor air quality analysis based on Hadoop
Tuo, Wang; Yunhua, Sun; Song, Tian; Liang, Yu; Weihong, Cui
2014-03-01
The air of the office environment is our research object. Data on temperature, humidity, and the concentrations of carbon dioxide, carbon monoxide and ammonia are collected every one to eight seconds by the sensor monitoring system, and all the data are stored in the HBase database of the Hadoop platform. With the help of HBase's column-oriented, versioned storage (which automatically adds a time column), time-series data sets are built based on the primary row key and timestamp. The parallel computing programming model MapReduce is used to process the millions of data points collected by the sensors. By analysing the trends in the parameters' values at different times of the same day and at the same time on various dates, the impact of human and other factors on the room microenvironment is derived from the mobility of the office staff. Moreover, an effective way to improve indoor air quality is proposed at the end of this paper.
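The map/shuffle/reduce shape of such a pipeline can be sketched in-memory. This toy stands in for the Hadoop job: the map step keys each reading by (sensor, hour), the shuffle groups by key, and the reduce step averages; the record layout is an assumption.

```python
from collections import defaultdict

def map_reduce(records):
    """Toy MapReduce over (timestamp_seconds, sensor, value) records:
    map emits ((sensor, hour), value), shuffle groups by key,
    reduce averages -- the same shape as the HBase/MapReduce pipeline."""
    # Map: key each reading by sensor and hour bucket.
    pairs = [((sensor, ts // 3600), value) for ts, sensor, value in records]
    # Shuffle: group values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    # Reduce: mean per (sensor, hour).
    return {key: sum(vs) / len(vs) for key, vs in groups.items()}
```

Comparing the hourly means across days is what exposes the occupancy-driven patterns the abstract describes.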
Surveillance data bases, analysis, and standardization program
Kam, F.B.K.
1990-09-26
The traveler presented a paper at the Seventh ASTM-EURATOM Symposium on Reactor Dosimetry and co-chaired an oral session on Computer Codes and Methods. Papers of considerable interest to the NRC Surveillance Dosimetry Program involved statistically based adjustment procedures and uncertainties. The information exchange meetings with Czechoslovakia and Hungary were very enlightening. A lack of large computers has hindered their surveillance programs. They depended very highly on information from their measurement programs, which were somewhat limited because of the lack of sophisticated electronics. The Nuclear Research Institute at Rez had to rely on expensive mockups of power reactor configurations to test their fluence exposures. Computers, computer codes, and updated nuclear data would advance their technology rapidly, and they were not hesitant to admit this fact. Both eastern-bloc countries said that IBM is providing an IBM 3090 for educational purposes, but research and development studies would have very limited access. They were very apologetic that their currencies were not convertible; any exchange means that they could provide services or pay for US scientists in their respective countries, but funding for their scientists in the United States, or expenses that involve payment in dollars, must come from us.
Activation analysis based on secondary nuclear reactions
Various types of analytical techniques based on the achievements of nuclear physics are used. There are two directions in the use of the main sources of nuclear projectiles in the development of nuclear methods. In the first, the particles from the source are used directly for the excitation of nuclear reactions. In the second, the particles from the source are used to generate intermediate particles of other types, which are in turn used to excite secondary nuclear reactions. In our research, neutrons are used to generate secondary charged particles which serve to excite nuclear reactions on elements with small atomic numbers. There are two variants, in which both thermal and fast neutrons are used: 1) a triton flow is produced by a thermal neutron flux, which excites the nuclear reaction 6Li(n, α)T on lithium; 2) recoil protons are produced as the result of (n, p) elastic or inelastic scattering of fast neutrons on the nuclei of light elements, for example hydrogen. In this work the theoretical basis of the application of secondary nuclear reactions excited by recoil protons was investigated
On spectral methods for variance based sensitivity analysis
Alexanderian, Alen
2013-01-01
Consider a mathematical model with a finite number of random parameters. Variance based sensitivity analysis provides a framework to characterize the contribution of the individual parameters to the total variance of the model response. We consider the spectral methods for variance based sensitivity analysis which utilize representations of square integrable random variables in a generalized polynomial chaos basis. Taking a measure theoretic point of view, we provide a rigorous and at the sam...
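In the spectral setting, Sobol indices fall out of the polynomial chaos coefficients directly: for an orthonormal basis, the variance is the sum of squared non-constant coefficients, and the first-order index of input i sums only the terms involving i alone. The sketch below assumes the coefficients are already computed and indexed by multi-index tuples; that representation is an assumption for illustration.

```python
def sobol_first_order(coeffs, dim):
    """First-order Sobol indices from (orthonormal) polynomial chaos
    coefficients. `coeffs` maps a multi-index tuple (one degree per
    input) to its PCE coefficient; the all-zero multi-index is the mean
    and carries no variance."""
    # Total variance: squared coefficients of all non-constant terms.
    var = sum(c * c for a, c in coeffs.items() if any(a))
    S = []
    for i in range(dim):
        # Terms where only input i is active.
        num = sum(c * c for a, c in coeffs.items()
                  if a[i] > 0 and all(a[j] == 0 for j in range(dim) if j != i))
        S.append(num / var)
    return S
```

Any variance not captured by the first-order indices is attributable to interaction terms (here, the (1, 1) coefficient), which higher-order indices would account for.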
Architecture Level Dependency Analysis of SOA Based System through π-ADL
Pawan Kumar; Ratneshwer
2016-01-01
A formal Architecture Description Language (ADL) provides an effective way to perform dependency analysis at an early stage of development. π-ADL is an ADL that represents the static and dynamic features of software services. In this paper, we describe an approach to dependency analysis of an SOA (Service Oriented Architecture) based system, at the architecture level, through π-ADL. A set of algorithms is also proposed for identification of dependency relationships in a SOA based system. The proposed algori...
Toward farm-based policy analysis: concepts applied in Haiti
Martinez, Juan Carlos; Sain, Gustavo; Yates, Michael
1991-01-01
Many policies - on the delivery of inputs or on marketing systems, credit, or extension - influence the potential utilization of new technologies. Through 'farm-based policy analysis' it is possible to use data generated in on-farm research (OFR) to identify policy constraints to the use of new technologies, and to effectively communicate that information to policy makers. This paper describes a tentative framework for farm-based policy analysis and suggests a sequence of five steps for the a...
Multilevel Solvers with Aggregations for Voxel Based Analysis of Geomaterials
Blaheta, R. (Radim); V. Sokol
2012-01-01
Our motivation for voxel based analysis comes from the investigation of geomaterials (geocomposites) arising from rock grouting or sealing. We use finite element analysis based on voxel data from tomography. The arising finite element systems are large scale, which motivates the use of multilevel iterative solvers or preconditioners. Among others we concentrate on multilevel Schwarz preconditioners with aggregations. The aggregations are efficient even in the case of problems with hete...
Product Profitability Analysis Based on EVA and ABC
Chen Lin; Shuangyuan Wang; Zhilin Qiao
2013-01-01
For the purpose of maximizing shareholders' value, profitability analysis established on the basis of traditional accounting earnings cannot meet the demand of providing accurate decision-making information for enterprises. Therefore, this paper implements Activity Based Costing (ABC) and Economic Value Added (EVA) into the traditional profitability analysis system, sets up an improved EVA-ABC based profitability analysis system as well as its relative indexes, and applies it to the s...
Empirical validation and comparison of models for customer base analysis
Persentili Batislam, Emine; Denizel, Meltem; Filiztekin, Alpay
2007-01-01
The benefits of retaining customers lead companies to search for means to profile their customers individually and track their retention and defection behaviors. To this end, the main issues addressed in customer base analysis are identification of customer active/inactive status and prediction of future purchase levels. We compare the predictive performance of Pareto/NBD and BG/NBD models from the customer base analysis literature — in terms of repeat purchase levels and active status — usi...
Discrete Discriminant analysis based on tree-structured graphical models
Perez de la Cruz, Gonzalo; Eslava, Guillermina
The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.
Method for detecting software anomalies based on recurrence plot analysis
Michał Mosdorf
2012-03-01
Full Text Available The presented paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate, RR, or determinism, DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
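The recurrence rate measure used above has a compact definition: the fraction of index pairs whose values fall within a threshold of each other. A windowed version, as in the abstract, localises where a trace stops recurring. This is a minimal scalar sketch; real RQA usually embeds the series in a phase space first, which is omitted here.

```python
def recurrence_rate(series, eps):
    """Recurrence rate of a scalar series: the fraction of index pairs
    (i, j) whose values lie within eps of each other."""
    n = len(series)
    hits = sum(1 for i in range(n) for j in range(n)
               if abs(series[i] - series[j]) <= eps)
    return hits / (n * n)

def windowed_rr(series, window, eps):
    """Sliding-window recurrence rate, as used for anomaly localisation:
    a dip marks a stretch of the trace that stops recurring."""
    return [recurrence_rate(series[k:k + window], eps)
            for k in range(len(series) - window + 1)]
```

In a regularly recurring trace with one injected outlier, the windows covering the outlier show a depressed recurrence rate, flagging the anomaly without any crash having occurred.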
PYTHON-based Physics Analysis Environment for LHCb
Belyaev, I; Mato, P; Barrand, G; Tsaregorodtsev, A; de Oliveira, E
2004-01-01
BENDER is the PYTHON based physics analysis application for LHCb. It combines the best features of the underlying GAUDI software architecture with the flexibility of the PYTHON scripting language and provides end-users with a friendly physics analysis oriented environment.
Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.
Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-07-01
This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).
An IBM-PC based reactor neutronics analysis package
The development of a comprehensive system of microcomputer-based codes suitable for neutronics and shielding analysis of nuclear reactors has been undertaken by EG&G Idaho, Inc., at the Idaho National Engineering Laboratory (INEL). This system has been designed for cross section generation, one-dimensional discrete-ordinates analysis, one-, two-, and three-dimensional diffusion theory analysis, and various other radiation transport applications of interest
Benefits of Computer Based Content Analysis to Foresight
Kováříková, Ludmila; Grosová, Stanislava
2014-01-01
Purpose of the article: The present manuscript summarizes benefits of the use of computer-based content analysis in a generation phase of foresight initiatives. Possible advantages, disadvantages and limitations of the content analysis for the foresight projects are discussed as well. Methodology/methods: In order to specify the benefits and identify the limitations of the content analysis within the foresight, results of the generation phase of a particular foresight project perf...
Finding the dynamics of an entire macromolecule is a complex problem as the model-free parameter values are intricately linked to the Brownian rotational diffusion of the molecule, mathematically through the autocorrelation function of the motion and statistically through model selection. The solution to this problem was formulated using set theory as an element of the universal set U-the union of all model-free spaces (d'Auvergne EJ and Gooley PR (2007) Mol BioSyst 3(7), 483-494). The current procedure commonly used to find the universal solution is to initially estimate the diffusion tensor parameters, to optimise the model-free parameters of numerous models, and then to choose the best model via model selection. The global model is then optimised and the procedure repeated until convergence. In this paper a new methodology is presented which takes a different approach to this diffusion seeded model-free paradigm. Rather than starting with the diffusion tensor this iterative protocol begins by optimising the model-free parameters in the absence of any global model parameters, selecting between all the model-free models, and finally optimising the diffusion tensor. The new model-free optimisation protocol will be validated using synthetic data from Schurr JM et al. (1994) J Magn Reson B 105(3), 211-224 and the relaxation data of the bacteriorhodopsin (1-36)BR fragment from Orekhov VY (1999) J Biomol NMR 14(4), 345-356. To demonstrate the importance of this new procedure the NMR relaxation data of the Olfactory Marker Protein (OMP) of Gitti R et al. (2005) Biochem 44(28), 9673-9679 is reanalysed. The result is that the dynamics for certain secondary structural elements is very different from those originally reported
Geometric-model-free tracking of extended targets using 3D lidar measurements
Steinemann, Philipp; Klappstein, Jens; Dickmann, Juergen; von Hundelshausen, Felix; Wünsche, Hans-Joachim
2012-06-01
Tracking of extended targets in high definition, 360-degree 3D-LIDAR (Light Detection and Ranging) measurements is a challenging task and a current research topic. It is a key component in robotic applications, and is relevant to path planning and collision avoidance. This paper proposes a new method without a geometric model to simultaneously track and accumulate 3D-LIDAR measurements of an object. The method itself is based on a particle filter and uses an object-related local 3D grid for each object. No geometric object hypothesis is needed. Accumulation allows coping with occlusions. The prediction step of the particle filter is governed by a motion model consisting of a deterministic and a probabilistic part. Since this paper is focused on tracking ground vehicles, a bicycle model is used for the deterministic part. The probabilistic part depends on the current state of each particle. A function for calculating the current probability density function for the state transition is developed. It is derived in detail and based on a database consisting of vehicle dynamics measurements over several hundreds of kilometers. The adaptive probability density function narrows down the gating area for measurement data association. The second part of the proposed method addresses weighting the particles with a cost function. Different 3D-grid-dependent cost functions are presented and evaluated. Evaluations with real 3D-LIDAR measurements show the performance of the proposed method. The results are also compared to ground truth data.
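The deterministic part of the prediction step can be sketched with a kinematic bicycle model. This is a generic textbook form, not the paper's derived transition density: the state layout, the wheelbase value, and attaching the Gaussian perturbation to speed only are all illustrative assumptions.

```python
import math
import random

def bicycle_step(state, dt, wheelbase=2.7, noise=0.0, rng=random):
    """One prediction step of a kinematic bicycle model. State is
    (x, y, heading, speed, steering_angle). Deterministic motion plus an
    optional Gaussian speed perturbation, mirroring the particle
    filter's deterministic + probabilistic split. The wheelbase is a
    placeholder value."""
    x, y, th, v, delta = state
    x += v * math.cos(th) * dt
    y += v * math.sin(th) * dt
    th += v / wheelbase * math.tan(delta) * dt
    if noise:
        v += rng.gauss(0.0, noise)
    return (x, y, th, v, delta)
```

In a particle filter, each particle would be propagated with `noise > 0`, so the cloud spreads according to the transition density before being reweighted against the LIDAR measurements.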
Some Linguistic-based and temporal analysis on Wikipedia
Wikipedia, as a web-based, collaborative, multilingual encyclopaedia project, is a very suitable field in which to carry out research on social dynamics and to investigate the complex concepts of conflict, collaboration, competition, dispute, etc. in a large community (∼26 million) of Wikipedia users. The other face of Wikipedia, as a productive society, is its output, consisting of ∼17 million articles written unsupervised by non-professional editors in more than 270 different languages. In this talk we report some analysis performed on Wikipedia using two different approaches: temporal analysis to characterize disputes and controversies among users, and linguistic-based analysis to characterize linguistic features of English texts in Wikipedia. (author)
Alignment analysis of urban railways based on passenger travel demand
Andersen, Jonas Lohmann Elkjær; Landex, Alex
2010-01-01
, this article presents a computerised GIS-based methodology that can be used as decision support for selecting the best alignment. The methodology calculates travel potential within defined buffers surrounding the alignment. The methodology has three different approaches depending on the desired level of detail......: the simple but straightforward-to-implement line potential approach that performs corridor analysis, the detailed catchment area analysis based on stops on the alignment, and the refined service area analysis that uses search distances in street networks. All three approaches produce trustworthy results...
Chemical Cytometry: Fluorescence-Based Single-Cell Analysis
Cohen, Daniella; Dickerson, Jane A.; Whitmore, Colin D.; Turner, Emily H.; Palcic, Monica M.; Hindsgaul, Ole; Dovichi, Norman J.
2008-07-01
Cytometry deals with the analysis of the composition of single cells. Flow and image cytometry employ antibody-based stains to characterize a handful of components in single cells. Chemical cytometry, in contrast, employs a suite of powerful analytical tools to characterize a large number of components. Tools have been developed to characterize nucleic acids, proteins, and metabolites in single cells. Whereas nucleic acid analysis employs powerful polymerase chain reaction-based amplification techniques, protein and metabolite analysis tends to employ capillary electrophoresis separation and ultrasensitive laser-induced fluorescence detection. It is now possible to detect yoctomole amounts of many analytes in single cells.
Model-free information-theoretic approach to infer leadership in pairs of zebrafish
Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio
2016-04-01
Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.
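The transfer entropy construct above has a compact plug-in estimator for discrete series with history length 1: how much does the other fish's past improve prediction of this fish's next state beyond its own past? This sketch works on symbolised (e.g. binarised heading) series; the paper's estimator for continuous positional data is more involved.

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE(x -> y), in bits, for discrete series
    with history length 1: TE = sum p(y_t, y_{t-1}, x_{t-1}) *
    log2[ p(y_t | y_{t-1}, x_{t-1}) / p(y_t | y_{t-1}) ]."""
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_t, y_{t-1}, x_{t-1})
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_t, y_{t-1})
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_{t-1}, x_{t-1})
    singles = Counter(y[:-1])                       # y_{t-1}
    n = len(y) - 1
    te = 0.0
    for (yt, yp, xp), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(yp, xp)]
        p_cond_self = pairs_yy[(yt, yp)] / singles[yp]
        te += p_joint * math.log2(p_cond_full / p_cond_self)
    return te
```

When y simply copies x with a one-step lag, TE(x → y) approaches the entropy rate of x while TE(y → x) stays near zero, so the asymmetry identifies x as the leader.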
Fatigue analysis of steam generator cassette parts based on CAE
Fatigue analysis has been performed for the steam generator nozzle header and tube based on CAE. A three-dimensional model was produced using the commercial CAD program IDEAS, and the geometry and boundary condition information was transformed into ABAQUS input format for thermal analysis, stress analysis, and fatigue analysis. The cassette nozzle, which has a complex geometry, was analysed using the three-dimensional model, while the steam generator tube was analysed according to the ASME procedure, since it can be modelled as a two-dimensional finite element model. The S-N curve for the titanium alloy of the steam generator tube material was obtained from material tests. The analysis confirmed that these parts satisfy the lifetime of the steam generator cassette. A three-dimensional modelling strategy, from thermal analysis through fatigue analysis, should be incorporated into the design of major reactor components to enhance the efficiency of the design procedure
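The bookkeeping that turns an S-N curve into a lifetime check is commonly Palmgren-Miner cumulative damage. The sketch below assumes a power-law S-N curve N(s) = C * s^(-m); the constants are purely illustrative, not the titanium-alloy fit or the ASME procedure referenced in the abstract.

```python
def miners_damage(cycles, sn_curve):
    """Palmgren-Miner cumulative damage: sum of n_i / N_i over stress
    levels, where N_i is the allowable cycle count at stress s_i from an
    assumed power-law S-N curve N(s) = C * s**(-m). Damage >= 1 predicts
    fatigue failure. `cycles` is a list of (stress, applied_cycles)."""
    C, m = sn_curve
    return sum(n / (C * s ** (-m)) for s, n in cycles)
```

Running the operating stress histogram through this sum against the design life is what a lifetime-satisfaction check like the one in the abstract amounts to.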
Statistical analysis of MRI-only based dose planning
Korsholm, M. E.; Waring, L. W.; Paulsen, Rasmus Reinhold; Edmund, J. M.
2 %. Conclusions: The investigated DVH points show that MRI-only based RT seems to be a feasible alternative to CT based RT. However, the analysis only describes similarities in DVH points and not in the shape of the DVH. Even though the mean differences are nonsignificant there might be...
Agent-based analysis of organizations : formalization and simulation
Dignum, M.V.; Tick, C.
2008-01-01
Organizational effectiveness depends on many factors, including individual excellence, efficient structures, effective planning and capability to understand and match context requirements. We propose a way to model organizational performance based on a combination of formal models and agent-based simulation that supports the analysis of the congruence of different organizational structures to changing environments
Reliability-Based Robustness Analysis for a Croatian Sports Hall
Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard;
2011-01-01
complex timber structure with a large number of failure modes is modelled with only a few dominant failure modes. First, a component based robustness analysis is performed based on the reliability indices of the remaining elements after the removal of selected critical elements. The robustness is...
An Analysis of an Improved Bus-Based Multiprocessor Architecture
Ricks, Kenneth G.; Wells, B. Earl
1998-01-01
This paper analyses the effectiveness of a hybrid multiprocessing/multicomputing architecture that is based upon a single-board-computer multiprocessor (SBCM) architecture. Based upon empirical analysis using discrete event simulations and Monte Carlo techniques, this hybrid architecture, called the enhanced single-board-computer multiprocessor (ESBCM), is shown to have improved performance and scalability characteristics over current SBCM designs.
Differential Regulatory Analysis Based on Coexpression Network in Cancer Research.
Li, Junyi; Li, Yi-Xue; Li, Yuan-Yuan
2016-01-01
With the rapid development of high-throughput techniques and the accumulation of big transcriptomic data, plenty of computational methods and algorithms such as differential analysis and network analysis have been proposed to explore genome-wide gene expression characteristics. These efforts aim to transform the underlying genomic information into valuable knowledge in biological and medical research fields. Recently, numerous integrative research methods have been dedicated to interpreting the development and progression of neoplastic diseases, whereas differential regulatory analysis (DRA) based on gene coexpression networks (GCN) increasingly serves as a robust complement to regular differential expression analysis in revealing the regulatory functions of cancer-related genes such as evading growth suppressors and resisting cell death. Differential regulatory analysis based on GCN is promising and plays an essential role in discovering the system properties of carcinogenesis features. Here we briefly review the paradigm of differential regulatory analysis based on GCN. We also focus on the applications of DRA based on GCN in cancer research and point out that DRA is necessary and powerful for revealing the underlying molecular mechanisms in large-scale carcinogenesis studies. PMID:27597964
Kernel-Based Nonlinear Discriminant Analysis for Face Recognition
LIU QingShan (刘青山); HUANG Rui (黄锐); LU HanQing (卢汉清); MA SongDe (马颂德)
2003-01-01
Linear subspace analysis methods have been successfully applied to extract features for face recognition. But they are inadequate to represent the complex and nonlinear variations of real face images, such as illumination, facial expression and pose variations, because of their linear properties. In this paper, a nonlinear subspace analysis method, Kernel-based Nonlinear Discriminant Analysis (KNDA), is presented for face recognition, which combines the nonlinear kernel trick with the linear subspace analysis method, Fisher Linear Discriminant Analysis (FLDA). First, the kernel trick is used to project the input data into an implicit feature space; then FLDA is performed in this feature space. Thus nonlinear discriminant features of the input data are yielded. In addition, in order to reduce the computational complexity, a geometry-based feature vector selection scheme is adopted. Another similar nonlinear subspace analysis method is Kernel-based Principal Component Analysis (KPCA), which combines the kernel trick with linear Principal Component Analysis (PCA). Experiments are performed with the polynomial kernel, and KNDA is compared with KPCA and FLDA. Extensive experimental results show that KNDA can give a higher recognition rate than KPCA and FLDA.
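The kernel trick that KNDA and KPCA rely on can be illustrated in isolation: a polynomial kernel evaluates an inner product in an implicit feature space without ever constructing that space. A minimal sketch (the degree-2 kernel and 2-D inputs are illustrative choices, not the paper's experimental setup):

```python
from math import sqrt, isclose

def poly_kernel(x, z):
    """Degree-2 homogeneous polynomial kernel: k(x, z) = (x . z)^2."""
    return sum(a * b for a, b in zip(x, z)) ** 2

def phi(x):
    """Explicit feature map for 2-D inputs whose inner product equals the kernel:
    phi(x) = (x1^2, sqrt(2)*x1*x2, x2^2)."""
    x1, x2 = x
    return (x1 * x1, sqrt(2) * x1 * x2, x2 * x2)

x, z = (1.0, 2.0), (3.0, -1.0)
lhs = poly_kernel(x, z)                          # (1*3 + 2*(-1))^2 = 1
rhs = sum(a * b for a, b in zip(phi(x), phi(z)))  # same value via the explicit map
print(isclose(lhs, rhs))  # -> True
```

KNDA exploits this identity: FLDA is run in the implicit feature space using only kernel evaluations, never the explicit map.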
Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes
Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G.
2016-01-01
Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI. PMID:27441367
A model-free definition of coupling strength for assessing the influence between climatic processes
Runge, J.; Kurths, J.
2012-04-01
Assessing the strength of influence between climatic processes from observational data is an important problem on the way to constructing conceptual models or making predictions. An example is the influence of ENSO on the Indian Monsoon compared to the influence of other climatic processes. It is an especially difficult task if the interactions are nonlinear, where linear measures like the Pearson correlation coefficient fail. Apart from nonlinearity, auto-dependencies in the processes can lead to misleadingly high values of coupling strength. There exist statistical methods that address these issues, but most of them assume some model, e.g., a linear model in the case of the partial correlation. We propose a measure based on conditional mutual information that makes no assumptions on the underlying model and is able to exclude auto-dependencies and even influences of external processes. We investigate how the measured strength relates to model systems where a coupling strength is known and discuss its limitations. The measure is applied to time series of different climate indices and gridded data sets to gain insights into the coupling strength between climatic teleconnections. Applied to more than two time series, it is also able to shed light on mechanisms of interactions between multiple processes.
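The failure of linear measures on nonlinear coupling, which motivates the model-free measure above, is easy to demonstrate: a variable fully determined by another can still have near-zero Pearson correlation with it. A minimal sketch with synthetic data:

```python
import random
from math import sqrt

def pearson(a, b):
    """Sample Pearson correlation coefficient between two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    sa = sqrt(sum((u - ma) ** 2 for u in a))
    sb = sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (sa * sb)

random.seed(1)
x = [random.uniform(-1, 1) for _ in range(10000)]
y = [v * v for v in x]  # y is fully determined by x, but not linearly

# The symmetric dependence cancels out in the linear statistic.
print(abs(pearson(x, y)) < 0.1)  # -> True: the linear measure misses the coupling
```

An information-theoretic measure such as (conditional) mutual information would detect this dependence, which is the point of the proposed approach.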
Scatter to volume registration for model-free respiratory motion estimation from dynamic MRIs.
Miao, S; Wang, Z J; Pan, L; Butler, J; Moran, G; Liao, R
2016-09-01
Respiratory motion is one major complicating factor in many image acquisition applications and image-guided interventions. Existing respiratory motion estimation and compensation methods typically rely on breathing motion models learned from certain training data, and therefore may not be able to effectively handle intra-subject and/or inter-subject variations of respiratory motion. In this paper, we propose a respiratory motion compensation framework that directly recovers motion fields from sparsely spaced and efficiently acquired dynamic 2-D MRIs without using a learned respiratory motion model. We present a scatter-to-volume deformable registration algorithm to register dynamic 2-D MRIs with a static 3-D MRI to recover dense deformation fields. Practical considerations and approximations are provided to solve the scatter-to-volume registration problem efficiently. The performance of the proposed method was investigated on both synthetic and real MRI datasets, and the results showed significant improvements over the state-of-the-art respiratory motion modeling methods. We also demonstrated a potential application of the proposed method on MRI-based motion corrected PET imaging using hybrid PET/MRI. PMID:27180910
Affine invariant texture analysis based on structural properties
Zhang, Jianguo; Tan, Tieniu
2002-01-01
This paper presents a new texture analysis method based on structural properties. The texture features extracted using this algorithm are invariant to affine transform (including rotation, translation, scaling, and skewing). Affine invariant structural properties are derived based on texel areas. An area-ratio map utilizing these properties is introduced to characterize texture images. Histogram based on this map is constructed for texture classification. Efficiency of this algorithm for affi...
Computer-based modelling and analysis in engineering geology
Giles, David
2014-01-01
This body of work presents the research and publications undertaken under a general theme of computer-based modelling and analysis in engineering geology. Papers are presented on geotechnical data management, data interchange, Geographical Information Systems, surface modelling, geostatistical methods, risk-based modelling, knowledge-based systems, remote sensing in engineering geology and on the integration of computer applications into applied geoscience teaching. The work highlights my...
Earthquake Analysis of Structure by Base Isolation Technique in SAP
T. Subramani; J. Jothi
2014-01-01
This paper presents an overview of the present state of base isolation techniques with special emphasis and a brief on other techniques developed world over for mitigating earthquake forces on the structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...
Abnormal traffic flow data detection based on wavelet analysis
Xiao Qian
2016-01-01
Because traffic flow data are non-stationary, abnormal data detection is difficult. This paper proposes an abnormal traffic flow data detection method based on wavelet analysis and the least squares method. First, wavelet analysis is used to separate the traffic flow data into high-frequency and low-frequency components; then the least squares method is applied to find abnormal points in the reconstructed signal data. Simulation results show that detecting abnormal traffic flow data with wavelet analysis effectively reduces the misjudgment rate and false negative rate of the detection results.
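The decomposition step can be sketched with a single-level Haar transform (the simplest wavelet; the abstract does not specify which wavelet the authors use, and the least-squares step is omitted here):

```python
from math import sqrt

def haar_step(signal):
    """One level of the Haar DWT: split a signal into low-frequency
    (approximation) and high-frequency (detail) coefficients."""
    approx, detail = [], []
    for i in range(0, len(signal) - 1, 2):
        a, b = signal[i], signal[i + 1]
        approx.append((a + b) / sqrt(2))
        detail.append((a - b) / sqrt(2))
    return approx, detail

# Hypothetical smooth flow profile with one injected outlier at index 11.
flow = [100 + i for i in range(20)]
flow[11] += 50

approx, detail = haar_step(flow)
# A smooth trend gives small, nearly constant detail coefficients;
# the outlier shows up as a spike in the high-frequency band.
spike = max(range(len(detail)), key=lambda k: abs(detail[k]))
print(spike)  # -> 5  (the pair covering samples 10 and 11)
```

Thresholding the detail coefficients (here, against the least-squares fit of the reconstructed signal) is what flags the abnormal point.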
Adaptive Fourier Decomposition Based Time-Frequency Analysis
Li-Ming Zhang
2014-01-01
The attempt to represent a signal simultaneously in the time and frequency domains is full of challenges. The recently proposed adaptive Fourier decomposition (AFD) offers a practical approach to this problem. This paper presents the principles of AFD-based time-frequency analysis in three aspects: instantaneous frequency analysis, frequency spectrum analysis, and spectrogram analysis. An experiment is conducted and compared with the Fourier transform in convergence rate and with the short-time Fourier transform in time-frequency distribution. The proposed approach performs better than both the Fourier transform and the short-time Fourier transform.
A scenario-based procedure for seismic risk analysis
A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required for safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest as well as advanced physical modelling techniques to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA), which relies on the projections of the Gutenberg-Richter (G-R) equation. The problems with the validity of G-R projections, arising from incomplete or entirely absent data for making the projections, are still unresolved. Consequently, the information from G-R must not be used in decisions for the design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data and complemented by physical modelling techniques, which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited number of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)
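The Gutenberg-Richter relation the abstract refers to, log10 N = a - b*M, can be sketched as follows; the a and b values here are hypothetical, and the point is how strongly design decisions would lean on the sparsely observed tail of this extrapolation:

```python
def gr_rate(magnitude, a=4.0, b=1.0):
    """Annual number of events with magnitude >= M under the Gutenberg-Richter
    relation log10 N = a - b*M. The a, b values are illustrative only."""
    return 10 ** (a - b * magnitude)

# Each unit increase in magnitude cuts the expected count tenfold,
# so large-magnitude rates rest almost entirely on extrapolation.
print(gr_rate(5.0))  # -> 0.1 events/year, i.e. about one M>=5 event per decade
print(gr_rate(7.0))  # -> 0.001 events/year: the sparse tail the abstract warns about
```

This is why the abstract argues that G-R projections, fitted where data exist and extrapolated where they do not, are a weak basis for designing critical structures.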
Analysis of security protocols based on challenge-response
LUO JunZhou; YANG Ming
2007-01-01
A security protocol is specified as a challenge-response procedure, which uses applied cryptography to confirm the existence of other principals and fulfil some data negotiation such as session keys. Most existing analysis methods, which either adopt theorem proving techniques such as state exploration or logic reasoning techniques such as authentication logic, face a conflict between analysis power and operability. To solve this problem, a new efficient method is proposed that provides an SSM semantics-based definition of secrecy and authentication goals and applies authentication logic as the fundamental analysis technique, in which secrecy analysis is split into two parts, Explicit-Information-Leakage and Implicit-Information-Leakage, and correspondence analysis is concluded as the analysis of the existence relationship of Strands and the agreement of Strand parameters. This new method combines the power of the Strand Space Model with the concision of authentication logic.
Virtual stress amplitude-based low cycle fatigue reliability analysis
A method for virtual stress amplitude-based low cycle fatigue reliability analysis is developed. Different from existing methods, probability-based modified Ramberg-Osgood stress-strain relations (P-ε-σ curves) are newly introduced to take into account the scatter of stress-strain responses where the metallurgical quality of the material (e.g. weld metal) is not good enough to show the same stress-strain response for different specimens under the same loading level. In addition, a virtual stress amplitude-based analysis is used to be in agreement with the existing codes for nuclear components, i.e. ASME Section III. The analysis is performed on the principle of the stochastic analysis system at the same safety level concurrently. Combining the probability-based modified Ramberg-Osgood stress-strain relations, the probability-based Langer S-N curves (P-S-N curves) and Neuber's local stress-strain rule, the method can be applied to predict the fatigue life at a specified reliability and loading history and to estimate the reliability at a specified loading history and expected fatigue life. The applicability of the method has been demonstrated by a test analysis of 1Cr18Ni9Ti steel weld metal, which was used for machining the pipes of some nuclear reactors, during low cycle fatigue
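A minimal sketch of the Ramberg-Osgood stress-strain evaluation that the P-ε-σ curves build on. The material constants below are illustrative assumptions, not the paper's fitted weld-metal values:

```python
def ramberg_osgood_strain(stress, E=200e3, K=1200.0, n=0.2):
    """Total strain under the Ramberg-Osgood relation
        eps = sigma/E + (sigma/K)**(1/n)
    with stress in MPa. E, K, n are hypothetical illustrative values."""
    elastic = stress / E
    plastic = (stress / K) ** (1.0 / n)
    return elastic + plastic

# The scatter the P-eps-sigma idea captures can be mimicked by perturbing
# the cyclic constants per specimen:
eps_nominal = ramberg_osgood_strain(400.0)
eps_soft = ramberg_osgood_strain(400.0, K=1100.0)  # weaker specimen strains more
print(eps_soft > eps_nominal)  # -> True
```

In the probabilistic version, K and n become random variables fitted to test scatter, so each reliability level yields its own stress-strain curve.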
Bayesian-network-based safety risk analysis in construction projects
This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. Fuzzy Bayesian Networks (FBN) are used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations of current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which has the capacity to implement deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is used for defuzzification. The construction safety analysis process is extended to the entire life cycle of risk-prone events, including pre-accident, during-construction continuous and post-accident control. A typical hazard concerning tunnel leakage in the construction of the Wuhan Yangtze Metro Tunnel in China is presented as a case study, in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of advantages and disadvantages between FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • Safety risk analysis process is extended to entire life cycle of risk-prone events. • A typical
A Framework for Web-Based Mechanical Design and Analysis
Chiaming Yen; Wujeng Li
2002-01-01
In this paper, a Web-based Mechanical Design and Analysis Framework (WMDAF) is proposed. This WMDAF allows designers to develop web-based computer aided programs in a systematic way during the collaborative mechanical system design and analysis process. This system is based on an emerging web-based Content Management System (CMS) called eXtended Object Oriented Portal System (XOOPS). Due to the Open Source status of the XOOPS CMS, programs developed with this framework can be further customized to ...
Tikhonov regularization-based operational transfer path analysis
Cheng, Wei; Lu, Yingying; Zhang, Zhousuo
2016-06-01
To overcome ill-posed problems in operational transfer path analysis (OTPA), and improve the stability of solutions, this paper proposes a novel OTPA based on Tikhonov regularization, which considers both fitting degrees and stability of solutions. Firstly, the fundamental theory of Tikhonov regularization-based OTPA is presented, and comparative studies are provided to validate its effectiveness on ill-posed problems. Secondly, transfer path analysis and source contribution evaluations for numerical case studies on spherical radiating acoustical sources are comparatively studied. Finally, transfer path analysis and source contribution evaluations for experimental case studies on a test bed with thin shell structures are provided. This study provides more accurate transfer path analysis for mechanical systems, which can benefit vibration reduction through structural path optimization. Furthermore, with accurate evaluation of source contributions, vibration monitoring and control by actively controlling vibration sources can be effectively carried out.
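The core of Tikhonov regularization can be sketched on a toy ill-posed least-squares problem: adding λI to the normal-equation matrix stabilizes the solution. A minimal two-unknown sketch (the matrices are made-up illustrations, not OTPA transfer-path data):

```python
def tikhonov_solve_2x2(A, b, lam):
    """Solve min ||A x - b||^2 + lam ||x||^2 for two unknowns via the
    normal equations x = (A^T A + lam I)^(-1) A^T b (closed-form 2x2 inverse)."""
    m11 = sum(row[0] * row[0] for row in A) + lam
    m12 = sum(row[0] * row[1] for row in A)
    m22 = sum(row[1] * row[1] for row in A) + lam
    r1 = sum(row[0] * bi for row, bi in zip(A, b))
    r2 = sum(row[1] * bi for row, bi in zip(A, b))
    det = m11 * m22 - m12 * m12
    return ((m22 * r1 - m12 * r2) / det, (m11 * r2 - m12 * r1) / det)

# Nearly collinear columns make the plain least-squares problem ill-posed.
A = [[1.0, 1.0], [1.0, 1.0001], [1.0, 0.9999]]
b = [2.0, 2.0, 2.0]
x_plain = tikhonov_solve_2x2(A, b, 0.0)   # lam = 0: ordinary least squares
x_reg = tikhonov_solve_2x2(A, b, 0.1)     # lam > 0: Tikhonov-regularized
norm = lambda v: (v[0] ** 2 + v[1] ** 2) ** 0.5
print(norm(x_reg) < norm(x_plain))  # regularization shrinks/stabilizes the solution
```

In OTPA the same idea damps the noise-amplifying directions of the transmissibility matrix; choosing λ trades fitting accuracy against stability, which is the balance the paper studies.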
NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS
Anonymous
2003-01-01
In the industrial process situation, principal component analysis (PCA) is a general method in data reconciliation. However, PCA is sometimes infeasible for nonlinear feature analysis and limited in application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA and can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that first the original data are mapped to a high dimensional feature space by a nonlinear function, and PCA is implemented in the feature space. Then nonlinear feature analysis is implemented and the data are reconstructed by using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of the nonlinear process and the reconciled data can represent the true information of the nonlinear process.
Automatic Video-based Analysis of Human Motion
Fihl, Preben
The human motion contains valuable information in many situations and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has received great interest from both industry and research communities. The focus of this thesis is on video-based analysis of human motion and the thesis presents work within three overall topics, namely foreground segmentation, action recognition, and human pose estimation. Foreground segmentation is often... A multi-view approach to pose estimation is also presented that integrates low level information from different cameras to generate better pose estimates during heavy occlusions. The works presented in this thesis contribute in these different areas of video-based analysis of human motion and altogether...
Data Warehouse Requirements Analysis Framework: Business-Object Based Approach
Anirban Sarkar
2012-01-01
Detailed requirements analysis plays a key role in the design of a successful Data Warehouse (DW) system. The requirements analysis specifications are used as the prime input for the construction of the conceptual level multidimensional data model. This paper proposes a Business Object based requirements analysis framework for DW systems which is supported by an abstraction mechanism and reuse capability. It also facilitates the stepwise mapping of requirements descriptions into high level design components of a graph semantic based conceptual level object oriented multidimensional data model. The proposed framework starts with the identification of the analytical requirements using a business process driven approach and finally refines the requirements in further detail to map them into the conceptual level DW design model using either a Demand-driven or Mixed-driven approach for DW requirements analysis
Similar words analysis based on POS-CBOW language model
Dongru RUAN
2015-10-01
Similar words analysis is one of the important aspects in the field of natural language processing, and it has important research and application value in text classification, machine translation and information recommendation. Focusing on the features of Sina Weibo's short texts, this paper presents a language model named POS-CBOW, which is a kind of continuous bag-of-words language model with a filtering layer and a part-of-speech tagging layer. The proposed approach can adjust the word vectors' similarity according to the cosine similarity and the word vectors' part-of-speech metrics. It can also filter similar word sets based on the statistical analysis model. The experimental results show that the similar words analysis algorithm based on the proposed POS-CBOW language model is better than that based on the traditional CBOW language model.
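Cosine similarity between word vectors, as used by the model above, can be sketched as follows; the toy embeddings are made up for illustration and are not POS-CBOW output:

```python
from math import sqrt

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors: (u . v) / (|u| * |v|)."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Hypothetical 3-D embeddings: semantically close words point in similar directions.
vec = {
    "happy": [0.9, 0.8, 0.1],
    "glad":  [0.85, 0.75, 0.2],
    "car":   [0.1, 0.2, 0.9],
}
print(cosine_similarity(vec["happy"], vec["glad"]) >
      cosine_similarity(vec["happy"], vec["car"]))  # -> True
```

POS-CBOW then adjusts this raw score with part-of-speech metrics so that, e.g., only words with compatible tags rank as similar.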
Applying measurement-based probabilistic timing analysis to buffer resources
Kosmidis L.; Vardanega T.; Abella J.; Quinones E.; Cazorla F.J.
2013-01-01
The use of complex hardware makes it difficult for current timing analysis techniques to compute trustworthy and tight worst-case execution time (WCET) bounds. Those techniques require detailed knowledge of the internal operation and state of the platform, at both the software and hardware level. Obtaining that information for modern hardware platforms is increasingly difficult. Measurement-Based Probabilistic Timing Analysis (MBPTA) reduces the cost of acquiring the knowledge needed for comp...
UML based risk analysis - Application to a medical robot
Guiochet, Jérémie; Baron, Claude
2004-01-01
Medical robots perform complex tasks and share their working area with humans. Therefore, they belong to safety critical systems. In today's development processes, safety is often managed by way of dependability techniques. We propose a new global approach, based on the risk concept, to guide designers through the safety analysis of such complex systems. Safety depends on the risk management activity, whose core is risk analysis. This consists of three steps: system definition, haz...
Study of engine noise based on independent component analysis
HAO Zhi-yong; JIN Yan; YANG Chen
2007-01-01
Independent component analysis was applied to analyze the acoustic signals from a diesel engine. First the basic principle of independent component analysis (ICA) was reviewed. The diesel engine acoustic signal was decomposed into several independent components (ICs); Fourier transform and continuous wavelet transform (CWT) were applied to analyze the independent components. Different noise sources of the diesel engine were separated, based on the characteristics of the different components in the time-frequency domain.
FDTD method based electromagnetic solver for printed-circuit analysis
Gnilenko, Alexey B.; Paliy, Oleg V.
2003-01-01
An electromagnetic solver for printed-circuit analysis is presented. The electromagnetic simulator is based on the finite-difference time-domain method with first-order Mur's absorbing boundary conditions. The solver environment comprises a layout graphic editor for circuit topology preparation and a data postprocessor for presenting the calculation results. The solver has been applied to the analysis of printed-circuit components such as printed antennas, microstrip discontinuities, etc.