WorldWideScience

Sample records for based model-free analysis

  1. Comparing model-based adaptive LMS filters and a model-free hysteresis loop analysis method for structural health monitoring

    Science.gov (United States)

    Zhou, Cong; Chase, J. Geoffrey; Rodgers, Geoffrey W.; Xu, Chao

    2017-02-01

    The model-free hysteresis loop analysis (HLA) method for structural health monitoring (SHM) has significant advantages over the traditional model-based SHM methods that require a suitable baseline model to represent the actual system response. This paper provides a unique validation against both an experimental reinforced concrete (RC) building and a calibrated numerical model to delineate the capability of the model-free HLA method and the adaptive least mean squares (LMS) model-based method in detecting, localizing and quantifying damage that may not be visible or observable in the overall structural response. Results clearly show the model-free HLA method is capable of adapting to changes in how structures transfer load or demand across structural elements over time and multiple events of different size. However, the adaptive LMS model-based method presented an image of greater spread of lesser damage over time and story when the baseline model is not well defined. Finally, the two algorithms are tested on a simpler steel structure with typical hysteretic behaviour to quantify the impact of model mismatch between the baseline model used for identification and the actual response. The overall results highlight the need for model-based methods to have an appropriate model that can capture the observed response, in order to yield accurate results, even in small events where the structure remains linear.
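
    The adaptive LMS identification referenced above rests on the standard least-mean-squares update. The sketch below is a minimal, generic LMS filter in Python (NumPy assumed), not the authors' SHM implementation; the signal names, tap count and step size are illustrative only.

      import numpy as np

      def lms_identify(x, d, n_taps=4, mu=0.05):
          """Generic LMS adaptive filter: track weights w so that w.x approximates d."""
          w = np.zeros(n_taps)                     # adaptive weights (e.g., stiffness-related parameters)
          errors = np.zeros(len(d))
          for k in range(n_taps - 1, len(d)):
              xk = x[k - n_taps + 1:k + 1][::-1]   # x[k], x[k-1], ..., x[k-n_taps+1]
              y = w @ xk                           # filter output (predicted response)
              e = d[k] - y                         # prediction error
              w = w + mu * e * xk                  # LMS weight update
              errors[k] = e
          return w, errors

      # synthetic demo: identify a 4-tap FIR response from noisy measurements
      rng = np.random.default_rng(0)
      x = rng.normal(size=2000)
      true_w = np.array([0.8, -0.4, 0.2, 0.1])
      d = np.convolve(x, true_w)[:len(x)] + 0.01 * rng.normal(size=len(x))
      w_hat, _ = lms_identify(x, d)
      print(np.round(w_hat, 2))                    # should be close to true_w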

  2. Model-free linkage analysis of a binary trait.

    Science.gov (United States)

    Xu, Wei; Bull, Shelley B; Mirea, Lucia; Greenwood, Celia M T

    2012-01-01

    Genetic linkage analysis aims to detect chromosomal regions containing genes that influence risk of specific inherited diseases. The presence of linkage is indicated when a disease or trait cosegregates through the families with genetic markers at a particular region of the genome. Two main types of genetic linkage analysis are in common use, namely model-based linkage analysis and model-free linkage analysis. In this chapter, we focus solely on the latter type and specifically on binary traits or phenotypes, such as the presence or absence of a specific disease. Model-free linkage analysis is based on allele-sharing, where patterns of genetic similarity among affected relatives are compared to chance expectations. Because the model-free methods do not require the specification of the inheritance parameters of a genetic model, they are preferred by many researchers at early stages in the study of a complex disease. We introduce the history of model-free linkage analysis in Subheading 1. Table 1 describes a standard model-free linkage analysis workflow. We describe three popular model-free linkage analysis methods, the nonparametric linkage (NPL) statistic, the affected sib-pair (ASP) likelihood ratio test, and a likelihood approach for pedigrees. The theory behind each linkage test is described in this section, together with a simple example of the relevant calculations. Table 4 provides a summary of popular genetic analysis software packages that implement model-free linkage models. In Subheading 2, we work through the methods on a rich example providing sample software code and output. Subheading 3 contains notes with additional details on various topics that may need further consideration during analysis.
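
    As a concrete illustration of the allele-sharing idea, the sketch below computes the classical affected sib-pair "mean test", comparing observed identity-by-descent (IBD) sharing against the null expectation of 50% sharing. This is a textbook statistic, not code from the chapter, and the input counts are hypothetical.

      import math

      def asp_mean_test(n_ibd0, n_ibd1, n_ibd2):
          """Mean test for affected sib pairs.
          Under no linkage, a pair shares 0/1/2 alleles IBD with probabilities 1/4, 1/2, 1/4,
          so the mean proportion of alleles shared IBD is 0.5 (per-pair variance 1/8)."""
          n = n_ibd0 + n_ibd1 + n_ibd2
          mean_sharing = (0.5 * n_ibd1 + 1.0 * n_ibd2) / n   # observed mean IBD proportion
          z = (mean_sharing - 0.5) / math.sqrt(1.0 / (8.0 * n))
          return z

      # hypothetical counts of affected sib pairs sharing 0, 1, 2 alleles IBD at a marker
      print(asp_mean_test(18, 52, 30))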

  3. XPC Ala499Val and XPG Asp1104His polymorphisms and digestive system cancer risk: a meta-analysis based on model-free approach.

    Science.gov (United States)

    Yu, Guangsheng; Wang, Jianlu; Dong, Jiahong; Liu, Jun

    2015-01-01

    Many studies have reported the association between XPC Ala499Val and XPG Asp1104His polymorphisms and digestive system cancer susceptibility, but the results were inconclusive. We performed a meta-analysis, using a comprehensive strategy based on the allele model and a model-free approach, to derive a more precise estimation of the relationship between XPC Ala499Val and XPG Asp1104His polymorphisms and digestive system cancer risk. For XPC Ala499Val, no significant cancer risk was found in the allele model (OR = 0.98, 95% CI: 0.86-1.11) and with the model-free approach (ORG = 0.97, 95% CI: 0.83-1.13). For XPG Asp1104His, there was also no association between this polymorphism and cancer risk in the allele model (OR = 1.03, 95% CI: 0.96-1.11) and with the model-free approach (ORG = 1.04, 95% CI: 0.95-1.14). Therefore, this meta-analysis suggests that the XPC Ala499Val and XPG Asp1104His polymorphisms were not associated with digestive system cancer risk. Further large and well-designed studies are needed to confirm these findings.
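
    For readers unfamiliar with how such pooled odds ratios are obtained, the following is a minimal fixed-effect (inverse-variance) pooling sketch on log odds ratios. It is a generic meta-analysis calculation with made-up 2x2 tables, not the authors' data or their model-free ORG statistic.

      import math

      def pooled_or(study_tables):
          """Fixed-effect inverse-variance pooling of per-study odds ratios.
          Each table is (a, b, c, d): exposed cases, unexposed cases, exposed controls, unexposed controls."""
          num, den = 0.0, 0.0
          for a, b, c, d in study_tables:
              log_or = math.log((a * d) / (b * c))
              var = 1/a + 1/b + 1/c + 1/d          # Woolf variance of the log OR
              w = 1.0 / var
              num += w * log_or
              den += w
          pooled_log = num / den
          se = math.sqrt(1.0 / den)
          ci = (math.exp(pooled_log - 1.96 * se), math.exp(pooled_log + 1.96 * se))
          return math.exp(pooled_log), ci

      # hypothetical 2x2 tables from three studies
      print(pooled_or([(40, 60, 35, 65), (55, 45, 50, 50), (30, 70, 28, 72)]))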

  4. Model-free data analysis for source separation based on Non-Negative Matrix Factorization and k-means clustering (NMFk)

    Science.gov (United States)

    Vesselinov, V. V.; Alexandrov, B.

    2014-12-01

    The identification of the physical sources causing spatial and temporal fluctuations of state variables such as river stage levels and aquifer hydraulic heads is challenging. The fluctuations can be caused by variations in natural and anthropogenic sources such as precipitation events, infiltration, groundwater pumping, barometric pressures, etc. The source identification and separation can be crucial for conceptualization of the hydrological conditions and characterization of system properties. If the original signals that cause the observed state-variable transients can be successfully "unmixed", decoupled physics models may then be applied to analyze the propagation of each signal independently. We propose a new model-free inverse analysis of transient data based on the Non-negative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with a k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the system. A classical BSS conundrum is the so-called "cocktail-party" problem where several microphones are recording the sounds in a ballroom (music, conversations, noise, etc.). Each of the microphones is recording a mixture of the sounds. The goal of BSS is to "unmix" and reconstruct the original sounds from the microphone records. Similarly to the "cocktail-party" problem, our model-free analysis only requires information about state-variable transients at a number of observation points, m, where m > r, and r is the number of unknown unique sources causing the observed fluctuations. We apply the analysis to a dataset from the Los Alamos National Laboratory (LANL) site. We identify the sources as barometric pressure and water-supply pumping effects and estimate their respective impacts. We also estimate the
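
    A minimal sketch of the NMF-plus-k-means idea described above (not the LANL NMFk code): factorize the observed mixtures for several trial numbers of sources, then cluster the extracted source signatures across random restarts to judge which number of sources is reproducible. It uses scikit-learn, and the signal matrix is synthetic.

      import numpy as np
      from sklearn.decomposition import NMF
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      # synthetic mixtures: m = 6 observation points, 200 time samples, r = 2 hidden sources
      S_true = np.abs(rng.normal(size=(2, 200)))
      A_true = np.abs(rng.normal(size=(6, 2)))
      X = A_true @ S_true

      for r in (2, 3, 4):                       # trial number of sources
          sources = []
          for restart in range(10):             # multiple random restarts
              model = NMF(n_components=r, init='random', random_state=restart, max_iter=500)
              W = model.fit_transform(X)        # mixing-matrix estimate
              H = model.components_             # source-signal estimates
              sources.append(H / np.linalg.norm(H, axis=1, keepdims=True))
          pool = np.vstack(sources)             # all extracted sources from all restarts
          labels = KMeans(n_clusters=r, n_init=10, random_state=0).fit_predict(pool)
          # a reproducible solution puts one source from each restart into each cluster
          print(r, np.bincount(labels))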

  5. Model-free execution monitoring in behavior-based robotics.

    Science.gov (United States)

    Pettersson, Ola; Karlsson, Lars; Saffiotti, Alessandro

    2007-08-01

    In the near future, autonomous mobile robots are expected to help humans by performing service tasks in many different areas, including personal assistance, transportation, cleaning, mining, or agriculture. In order to manage these tasks in a changing and partially unpredictable environment without the aid of humans, the robot must have the ability to plan its actions and to execute them robustly and safely. The robot must also have the ability to detect when the execution does not proceed as planned and to correctly identify the causes of the failure. An execution monitoring system allows the robot to detect and classify these failures. Most current approaches to execution monitoring in robotics are based on the idea of predicting the outcomes of the robot's actions by using some sort of predictive model and comparing the predicted outcomes with the observed ones. In contrast, this paper explores the use of model-free approaches to execution monitoring, that is, approaches that do not use predictive models. In this paper, we show that pattern recognition techniques can be applied to realize model-free execution monitoring by classifying observed behavioral patterns into normal or faulty execution. We investigate the use of several such techniques and verify their utility in a number of experiments involving the navigation of a mobile robot in indoor environments.
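
    A minimal sketch of the pattern-recognition view of model-free monitoring: a classifier is trained on feature vectors summarizing observed robot behavior labeled as normal or faulty, then used to flag new executions. It uses scikit-learn with synthetic features and is not the authors' monitoring system.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(1)
      # synthetic behavioral features (e.g., mean speed, heading variance, wheel-current level)
      normal = rng.normal(loc=[0.4, 0.1, 1.0], scale=0.05, size=(100, 3))
      faulty = rng.normal(loc=[0.1, 0.4, 1.6], scale=0.05, size=(100, 3))
      X = np.vstack([normal, faulty])
      y = np.array([0] * 100 + [1] * 100)       # 0 = normal execution, 1 = faulty execution

      monitor = SVC(kernel='rbf').fit(X, y)     # model-free: no predictive model of the robot
      new_window = np.array([[0.12, 0.38, 1.55]])
      print("fault detected" if monitor.predict(new_window)[0] == 1 else "normal")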

  6. A novel model-free data analysis technique based on clustering in a mutual information space: application to resting-state fMRI

    Directory of Open Access Journals (Sweden)

    Simon Benjaminsson

    2010-08-01

    Full Text Available Non-parametric data-driven analysis techniques can be used to study datasets with few assumptions about the data and underlying experiment. Variations of Independent Component Analysis (ICA) have been the methods mostly used on fMRI data, e.g. in finding resting-state networks thought to reflect the connectivity of the brain. Here we present a novel data analysis technique and demonstrate it on resting-state fMRI data. It is a generic method with few underlying assumptions about the data. The results are built from the statistical relations between all input voxels, resulting in a whole-brain analysis on a voxel level. It has good scalability properties and the parallel implementation is capable of handling large datasets and databases. From the mutual information between the activities of the voxels over time, a distance matrix is created for all voxels in the input space. Multidimensional scaling is used to put the voxels in a lower-dimensional space reflecting the dependency relations based on the distance matrix. By performing clustering in this space we can find the strong statistical regularities in the data, which for the resting-state data turn out to be the resting-state networks. The decomposition is performed in the last step of the algorithm and is computationally simple. This allows for rapid analysis and visualization of the data on different spatial levels, as well as for automatically finding a suitable number of decomposition components.
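
    A minimal sketch of the pipeline described above: estimate pairwise mutual information between voxel time courses, turn it into a distance matrix, embed with multidimensional scaling, and cluster. A simple histogram-based mutual information and scikit-learn are used for illustration; the original implementation is parallel and operates on whole-brain data.

      import numpy as np
      from sklearn.metrics import mutual_info_score
      from sklearn.manifold import MDS
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(2)
      n_voxels, n_timepoints = 30, 200
      signals = rng.normal(size=(n_voxels, n_timepoints))   # stand-in for voxel time courses

      def mi(x, y, bins=8):
          """Histogram-based mutual information between two time courses."""
          cx = np.digitize(x, np.histogram_bin_edges(x, bins))
          cy = np.digitize(y, np.histogram_bin_edges(y, bins))
          return mutual_info_score(cx, cy)

      M = np.array([[mi(signals[i], signals[j]) for j in range(n_voxels)] for i in range(n_voxels)])
      D = M.max() - M                                        # high mutual information -> small distance
      np.fill_diagonal(D, 0.0)
      embedding = MDS(n_components=3, dissimilarity='precomputed', random_state=0).fit_transform(D)
      labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(embedding)
      print(labels)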

  7. Quasi-Monte Carlo based global uncertainty and sensitivity analysis in modeling free product migration and recovery from petroleum-contaminated aquifers.

    Science.gov (United States)

    He, Li; Huang, Gordon; Lu, Hongwei; Wang, Shuo; Xu, Yi

    2012-06-15

    This paper presents a global uncertainty and sensitivity analysis (GUSA) framework based on global sensitivity analysis (GSA) and generalized likelihood uncertainty estimation (GLUE) methods. Quasi-Monte Carlo (QMC) is employed by GUSA to obtain realizations of uncertain parameters, which are then input to the simulation model for analysis. Compared to GLUE, GUSA can not only evaluate global sensitivity and uncertainty of modeling parameter sets, but also quantify the uncertainty in modeling prediction sets. Moreover, another advantage of GUSA lies in alleviating computational effort, since globally insensitive parameters can be identified and removed from the uncertain-parameter set. GUSA is applied to a practical petroleum-contaminated site in Canada to investigate free product migration and recovery processes under aquifer remediation operations. Results from global sensitivity analysis show that (1) initial free product thickness has the most significant impact on total recovery volume but least impact on residual free product thickness and recovery rate; (2) total recovery volume and recovery rate are sensitive to residual LNAPL phase saturations and soil porosity. Results from uncertainty predictions reveal that the residual thickness would remain high and almost unchanged after about half a year of the skimmer-well scheme; the rather high residual thickness (0.73-1.56 m 20 years later) indicates that natural attenuation would not be suitable for the remediation. The largest total recovery volume would be from water pumping, followed by vacuum pumping, and then skimmer. The recovery rates of the three schemes would rapidly decrease after 2 years (less than 0.05 m(3)/day), thus short-term remediation is not suggested.
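
    The quasi-Monte Carlo sampling step described above can be illustrated with SciPy's Sobol sequence generator: draw low-discrepancy parameter sets, run the simulation model for each, and screen how the output varies with each parameter. The "model" below is a placeholder function, not the petroleum-site simulator, and the parameter ranges are invented.

      import numpy as np
      from scipy.stats import qmc, spearmanr

      # uncertain parameters: (initial free-product thickness [m], soil porosity [-], residual LNAPL saturation [-])
      lower = np.array([0.2, 0.25, 0.05])
      upper = np.array([1.5, 0.45, 0.25])

      sampler = qmc.Sobol(d=3, scramble=True, seed=0)
      unit_samples = sampler.random_base2(m=8)            # 2**8 = 256 quasi-random points in [0,1)^3
      params = qmc.scale(unit_samples, lower, upper)      # map to physical ranges

      def placeholder_model(p):
          """Stand-in for the free-product recovery simulator: returns a fake total recovery volume."""
          thickness, porosity, sat = p
          return 120.0 * thickness * porosity * (1.0 - sat)

      outputs = np.array([placeholder_model(p) for p in params])
      # crude sensitivity screen: rank correlation between each parameter and the output
      for name, col in zip(("thickness", "porosity", "residual saturation"), params.T):
          print(name, round(spearmanr(col, outputs)[0], 3))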

  8. Neural computations underlying arbitration between model-based and model-free learning.

    Science.gov (United States)

    Lee, Sang Wan; Shimojo, Shinsuke; O'Doherty, John P

    2014-02-01

    There is accumulating neural evidence to support the existence of two distinct systems for guiding action selection, a deliberative "model-based" and a reflexive "model-free" system. However, little is known about how the brain determines which of these systems controls behavior at one moment in time. We provide evidence for an arbitration mechanism that allocates the degree of control over behavior by model-based and model-free systems as a function of the reliability of their respective predictions. We show that the inferior lateral prefrontal and frontopolar cortex encode both reliability signals and the output of a comparison between those signals, implicating these regions in the arbitration process. Moreover, connectivity between these regions and model-free valuation areas is negatively modulated by the degree of model-based control in the arbitrator, suggesting that arbitration may work through modulation of the model-free valuation system when the arbitrator deems that the model-based system should drive behavior.

  9. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task.

    Science.gov (United States)

    Skatova, Anya; Chan, Patricia A; Daw, Nathaniel D

    2013-01-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, vs. another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans' scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes.

  10. Model-free analysis for large proteins at high magnetic field strengths.

    Science.gov (United States)

    Chang, Shou-Lin; Hinck, Andrew P; Ishima, Rieko

    2007-08-01

    Protein backbone dynamics is often characterized using model-free analysis of three sets of (15)N relaxation data: longitudinal relaxation rate (R1), transverse relaxation rate (R2), and (15)N-{H} NOE values. Since the experimental data is limited, a simplified model-free spectral density function is often used that contains one Lorentzian describing overall rotational correlation but not one describing internal motion. The simplified spectral density function may also be used in estimating the overall rotational correlation time, by making the R2/R1 ratio largely insensitive to internal motions, as well as serving as one of the choices in the model selection protocol. However, such an approximation may not be valid for analysis of relaxation data of large proteins recorded at high magnetic field strengths, since the contribution to longitudinal relaxation from the Lorentzian describing the overall rotational diffusion of the molecule is comparatively small relative to that describing internal motion. Here, we quantitatively estimate the errors introduced by the use of the simplified spectral density in model-free analysis for large proteins at high magnetic field strength.
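
    The spectral densities at issue can be written down directly. The sketch below implements the standard Lipari-Szabo model-free spectral density, with and without the internal-motion Lorentzian, so the size of the approximation discussed above can be probed numerically; the field strength and motional parameters are illustrative, not taken from the paper.

      import numpy as np

      def j_model_free(omega, s2, tau_m, tau_e):
          """Lipari-Szabo spectral density with overall tumbling (tau_m) and internal motion (tau_e)."""
          tau = 1.0 / (1.0 / tau_m + 1.0 / tau_e)    # effective internal correlation time
          return 0.4 * (s2 * tau_m / (1 + (omega * tau_m) ** 2) +
                        (1 - s2) * tau / (1 + (omega * tau) ** 2))

      def j_simplified(omega, s2, tau_m):
          """Simplified form keeping only the overall-tumbling Lorentzian."""
          return 0.4 * s2 * tau_m / (1 + (omega * tau_m) ** 2)

      # illustrative values: large protein (tau_m = 20 ns), fast internal motion (tau_e = 50 ps), S^2 = 0.85
      omega_N = 2 * np.pi * 81.1e6                   # approximate 15N Larmor frequency (rad/s) at 800 MHz
      for w in (0.0, omega_N):
          full = j_model_free(w, 0.85, 20e-9, 50e-12)
          simple = j_simplified(w, 0.85, 20e-9)
          print(f"omega = {w:.3e} rad/s: full = {full:.3e}, simplified = {simple:.3e}")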

  11. Model-free stochastic processes studied with q-wavelet-based informational tools

    Energy Technology Data Exchange (ETDEWEB)

    Perez, D.G. [Instituto de Fisica, Pontificia Universidad Catolica de Valparaiso (PUCV), 23-40025 Valparaiso (Chile)]. E-mail: dario.perez@ucv.cl; Zunino, L. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Ciencias Basicas, Facultad de Ingenieria, Universidad Nacional de La Plata (UNLP), 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: lucianoz@ciop.unlp.edu.ar; Martin, M.T. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina' s National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: mtmartin@venus.unlp.edu.ar; Garavaglia, M. [Centro de Investigaciones Opticas, C.C. 124 Correo Central, 1900 La Plata (Argentina) and Departamento de Fisica, Facultad de Ciencias Exactas, Universidad Nacional de La Plata, 1900 La Plata (Argentina)]. E-mail: garavagliam@ciop.unlp.edu.ar; Plastino, A. [Instituto de Fisica (IFLP), Facultad de Ciencias Exactas, Universidad Nacional de La Plata and Argentina' s National Council (CONICET), C.C. 727, 1900 La Plata (Argentina)]. E-mail: plastino@venus.unlp.edu.ar; Rosso, O.A. [Chaos and Biology Group, Instituto de Calculo, Facultad de Ciencias Exactas y Naturales, Universidad de Buenos Aires, Pabellon II, Ciudad Universitaria, 1428 Ciudad de Buenos Aires (Argentina)]. E-mail: oarosso@fibertel.com.ar

    2007-04-30

    We undertake a model-free investigation of stochastic processes employing q-wavelet based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in this way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework.

  12. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    Directory of Open Access Journals (Sweden)

    Anya eSkatova

    2013-09-01

    Full Text Available Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, versus another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans’ scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes.

  13. Model free approach to kinetic analysis of real-time hyperpolarized 13C magnetic resonance spectroscopy data.

    Directory of Open Access Journals (Sweden)

    Deborah K Hill

    Full Text Available Real-time detection of the rates of metabolic flux, or exchange rates of endogenous enzymatic reactions, is now feasible in biological systems using Dynamic Nuclear Polarization Magnetic Resonance. Derivation of reaction rate kinetics from this technique typically requires multi-compartmental modeling of dynamic data, and results are therefore model-dependent and prone to misinterpretation. We present a model-free formalism based on the ratio of total areas under the curve (AUC of the injected and product metabolite, for example pyruvate and lactate. A theoretical framework to support this novel analysis approach is described, and demonstrates that the AUC ratio is proportional to the forward rate constant k. We show that the model-free approach strongly correlates with k for whole cell in vitro experiments across a range of cancer cell lines, and detects response in cells treated with the pan-class I PI3K inhibitor GDC-0941 with comparable or greater sensitivity. The same result is seen in vivo with tumor xenograft-bearing mice, in control tumors and following drug treatment with dichloroacetate. An important finding is that the area under the curve is independent of both the input function and of any other metabolic pathways arising from the injected metabolite. This model-free approach provides a robust and clinically relevant alternative to kinetic model-based rate measurements in the clinical translation of hyperpolarized (13)C metabolic imaging in humans, where measurement of the input function can be problematic.
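
    The area-under-the-curve ratio at the heart of this approach is straightforward to compute. Below is a minimal sketch using trapezoidal integration of hypothetical pyruvate and lactate signal curves; it is not the authors' processing pipeline and the curves are made up.

      import numpy as np

      t = np.linspace(0, 60, 121)                       # time after injection (s), hypothetical sampling
      pyruvate = 100.0 * np.exp(-t / 25.0)              # injected metabolite signal (arbitrary units)
      lactate = 18.0 * (np.exp(-t / 40.0) - np.exp(-t / 10.0))   # product metabolite signal

      auc_ratio = np.trapz(lactate, t) / np.trapz(pyruvate, t)
      print(f"AUC(lactate) / AUC(pyruvate) = {auc_ratio:.3f}")   # proportional to the forward rate constant k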

  14. Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems

    Directory of Open Access Journals (Sweden)

    Zhe Dong

    2016-01-01

    Full Text Available The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to their small size, MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the multi-modular operation scheme, the inherent safety feature of the MHTGRs is applicable to large nuclear plants of any desired power rating. The MHTGR-based nuclear steam supply system (NSSS) is constituted by an MHTGR, a side-by-side arranged helical-coil once-through steam generator (OTSG) and some connecting pipes. Due to the side-by-side arrangement, there is a tight coupling effect between the MHTGR and OTSG. Moreover, parameter perturbations of the NSSSs always exist. Thus, it is meaningful to study the model-free coordinated control of MHTGR-based NSSSs for safe, stable, robust and efficient operation. In this paper, a new model-free coordinated control strategy that regulates the nuclear power, MHTGR outlet helium temperature and OTSG outlet overheated steam temperature by properly adjusting the control rod position, helium flowrate and feed-water flowrate is established for the MHTGR-based NSSSs. Sufficient conditions for global asymptotic closed-loop stability are given. Finally, numerical simulation results in the cases of large range power decrease and increase illustrate the satisfactory performance of this newly-developed model-free coordinated NSSS control law.

  15. Model-free prediction and regression a transformation-based approach to inference

    CERN Document Server

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...

  16. Model-free methods of analyzing domain motions in proteins from simulation : A comparison of normal mode analysis and molecular dynamics simulation of lysozyme

    NARCIS (Netherlands)

    Hayward, S; Kitao, A; Berendsen, HJC

    1997-01-01

    Model-free methods are introduced to determine quantities pertaining to protein domain motions from normal mode analyses and molecular dynamics simulations. For the normal mode analysis, the methods are based on the assumption that in low frequency modes, domain motions can be well approximated by m

  17. Model-based learning and the contribution of the orbitofrontal cortex to the model-free world.

    Science.gov (United States)

    McDannald, Michael A; Takahashi, Yuji K; Lopatina, Nina; Pietras, Brad W; Jones, Josh L; Schoenbaum, Geoffrey

    2012-04-01

    Learning is proposed to occur when there is a discrepancy between reward prediction and reward receipt. At least two separate systems are thought to exist: one in which predictions are proposed to be based on model-free or cached values; and another in which predictions are model-based. A basic neural circuit for model-free reinforcement learning has already been described. In the model-free circuit the ventral striatum (VS) is thought to supply a common-currency reward prediction to midbrain dopamine neurons that compute prediction errors and drive learning. In a model-based system, predictions can include more information about an expected reward, such as its sensory attributes or current, unique value. This detailed prediction allows for both behavioral flexibility and learning driven by changes in sensory features of rewards alone. Recent evidence from animal learning and human imaging suggests that, in addition to model-free information, the VS also signals model-based information. Further, there is evidence that the orbitofrontal cortex (OFC) signals model-based information. Here we review these data and suggest that the OFC provides model-based information to this traditional model-free circuitry and offer possibilities as to how this interaction might occur.

  18. Dissolution process analysis using model-free Noyes-Whitney integral equation.

    Science.gov (United States)

    Hattori, Yusuke; Haruna, Yoshimasa; Otsuka, Makoto

    2013-02-01

    Drug dissolution from solid dosage forms is theoretically described by the Noyes-Whitney-Nernst equation. However, analysis of the process is usually demonstrated under assumed models, and such model-dependent methods are idealized and carry certain limitations. In this study, a Noyes-Whitney integral equation was proposed and applied to represent the drug dissolution profiles of a solid formulation via the non-linear least squares (NLLS) method. The integral equation is a model-free formula involving the dissolution rate constant as a parameter. In the present study, several solid formulations were prepared by changing the blending time of magnesium stearate (MgSt) with theophylline monohydrate, α-lactose monohydrate, and crystalline cellulose. The formula could represent the dissolution profile excellently, and thereby the rate constant and specific surface area could be obtained by the NLLS method. Because long blending times coated the particle surface with MgSt, water permeation was found to be disturbed by this layer, hindering dissociation into disintegrant particles. In the end, the solid formulations did not disintegrate; however, the specific surface area gradually increased during dissolution. X-ray CT observation supported this result and demonstrated that surface roughening dominated over dissolution, and thus the specific surface area of the solid formulation gradually increased.

  19. Model-Free Primitive-Based Iterative Learning Control Approach to Trajectory Tracking of MIMO Systems With Experimental Validation.

    Science.gov (United States)

    Radac, Mircea-Bogdan; Precup, Radu-Emil; Petriu, Emil M

    2015-11-01

    This paper proposes a novel model-free trajectory tracking of multiple-input multiple-output (MIMO) systems by the combination of iterative learning control (ILC) and primitives. The optimal trajectory tracking solution is obtained in terms of previously learned solutions to simple tasks called primitives. The library of primitives that are stored in memory consists of pairs of reference input/controlled output signals. The reference input primitives are optimized in a model-free ILC framework without using knowledge of the controlled process. The guaranteed convergence of the learning scheme is built upon a model-free virtual reference feedback tuning design of the feedback decoupling controller. Each new complex trajectory to be tracked is decomposed into the output primitives regarded as basis functions. The optimal reference input for the control system to track the desired trajectory is next recomposed from the reference input primitives. This is advantageous because the optimal reference input is computed straightforward without the need to learn from repeated executions of the tracking task. In addition, the optimization problem specific to trajectory tracking of square MIMO systems is decomposed in a set of optimization problems assigned to each separate single-input single-output control channel that ensures a convenient model-free decoupling. The new model-free primitive-based ILC approach is capable of planning, reasoning, and learning. A case study dealing with the model-free control tuning for a nonlinear aerodynamic system is included to validate the new approach. The experimental results are given.

  20. Model-free functional MRI analysis for detecting low-frequency functional connectivity in the human brain

    Science.gov (United States)

    Wismueller, Axel; Lange, Oliver; Auer, Dorothee; Leinsinger, Gerda

    2010-03-01

    Slowly varying temporally correlated activity fluctuations between functionally related brain areas have been identified by functional magnetic resonance imaging (fMRI) research in recent years. These low-frequency oscillations of less than 0.08 Hz appear to play a major role in various dynamic functional brain networks, such as the so-called 'default mode' network. They also have been observed as a property of symmetric cortices, and they are known to be present in the motor cortex among others. These low-frequency data are difficult to detect and quantify in fMRI. Traditionally, user-defined regions of interest (ROIs) or 'seed clusters' have been the primary analysis method. In this paper, we propose unsupervised clustering algorithms based on various distance measures to detect functional connectivity in resting state fMRI. The achieved results are evaluated quantitatively for different distance measures. The Euclidean metric implemented by standard unsupervised clustering approaches is compared with a non-metric topographic mapping of proximities based on the mutual prediction error between pixel-specific signal dynamics time-series. It is shown that functional connectivity in the motor cortex of the human brain can be detected based on such model-free analysis methods for resting state fMRI.

  1. A Model-free Approach to Fault Detection of Continuous-time Systems Based on Time Domain Data

    Institute of Scientific and Technical Information of China (English)

    Ping Zhang; Steven X. Ding

    2007-01-01

    In this paper, a model-free approach is presented to design an observer-based fault detection system of linear continuous-time systems based on input and output data in the time domain. The core of the approach is to directly identify parameters of the observer-based residual generator based on a numerically reliable data equation obtained by filtering and sampling the input and output signals.

  2. Pitch control of wind turbines using model free adaptive control based on wind turbine code

    DEFF Research Database (Denmark)

    Zhang, Yunqian; Chen, Zhe; Cheng, Ming;

    2011-01-01

    As the wind turbine is a nonlinear high-order system, to achieve good pitch control performance, model free adaptive control (MFAC) approach which doesn't need the mathematical model of the wind turbine is adopted in the pitch control system in this paper. A pseudo gradient vector whose estimation...

  3. Model-free control

    Science.gov (United States)

    Fliess, Michel; Join, Cédric

    2013-12-01

    'Model-free control' and the corresponding 'intelligent' PID controllers (iPIDs), which already had many successful concrete applications, are presented here for the first time in a unified manner, where the new advances are taken into account. The basics of model-free control now employ some old functional analysis and some elementary differential algebra. The estimation techniques become quite straightforward via a recent online parameter identification approach. The importance of iPIs and especially of iPs is deduced from the presence of friction. The strange industrial ubiquity of classic PIDs and the great difficulty of tuning them in complex situations are deduced, via an elementary sampling, from their connections with iPIDs. Several numerical simulations are presented which include some infinite-dimensional systems. They demonstrate not only the power of our intelligent controllers but also the great simplicity of tuning them.
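
    As a rough illustration of the idea behind these 'intelligent' controllers, the sketch below implements a discrete-time intelligent proportional (iP) loop built on a first-order ultra-local model of the form y' = F + alpha*u, with F re-estimated at every step from the latest measurement. The plant, gains and alpha are made up, and this is only a schematic reading of the approach, not the authors' algorithm.

      import numpy as np

      dt, alpha, kp = 0.01, 2.0, 6.0            # sample time, ultra-local gain, proportional gain (all illustrative)

      def plant_step(y, u):
          """Unknown-to-the-controller plant, used only to generate measurements."""
          return y + dt * (-1.5 * y + 3.0 * u + 0.5 * np.sin(5 * y))

      y, u, y_prev = 0.0, 0.0, 0.0
      setpoint = 1.0
      for k in range(600):
          y_dot = (y - y_prev) / dt             # crude numerical derivative of the measured output
          F_hat = y_dot - alpha * u             # estimate of the lumped unknown dynamics F
          e = setpoint - y                      # tracking error (constant reference, so its derivative is 0)
          u = (-F_hat + kp * e) / alpha         # intelligent P control law
          y_prev, y = y, plant_step(y, u)
      print(round(y, 3))                        # should settle close to the setpoint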

  4. Kinetics of the Thermal Degradation of Granulated Scrap Tyres: a Model-free Analysis

    Directory of Open Access Journals (Sweden)

    Félix A. LÓPEZ

    2013-12-01

    Full Text Available Pyrolysis is a technology with a promising future in the recycling of scrap tyres. This paper determines the thermal decomposition behaviour and kinetics of granulated scrap tyres (GST) by examining the thermogravimetric/derivative thermogravimetric (TGA/DTG) data obtained during their pyrolysis in an inert atmosphere at different heating rates. The model-free methods of Friedman, Flynn-Wall-Ozawa and Coats-Redfern were used to determine the reaction kinetics from the DTG data. The apparent activation energy and pre-exponential factor for the degradation of GST were calculated. A comparison with the results obtained by other authors was made. DOI: http://dx.doi.org/10.5755/j01.ms.19.4.2947
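
    A minimal sketch of the Friedman isoconversional (model-free) calculation on synthetic TGA data: at a fixed conversion level, plot ln(d(alpha)/dt) against 1/T over several heating rates, and the slope gives -Ea/R. The data below are synthetic first-order decomposition curves, not the scrap-tyre measurements, and the kinetic constants are invented.

      import numpy as np

      R = 8.314                                    # J/(mol K)
      Ea_true, A = 150e3, 1e12                     # synthetic "true" kinetics used only to generate data

      def simulate(beta, T):
          """Integrate d(alpha)/dT = (A/beta) * exp(-Ea/RT) * (1 - alpha) for heating rate beta (K/min)."""
          alpha = np.zeros_like(T)
          for i in range(1, len(T)):
              dadT = (A / (beta / 60.0)) * np.exp(-Ea_true / (R * T[i-1])) * (1 - alpha[i-1])
              alpha[i] = min(alpha[i-1] + dadT * (T[i] - T[i-1]), 0.999999)
          return alpha

      T = np.linspace(450, 750, 3001)              # temperature grid (K)
      points = []
      for beta in (5.0, 10.0, 20.0):               # heating rates in K/min
          alpha = simulate(beta, T)
          i = np.searchsorted(alpha, 0.5)          # index where conversion reaches 50 %
          dadt = np.gradient(alpha, T)[i] * (beta / 60.0)   # d(alpha)/dt = d(alpha)/dT * dT/dt
          points.append((1.0 / T[i], np.log(dadt)))

      x, y = np.array(points).T
      slope = np.polyfit(x, y, 1)[0]               # Friedman plot: slope = -Ea/R
      print(f"estimated Ea = {-slope * R / 1000:.1f} kJ/mol (true value {Ea_true/1000:.0f})")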

  5. Model-free analysis of quadruply imaged gravitationally lensed systems and substructured galaxies

    CERN Document Server

    Woldesenbet, Addishiwot Girma

    2015-01-01

    Multiple-image gravitational lens systems, and especially quads, are invaluable in determining the amount and distribution of mass in galaxies. This is usually done by mass modeling using parametric or free-form methods. An alternative way of extracting information about lens mass distribution is to use lensing degeneracies and invariants. Where applicable, they allow one to draw conclusions about whole classes of lenses without model fitting. Here, we use approximate, but observationally useful invariants formed by the three relative polar angles of quad images around the lens center to show that many smooth elliptical+shear lenses can reproduce the same set of quad image angles within observational error. This result allows us to show in a model-free way what the general class of smooth elliptical+shear lenses looks like in the three dimensional (3D) space of image relative angles, and that this distribution does not match that of the observed quads. We conclude that, even though smooth elliptical+shear lens...

  6. Application of the quasi-spectral density function of (15)N nuclei to the selection of a motional model for model-free analysis.

    Science.gov (United States)

    Ishima, R; Yamasaki, K; Nagayama, K

    1995-12-01

    Parameters used in model-free analysis were related to simulated spectral density functions in a frequency region experimentally obtained by quasi-spectral density function analysis of (15)N nuclei. Five kinds of motional models used in recent model-free analyses were characterized by a simple classification of the experimental spectral density function. We demonstrate advantages and limitations of each of the motional models. To verify the character of the models, model selection using experimental spectral density functions was examined.

  7. Processing speed enhances model-based over model-free reinforcement learning in the presence of high working memory functioning.

    Science.gov (United States)

    Schad, Daniel J; Jünger, Elisabeth; Sebold, Miriam; Garbusow, Maria; Bernhardt, Nadine; Javadi, Amir-Homayoun; Zimmermann, Ulrich S; Smolka, Michael N; Heinz, Andreas; Rapp, Michael A; Huys, Quentin J M

    2014-01-01

    Theories of decision-making and its neural substrates have long assumed the existence of two distinct and competing valuation systems, variously described as goal-directed vs. habitual, or, more recently and based on statistical arguments, as model-free vs. model-based reinforcement-learning. Though both have been shown to control choices, the cognitive abilities associated with these systems are under ongoing investigation. Here we examine the link to cognitive abilities, and find that individual differences in processing speed covary with a shift from model-free to model-based choice control in the presence of above-average working memory function. This suggests shared cognitive and neural processes; provides a bridge between literatures on intelligence and valuation; and may guide the development of process models of different valuation components. Furthermore, it provides a rationale for individual differences in the tendency to deploy valuation systems, which may be important for understanding the manifold neuropsychiatric diseases associated with malfunctions of valuation.

  8. Processing speed enhances model-based over model-free reinforcement learning in the presence of high working memory functioning

    Directory of Open Access Journals (Sweden)

    Daniel J. Schad

    2014-12-01

    Full Text Available Theories of decision-making and its neural substrates have long assumed the existence of two distinct and competing valuation systems, variously described as goal-directed versus habitual, or, more recently and based on statistical arguments, as model-free versus model-based reinforcement-learning. Though both have been shown to control choices, the cognitive abilities associated with these systems are under ongoing investigation. Here we examine the link to cognitive abilities, and find that individual differences in processing speed covary with a shift from model-free to model-based choice control in the presence of above-average working memory function. This suggests shared cognitive and neural processes; provides a bridge between literatures on intelligence and valuation; and may guide the development of process models of different valuation components. Furthermore, it provides a rationale for individual differences in the tendency to deploy valuation systems, which may be important for understanding the manifold neuropsychiatric diseases associated with malfunctions of valuation.

  9. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) with model-based reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turn to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by the efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS).

  10. Nearly data-based optimal control for linear discrete model-free systems with delays via reinforcement learning

    Science.gov (United States)

    Zhang, Jilie; Zhang, Huaguang; Wang, Binrui; Cai, Tiaoyang

    2016-05-01

    In this paper, a nearly data-based optimal control scheme is proposed for linear discrete model-free systems with delays. The nearly optimal control can be obtained using only measured input/output data from the systems, by a reinforcement learning technique that combines Q-learning with a value iteration algorithm. First, we construct a state estimator by using the measured input/output data. Second, the quadratic functional is used to approximate the value function at each point in the state space, and the data-based control is designed by the Q-learning method using the obtained state estimator. Then, the paper presents the method for solving the optimal inner kernel matrix in the least-squares sense by the value iteration algorithm. Finally, numerical examples are given to illustrate the effectiveness of our approach.

  11. Vision-Based Autonomous Underwater Vehicle Navigation in Poor Visibility Conditions Using a Model-Free Robust Control

    Directory of Open Access Journals (Sweden)

    Ricardo Pérez-Alcocer

    2016-01-01

    Full Text Available This paper presents a vision-based navigation system for an autonomous underwater vehicle in semistructured environments with poor visibility. In terrestrial and aerial applications, the use of visual systems mounted on robotic platforms as control sensor feedback is commonplace. However, robotic vision-based tasks for underwater applications are still not widely considered, as the images captured in this type of environment tend to be blurred and/or color depleted. To tackle this problem, we have adapted the lαβ color space to identify features of interest in underwater images even in extreme visibility conditions. To guarantee the stability of the vehicle at all times, a model-free robust control is used. We have validated the performance of our visual navigation system in real environments, showing the feasibility of our approach.

  12. Lazy-Learning-Based Data-Driven Model-Free Adaptive Predictive Control for a Class of Discrete-Time Nonlinear Systems.

    Science.gov (United States)

    Hou, Zhongsheng; Liu, Shida; Tian, Taotao

    2016-05-18

    In this paper, a novel data-driven model-free adaptive predictive control method based on a lazy learning technique is proposed for a class of discrete-time single-input and single-output nonlinear systems. A key feature of the proposed approach is that the controller is designed using only the input-output (I/O) measurement data of the system, by means of a novel dynamic linearization technique with a new concept termed pseudogradient (PG). Moreover, the predictive function is implemented in the controller using a lazy-learning (LL)-based PG predictive algorithm, such that the controller not only shows good robustness but also can realize the effect of model-free adaptive prediction for sudden changes of the desired signal. Further, since the LL technique has the characteristic of database queries, both the online and offline I/O measurement data are fully and simultaneously utilized to adjust the controller parameters in real time during the control process. Moreover, the stability of the proposed method is guaranteed by rigorous mathematical analysis. Meanwhile, the numerical simulations and the laboratory experiments implemented on a practical three-tank water level control system both verify the effectiveness of the proposed approach.
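
    For orientation, the sketch below shows the simplest compact-form dynamic-linearization MFAC loop (pseudo-gradient estimation plus control update) on a toy nonlinear plant. It omits the lazy-learning predictive layer and the resetting rules of the full scheme described above, and all gains are illustrative.

      import numpy as np

      eta, mu = 0.5, 1.0        # pseudo-gradient estimator step size and weight
      rho, lam = 0.6, 1.0       # controller step size and weight (all values illustrative)

      def plant(y, u):
          """Toy unknown SISO nonlinear plant used only to produce measurements."""
          return 0.6 * y + 0.3 * np.tanh(y) + 1.2 * u

      phi = 1.0                                  # pseudo-gradient estimate
      y = [0.0, 0.0]                             # output history
      u = [0.0, 0.0]                             # input history
      y_ref = 1.0                                # desired output

      for k in range(100):
          du, dy = u[-1] - u[-2], y[-1] - y[-2]
          # pseudo-gradient update from the latest input/output increments
          phi = phi + eta * du / (mu + du**2) * (dy - phi * du)
          # compact-form MFAC control law
          u_new = u[-1] + rho * phi / (lam + phi**2) * (y_ref - y[-1])
          u.append(u_new)
          y.append(plant(y[-1], u_new))

      print(round(y[-1], 3))                     # output should approach y_ref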

  13. A Safe Interaction of Robot Assisted Rehabilitation, Based on Model-Free Impedance Control with Singularity Avoidance

    Directory of Open Access Journals (Sweden)

    Iman Sharifi

    2014-12-01

    Full Text Available In this paper, a singularity-free control methodology for safe robot-human interaction is proposed using a hybrid control technique in robotic rehabilitation applications. With the use of max-plus algebra, a hybrid controller is designed to guarantee feasible robot motion in the vicinity of kinematic singularities, including passing through and staying at a singular configuration. The approach taken in this paper is based on model-free impedance control and hence does not require any information about the model except the upper bounds on the system matrix. The stability of the approach is investigated using multiple Lyapunov function theory. The proposed control algorithm is applied to the PUMA 560 robot arm, a six-axis industrial robot. The results demonstrate the validity of the proposed control scheme.

  14. Language acquisition is model-based rather than model-free.

    Science.gov (United States)

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  15. Barrier Lyapunov function-based model-free constraint position control for mechanical systems

    Energy Technology Data Exchange (ETDEWEB)

    Han, Seong Ik; Ha, Hyun Uk; Lee, Jang Myung [Pusan National University, Busan (Korea, Republic of)

    2016-07-15

    In this article, a motion constraint control scheme is presented for mechanical systems without a modeling process by introducing a barrier Lyapunov function technique and adaptive estimation laws. The transformed error and filtered error surfaces are defined to constrain the motion tracking error in the prescribed boundary layers. Unknown parameters of mechanical systems are estimated using adaptive laws derived from the Lyapunov function. Then, the robust control based on conventional sliding mode control, which gives rise to excessive chattering, is replaced by a finite-time-based control to alleviate undesirable chattering in the control action and to ensure finite-time error convergence. Finally, the constraint controller from the barrier Lyapunov function is designed and applied to constrain the position tracking error of the mechanical system. Two experimental examples, an XY table and an articulated manipulator, are shown to evaluate the proposed control scheme.

  16. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data

    Science.gov (United States)

    Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian

    2016-01-01

    Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus
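
    One data-driven feature of the kind discussed above is the preferred direction obtained as the circular (vector-average) mean of responses over stimulus directions, with no Gaussian or von Mises fit. The sketch below computes it from hypothetical firing rates; it illustrates the model-free idea, not the authors' estimation pipeline.

      import numpy as np

      directions_deg = np.arange(0, 360, 30)             # stimulus motion directions
      rates = np.array([5, 7, 12, 25, 48, 33, 15, 8, 6, 5, 4, 5], dtype=float)   # hypothetical mean firing rates (spikes/s)

      theta = np.deg2rad(directions_deg)
      # response-weighted vector average: no tuning-curve model is fitted
      vector = np.sum(rates * np.exp(1j * theta))
      preferred_deg = np.rad2deg(np.angle(vector)) % 360
      print(f"model-free preferred direction approx. {preferred_deg:.1f} deg")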

  17. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data.

    Directory of Open Access Journals (Sweden)

    Markus Helmer

    Full Text Available Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain. They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even

  18. Model-Free Estimation of Tuning Curves and Their Attentional Modulation, Based on Sparse and Noisy Data.

    Science.gov (United States)

    Helmer, Markus; Kozyrev, Vladislav; Stephan, Valeska; Treue, Stefan; Geisel, Theo; Battaglia, Demian

    2016-01-01

    Tuning curves are the functions that relate the responses of sensory neurons to various values within one continuous stimulus dimension (such as the orientation of a bar in the visual domain or the frequency of a tone in the auditory domain). They are commonly determined by fitting a model e.g. a Gaussian or other bell-shaped curves to the measured responses to a small subset of discrete stimuli in the relevant dimension. However, as neuronal responses are irregular and experimental measurements noisy, it is often difficult to determine reliably the appropriate model from the data. We illustrate this general problem by fitting diverse models to representative recordings from area MT in rhesus monkey visual cortex during multiple attentional tasks involving complex composite stimuli. We find that all models can be well-fitted, that the best model generally varies between neurons and that statistical comparisons between neuronal responses across different experimental conditions are affected quantitatively and qualitatively by specific model choices. As a robust alternative to an often arbitrary model selection, we introduce a model-free approach, in which features of interest are extracted directly from the measured response data without the need of fitting any model. In our attentional datasets, we demonstrate that data-driven methods provide descriptions of tuning curve features such as preferred stimulus direction or attentional gain modulations which are in agreement with fit-based approaches when a good fit exists. Furthermore, these methods naturally extend to the frequent cases of uncertain model selection. We show that model-free approaches can identify attentional modulation patterns, such as general alterations of the irregular shape of tuning curves, which cannot be captured by fitting stereotyped conventional models. Finally, by comparing datasets across different conditions, we demonstrate effects of attention that are cell- and even stimulus

  19. Side Chain Conformational Distributions of a Small Protein Derived from Model-Free Analysis of a Large Set of Residual Dipolar Couplings.

    Science.gov (United States)

    Li, Fang; Grishaev, Alexander; Ying, Jinfa; Bax, Ad

    2015-11-25

    Accurate quantitative measurement of structural dispersion in proteins remains a prime challenge to both X-ray crystallography and NMR spectroscopy. Here we use a model-free approach based on measurement of many residual dipolar couplings (RDCs) in differentially orienting aqueous liquid crystalline solutions to obtain the side chain χ1 distribution sampled by each residue in solution. Applied to the small well-ordered model protein GB3, our approach reveals that the RDC data are compatible with a single narrow distribution of side chain χ1 angles for only about 40% of the residues. For more than half of the residues, populations greater than 10% for a second rotamer are observed, and four residues require sampling of three rotameric states to fit the RDC data. In virtually all cases, sampled χ1 values are found to center closely around ideal g(-), g(+) and t rotameric angles, even though no rotamer restraint is used when deriving the sampled angles. The root-mean-square difference between experimental (3)JHαHβ couplings and those predicted by the Haasnoot-parametrized, motion-adjusted Karplus equation reduces from 2.05 to 0.75 Hz when using the new rotamer analysis instead of the 1.1-Å X-ray structure as input for the dihedral angles. A comparison between observed and predicted (3)JHαHβ values suggests that the root-mean-square amplitude of χ1 angle fluctuations within a given rotamer well is ca. 20°. The quantitatively defined side chain rotamer equilibria obtained from our study set new benchmarks for evaluating improved molecular dynamics force fields, and also will enable further development of quantitative relations between side chain chemical shift and structure.

  20. Connectivity Concordance Mapping: A New Tool for Model-Free Analysis of fMRI Data of the Human Brain

    Science.gov (United States)

    Lohmann, Gabriele; Ovadia-Caro, Smadar; Jungehülsing, Gerhard Jan; Margulies, Daniel S.; Villringer, Arno; Turner, Robert

    2011-01-01

    Functional magnetic resonance data acquired in a task-absent condition (“resting state”) require new data analysis techniques that do not depend on an activation model. Here, we propose a new analysis method called Connectivity Concordance Mapping (CCM). The main idea is to assign a label to each voxel based on the reproducibility of its whole-brain pattern of connectivity. Specifically, we compute the correlations of time courses of each voxel with every other voxel for each measurement. Voxels whose correlation pattern is consistent across measurements receive high values. The result of a CCM analysis is thus a voxel-wise map of concordance values. Regions of high inter-subject concordance can be assumed to be functionally consistent, and may thus be of specific interest for further analysis. Here we present two fMRI studies to demonstrate the possible applications of the algorithm. The first is an eyes-open/eyes-closed paradigm designed to highlight the potential of the method in a relatively simple domain. The second study is a longitudinal repeated measurement of a patient following stroke. Longitudinal clinical studies such as this may represent the most interesting domain of applications for this algorithm. PMID:22470320
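
    The core computation just described lends itself to a compact sketch. Below is a minimal numpy illustration of the idea, assuming each measurement is a (timepoints × voxels) array; the array sizes, the number of sessions, and the use of the mean pairwise correlation as the concordance score are illustrative assumptions, not the authors' exact implementation.

```python
import numpy as np

def concordance_map(measurements):
    """Toy Connectivity Concordance Mapping sketch.

    measurements: list of 2-D arrays, each (timepoints x voxels),
    e.g. one array per scanning session or subject.
    Returns one concordance value per voxel: the mean pairwise
    correlation of that voxel's whole-brain connectivity pattern
    across measurements.
    """
    # Whole-brain connectivity pattern of every voxel, per measurement
    conn = [np.corrcoef(m.T) for m in measurements]          # (voxels x voxels)
    n_vox = conn[0].shape[0]
    n_meas = len(conn)

    concordance = np.zeros(n_vox)
    for v in range(n_vox):
        # Connectivity profile of voxel v in each measurement (drop self-correlation)
        profiles = [np.delete(c[v], v) for c in conn]
        # Average correlation of this profile over all measurement pairs
        r = [np.corrcoef(profiles[i], profiles[j])[0, 1]
             for i in range(n_meas) for j in range(i + 1, n_meas)]
        concordance[v] = np.mean(r)
    return concordance

# Illustrative use with random data standing in for two sessions
rng = np.random.default_rng(0)
sessions = [rng.standard_normal((120, 50)) for _ in range(2)]
print(concordance_map(sessions)[:5])
```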

  1. Transition from 'model-based' to 'model-free' behavioral control in addiction: Involvement of the orbitofrontal cortex and dorsolateral striatum.

    Science.gov (United States)

    Lucantonio, Federica; Caprioli, Daniele; Schoenbaum, Geoffrey

    2014-01-01

    Cocaine addiction is a complex and multidimensional process involving a number of behavioral and neural forms of plasticity. The behavioral transition from voluntary drug use to compulsive drug taking may be explained at the neural level by drug-induced changes in function or interaction between a flexible planning system, associated with prefrontal cortical regions, and a rigid habit system, associated with the striatum. The dichotomy between these two systems is operationalized in computational theory by positing model-based and model-free learning mechanisms, the former relying on an "internal model" of the environment and the latter on pre-computed or cached values to control behavior. In this review, we will suggest that model-free and model-based learning mechanisms appear to be differentially affected, at least in the case of psychostimulants such as cocaine, with the former being enhanced while the latter are disrupted. As a result, the behavior of long-term drug users becomes less flexible and responsive to the desirability of expected outcomes and more habitual, based on the long history of reinforcement. To support our specific proposal, we will review recent neural and behavioral evidence on the effect of psychostimulant exposure on orbitofrontal and dorsolateral striatum structure and function. This article is part of a Special Issue entitled 'NIDA 40th Anniversary Issue'.

  2. A Full-Format Model-Free Adaptive Control Based on VRFT

    Institute of Scientific and Technical Information of China (English)

    孙淑杰; 苏成利; 侯立刚

    2013-01-01

    Considering that the information in the estimated values of the time-varying pseudo-gradient vector is not fully used in model-free adaptive control (MFAC), a model-free adaptive control method based on virtual reference feedback tuning (VRFT) is proposed. Following the reference-model idea of VRFT, the closed-loop transfer function formed by the full-format linearized model of the nonlinear system and the model-free adaptive controller is taken as the reference model, and the error between the reference-model output and the desired system output is used as the controller input. In this way the estimates of the pseudo-gradient vector at past time instants are introduced into the model-free adaptive control design, improving the utilization of the pseudo-partial-derivative estimates. At the same time the weighting factor in the control law is eliminated, so that control performance is no longer affected by its manual assignment.

  3. Systematic errors in detecting biased agonism: Analysis of current methods and development of a new model-free approach

    Science.gov (United States)

    Onaran, H. Ongun; Ambrosio, Caterina; Uğur, Özlem; Madaras Koncz, Erzsebet; Grò, Maria Cristina; Vezzi, Vanessa; Rajagopal, Sudarshan; Costa, Tommaso

    2017-01-01

    Discovering biased agonists requires a method that can reliably distinguish the bias in signalling due to unbalanced activation of diverse transduction proteins from that of differential amplification inherent to the system being studied, which invariably results from the non-linear nature of biological signalling networks and their measurement. We have systematically compared the performance of seven methods of bias diagnostics, all of which are based on the analysis of concentration-response curves of ligands according to classical receptor theory. We computed bias factors for a number of β-adrenergic agonists by comparing BRET assays of receptor-transducer interactions with Gs, Gi and arrestin. Using the same ligands, we also compared responses at signalling steps originating from the same receptor-transducer interaction, among which no biased efficacy is theoretically possible. In either case, we found a high level of false positive results and a general lack of correlation among methods. Altogether, this analysis shows that all tested methods, including some of the most widely used in the literature, fail to distinguish true ligand bias from “system bias” with confidence. We also propose two novel semi-quantitative methods of bias diagnostics that appear to be more robust and reliable than currently available strategies. PMID:28290478

  4. Model-free optimal controller design for continuous-time nonlinear systems by adaptive dynamic programming based on a precompensator.

    Science.gov (United States)

    Zhang, Jilie; Zhang, Huaguang; Liu, Zhenwei; Wang, Yingchun

    2015-07-01

    In this paper, we consider the problem of developing a controller for continuous-time nonlinear systems for which the governing equations are unknown. Using only measured data, two new online implementation schemes based on adaptive dynamic programming (ADP) are presented for synthesizing a controller without building or assuming a model for the system. To circumvent the requirement of prior knowledge of the system, a precompensator is introduced to construct an augmented system. The corresponding Hamilton-Jacobi-Bellman (HJB) equation is solved by adaptive dynamic programming, which combines a least-squares technique, a neural network approximator and a policy iteration (PI) algorithm. The main idea of the method is to sample the state, state-derivative and input information and use it to update the weights of the neural network by the least-squares technique; the update process is implemented within the PI framework. Finally, several examples are given to illustrate the effectiveness of the schemes.

  5. Whole-brain, time-locked activation with simple tasks revealed using massive averaging and model-free analysis

    Science.gov (United States)

    Gonzalez-Castillo, Javier; Saad, Ziad S.; Handwerker, Daniel A.; Inati, Souheil J.; Brenowitz, Noah; Bandettini, Peter A.

    2012-01-01

    The brain is the body's largest energy consumer, even in the absence of demanding tasks. Electrophysiologists report on-going neuronal firing during stimulation or task in regions beyond those of primary relationship to the perturbation. Although the biological origin of consciousness remains elusive, it is argued that it emerges from complex, continuous whole-brain neuronal collaboration. Despite converging evidence suggesting the whole brain is continuously working and adapting to anticipate and actuate in response to the environment, over the last 20 years, task-based functional MRI (fMRI) has emphasized a localizationist view of brain function, with fMRI showing only a handful of activated regions in response to task/stimulation. Here, we challenge that view with evidence that under optimal noise conditions, fMRI activations extend well beyond areas of primary relationship to the task; and blood-oxygen-level-dependent signal changes correlated with task-timing appear in over 95% of the brain for a simple visual stimulation plus attention control task. Moreover, we show that response shape varies substantially across regions, and that whole-brain parcellations based on those differences produce distributed clusters that are anatomically and functionally meaningful, symmetrical across hemispheres, and reproducible across subjects. These findings highlight the exquisite detail lying in fMRI signals beyond what is normally examined, and emphasize both the pervasiveness of false negatives, and how the sparseness of fMRI maps is not a result of localized brain function, but a consequence of high noise and overly strict predictive response models. PMID:22431587

  6. PID design based on Maclaurin expansion and its model-free auto-tuning

    Institute of Scientific and Technical Information of China (English)

    杨启文; 阳外玲; 薛云灿; 余福祥; 杨远慧

    2011-01-01

    A model-free PID auto-tuning method based on a desired closed-loop model is presented for self-regulating (stable) plants; no mathematical model of the controlled plant is required. The PID tuning formulas are derived using the Maclaurin expansion, and the model-free auto-tuning is carried out from the open-loop step response. Simulation results show that the resulting PID controller effectively improves the performance of high-order plants, and the method remains robust even in the presence of noise.

  7. Can model-free reinforcement learning explain deontological moral judgments?

    Science.gov (United States)

    Ayars, Alisabeth

    2016-05-01

    Dual-systems frameworks propose that moral judgments are derived from both an immediate emotional response and controlled/rational cognition. Recently Cushman (2013) proposed a new dual-system theory based on model-free and model-based reinforcement learning. Model-free learning attaches values to actions based on their history of reward and punishment, and explains some deontological, non-utilitarian judgments. Model-based learning involves the construction of a causal model of the world and allows for far-sighted planning; this form of learning fits well with utilitarian considerations that seek to maximize certain kinds of outcomes. I present three concerns regarding the use of model-free reinforcement learning to explain deontological moral judgment. First, many actions that humans find aversive from model-free learning are not judged to be morally wrong. Moral judgment must require something in addition to model-free learning. Second, there is a dearth of evidence for central predictions of the reinforcement account, e.g. that people with different reinforcement histories will, all else being equal, make different moral judgments. Finally, accounting for the effect of intention within the framework requires certain assumptions that lack support. These challenges are reasonable foci for future empirical/theoretical work on the model-free/model-based framework.
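
    As a concrete picture of the "cached value" learning the article contrasts with model-based planning, here is a minimal sketch of a tabular model-free update; the two actions, the reward schedule and all parameter values are illustrative and are not taken from Cushman (2013) or the target article.

```python
import random

# Model-free ("habit") learning: action values are cached from reward history,
# with no internal model of action outcomes.
actions = ["push", "refrain"]
Q = {a: 0.0 for a in actions}    # cached action values
alpha = 0.1                       # learning rate (assumed)
epsilon = 0.1                     # exploration rate (assumed)

def reward(action):
    # Illustrative reward/punishment schedule (assumption, not from the paper):
    # the aversive action is punished, the alternative mildly rewarded.
    return -1.0 if action == "push" else 0.2

random.seed(1)
for _ in range(500):
    # epsilon-greedy choice based only on cached values
    a = random.choice(actions) if random.random() < epsilon else max(Q, key=Q.get)
    # model-free update: move the cached value toward the received reward
    Q[a] += alpha * (reward(a) - Q[a])

print(Q)   # the punished action ends up with a negative cached value
```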

  8. Model-free stabilization by extremum seeking

    CERN Document Server

    Scheinker, Alexander

    2017-01-01

    With this brief, the authors present algorithms for model-free stabilization of unstable dynamic systems. An extremum-seeking algorithm assigns the role of a cost function to the dynamic system's control Lyapunov function (clf), aiming at its minimization. Minimizing the clf drives it to zero and achieves asymptotic stabilization. This approach does not rely on, or require knowledge of, the system model. Instead, it employs periodic perturbation signals along with the clf. The same effect is achieved as with clf-based feedback laws that profit from modelling knowledge, but in a time-average sense. Rather than using integrals of the system's vector field, the authors employ Lie-bracket-based (i.e., derivative-based) averaging. The brief contains numerous examples and applications, including examples with unknown control directions and experiments with charged particle accelerators. It is intended for theoretical control engineers and mathematicians, and practitioners working in various industrial areas ...
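
    A rough numerical sketch of the kind of scheme described, applied to a scalar unstable plant, is given below; the specific perturbation form u = sqrt(alpha*omega)*cos(omega*t + k*V(x)) and all gain values are assumptions made for illustration rather than the brief's exact algorithms.

```python
import numpy as np

# Unstable scalar plant x' = x + u, with clf V(x) = x^2 used as the
# extremum-seeking cost.  The controller never uses the model "x + u";
# it only measures V(x(t)).
omega = 200.0      # dither frequency (assumed)
alpha = 0.5        # perturbation amplitude parameter (assumed)
k = 5.0            # feedback gain; averaged feedback is roughly -k*alpha*x (assumed)
dt = 1e-4
x = 1.0

for step in range(int(5.0 / dt)):
    t = step * dt
    V = x ** 2
    u = np.sqrt(alpha * omega) * np.cos(omega * t + k * V)
    x += dt * (x + u)      # Euler step of the plant (unknown to the controller)

# x should be driven to a small neighbourhood of zero (a residual dither remains)
print("final |x|:", abs(x))
```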

  9. Is there any correlation between model-based perfusion parameters and model-free parameters of time-signal intensity curve on dynamic contrast enhanced MRI in breast cancer patients?

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)

    2014-05-15

    To find out whether there is any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and to evaluate correlations between perfusion parameters and histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep significantly correlated with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). General information about tumour vascular physiology, interstitial space volume and pathologic prognostic factors could thus be acquired by analysing the time-signal intensity curve, without the complicated acquisition process required for the model-based parameters. (orig.)
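
    The reported associations are rank correlations; a small scipy sketch of the kind of computation involved is shown below, with randomly generated placeholder values standing in for the per-tumour Kep and TTP measurements.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

# Placeholder per-lesion values standing in for the measured parameters
kep = rng.gamma(2.0, 0.5, size=102)                  # model-based parameter
ttp = 10.0 / (1.0 + kep) + rng.normal(0, 0.5, 102)   # model-free parameter (time to peak)

rho, p = spearmanr(kep, ttp)
print(f"Spearman rho = {rho:.2f}, P = {p:.3g}")      # expect a negative correlation, as in the study
```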

  10. Model-Free Adaptive Control of Nonlinear Systems Based on the Müller Approach

    Institute of Scientific and Technical Information of China (English)

    胡致强

    2001-01-01

    Presents a model-free adaptive control algorithm for a class of nonlinear discrete-time systems, in which the system is dynamically approximated by a quadratic interpolation polynomial using Müller's method; discusses the convergence of the algorithm; and concludes from simulation results that the algorithm achieves correct and effective model-free adaptive control for this class of nonlinear systems.

  11. Simulation Analysis of an Improved Model-Free Adaptive Control Algorithm

    Institute of Scientific and Technical Information of China (English)

    陈琛; 何小阳

    2013-01-01

    Building on the model-free adaptive control algorithm based on tight (compact) format linearization (TFL-MFAC), an improved MFAC algorithm for large time-delay plants (LTDS-MFAC) is proposed to address the characteristics of large time delays. A large time-delay plant is constructed, and the robustness, disturbance rejection and tracking ability of the improved MFAC algorithm are analysed through MATLAB simulation experiments. The simulations show that the improved MFAC algorithm achieves better control performance for large time-delay systems.

  12. Direct Thrust Control of a Permanent Magnet Linear Synchronous Motor Based on Model-Free Adaptive Control

    Institute of Scientific and Technical Information of China (English)

    崔皆凡; 闫红; 单宝钰

    2013-01-01

    To deal with the uncertain, time-varying model parameters of permanent magnet linear synchronous motors and their susceptibility to external disturbances during operation, model-free adaptive control is applied for the first time to the direct thrust control of a permanent magnet linear synchronous motor. With this method, the time-varying difference equation between electromagnetic thrust and speed is fitted in real time during operation, the commanded electromagnetic thrust is calculated through the control-law function, and model-free adaptive direct thrust control of the motor is thereby achieved. Compared with conventional PI control, model-free adaptive direct thrust control not only eliminates the need to tune PI controller parameters but also markedly reduces the speed and thrust ripples caused by parameter variations and load disturbances in the motor control system, ensuring stable operation while improving robustness to internal and external disturbances.

  13. Design and Simulation Analysis of a Model-Free Adaptive Controller for the Screw-Down System of a Cold Strip Mill

    Institute of Scientific and Technical Information of China (English)

    王宏文; 高维国; 王艺伶; 黄巍巍

    2012-01-01

    The screw-down system of a cold strip mill is high-order and nonlinear, and the rolling process involves uncertainties. To improve the position accuracy of the screw-down system, the traditional fuzzy PID controller was algorithmically improved: based on field equipment parameters, mathematical models of the screw-down system components were established and the controller was designed using a model-free adaptive control algorithm. Simulation results show that the model-free adaptive controller has fast response, high control accuracy and a degree of noise suppression; its performance is better than that of the previous controller, and it is worth applying and popularizing.

  14. Model-free kinetic analysis of Sr2FeMoO6 re-crystallization process used for double-perovskite monocrystals grown by Bridgman method

    Science.gov (United States)

    Bartha, Cristina; Plapcianu, Carmen; Palade, Petru; Vizman, Daniel

    2015-12-01

    The synthesis routes for polycrystalline bulk Sr2FeMoO6 (SFMO) offer various possibilities, but in all cases it is difficult to obtain a single phase of this compound. A new challenge in the field is to grow mono-crystals by different routes, and the Bridgman method is one of them. In order to establish the optimal conditions for the mono-crystal growing process, a complex thermal investigation of the bulk double perovskite has been performed. Differential thermal analysis in an inert argon atmosphere, from room temperature up to 1650°C, provided information about the melting and re-crystallization temperature ranges. Both the activation energy of the Sr2FeMoO6 re-crystallization process and the re-crystallization mechanism were comparatively analyzed by two model-free estimations (Friedman and Ozawa-Flynn-Wall analyses). The resulting data are very important for setting up the heating program of the Bridgman furnace.
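
    For readers unfamiliar with isoconversional ("model-free") kinetics, the sketch below shows the Ozawa-Flynn-Wall estimate on synthetic data: at a fixed conversion, ln(beta) is regressed against 1/T and the apparent activation energy follows from the slope via the Doyle approximation ln(beta) ≈ const - 1.052·Ea/(R·T). The heating rates and temperatures are invented for illustration and are unrelated to the SFMO data.

```python
import numpy as np

R = 8.314  # J/(mol K)

# Heating rates (K/min) and synthetic temperatures (K) at which a fixed
# conversion (say alpha = 0.5) is reached in each thermal-analysis run.
beta = np.array([5.0, 10.0, 20.0])
Ea_true = 195e3
# Invert ln(beta) = C - 1.052*Ea/(R*T) with an arbitrary constant C = 30
T_alpha = 1.052 * Ea_true / (R * (30.0 - np.log(beta)))

# Ozawa-Flynn-Wall: the slope of ln(beta) vs 1/T equals -1.052*Ea/R
slope, _ = np.polyfit(1.0 / T_alpha, np.log(beta), 1)
Ea_ofw = -slope * R / 1.052
print(f"Ea (OFW) ~ {Ea_ofw / 1e3:.1f} kJ/mol")   # recovers ~195 kJ/mol
```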

  15. Model-Free Adaptive Control Algorithm with Data Dropout Compensation

    OpenAIRE

    Xuhui Bu; Fashan Yu; Zhongsheng Hou; Hongwei Zhang

    2012-01-01

    The convergence of model-free adaptive control (MFAC) algorithm can be guaranteed when the system is subject to measurement data dropout. The system output convergent speed gets slower as dropout rate increases. This paper proposes a MFAC algorithm with data compensation. The missing data is first estimated using the dynamical linearization method, and then the estimated value is introduced to update control input. The convergence analysis of the proposed MFAC algorithm is given, and the effe...

  16. Model-Free Adaptive Control of MIMO Systems Based on Partial-Format Linearization

    Institute of Scientific and Technical Information of China (English)

    张广辉; 苏成利; 李平

    2011-01-01

    To overcome the shortcomings of conventional PID controllers in balancing disturbance rejection with robustness, and the over-reliance of model-based control algorithms on a mathematical model of the controlled system, a model-free adaptive control algorithm is proposed for discrete-time MIMO (multiple-input multiple-output) nonlinear systems. In this algorithm the nonlinear system is linearized using partial-format linearization, the controlled-system parameters are identified online by a novel projection algorithm, and the identified parameters are used to recursively compute the model-free adaptive control law. The controller is designed using only the I/O data of the controlled system, with no structural information or external test signals required, so the control performance is not affected by unmodelled dynamics arising from a modelling process. Simulation results show that the proposed algorithm is an effective strategy with excellent tracking ability and strong robustness.

  17. Model-Free Adaptive Heating Process Control

    OpenAIRE

    Ivana LUKÁČOVÁ; Piteľ, Ján

    2009-01-01

    The aim of this paper is to analyze the dynamic behaviour of Model-Free Adaptive (MFA) heating process control. The MFA controller is designed as a three-layer neural network with a proportional element. The method of backward propagation of errors was used for neural network training. Visualization and training of the artificial neural network were carried out with Netlab in the Matlab environment. Simulation of the MFA heating process control with outdoor temperature compensation has proved better resu...

  18. Load Control of a Wind Turbine Based on a Model-Free Adaptive Controller

    Institute of Scientific and Technical Information of China (English)

    鲁效平; 李伟; 林勇刚

    2011-01-01

    A linearized model of the individual pitch system of a wind turbine was established. To simplify controller design, the blade-root bending moments were transformed via the Park transformation into components on two fixed orthogonal axes, and a model-free adaptive (MFA) controller was designed for each axis to control the corresponding load component. The controllers were verified on FAST, the nonlinear wind turbine model developed by NREL. The results show that individual pitch control based on MFA can effectively eliminate the unbalanced loads on the turbine: compared with collective pitch control, the peak rotor tilt (fore-aft) moment is reduced by 62.5% and the peak yaw moment by 60.1%, while the adoption of individual pitch control has little effect on the power output.

  19. Dynamic analysis of a parallel mechanism and its model-free intelligent control

    Institute of Scientific and Technical Information of China (English)

    高国琴; 宋庆; 夏文娟

    2011-01-01

    To address the high nonlinearity, strong coupling and complex mathematical model of parallel robots, an intelligent control method is applied to a 2-DOF parallel robot, the AC servo-motor driven GPM-200 parallel mechanism. Under model-uncertain conditions and without solving the forward kinematics of the manipulator, a two-input fuzzy controller is designed that tunes the PID parameters in real time, and its performance is compared with that of a linear PID controller. Dynamic simulations in MATLAB show that the fuzzy PID controller achieves better trajectory tracking and greater stability under load than the linear PID controller, and can meet the parallel robot's requirements for high-precision, real-time control.

  20. Model-Free Adaptive Control Algorithm with Data Dropout Compensation

    Directory of Open Access Journals (Sweden)

    Xuhui Bu

    2012-01-01

    Full Text Available The convergence of the model-free adaptive control (MFAC) algorithm can be guaranteed when the system is subject to measurement data dropout. The system output convergence speed gets slower as the dropout rate increases. This paper proposes an MFAC algorithm with data compensation. The missing data are first estimated using the dynamical linearization method, and the estimated values are then used to update the control input. The convergence analysis of the proposed MFAC algorithm is given, and its effectiveness is validated by simulations. It is shown that the proposed algorithm can compensate for the effect of data dropout and that better output performance can be obtained.
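
    A simplified sketch in the spirit of the proposed scheme is given below: a compact-form MFAC loop in which, whenever a measurement is dropped, the missing output is replaced by a one-step estimate from the dynamic linearization. The plant, dropout rate and gain values are assumptions for illustration and do not reproduce the paper's exact algorithm or convergence conditions.

```python
import numpy as np

rng = np.random.default_rng(2)

eta, mu, rho, lam = 0.5, 1.0, 0.6, 1.0   # MFAC step sizes / weights (assumed values)
phi = 0.5                                 # pseudo-partial-derivative estimate
y_ref = 1.0                               # setpoint

y = 0.0            # true plant output
y_hat_prev = 0.0   # last output value available to the controller
u_prev, u = 0.0, 0.0

for k in range(300):
    # Unknown (to the controller) plant, used here only to generate data
    y = 0.6 * y + 0.3 * u + 0.05 * np.sin(y)

    if rng.random() < 0.2:                # 20% of measurements are dropped (assumed)
        # Compensation: estimate the missing output from the dynamic linearization
        y_hat = y_hat_prev + phi * (u - u_prev)
    else:
        y_hat = y

    # Compact-form MFAC: update the pseudo partial derivative, then the input
    du = u - u_prev
    if abs(du) > 1e-8:
        phi += eta * du / (mu + du ** 2) * (y_hat - y_hat_prev - phi * du)
    if phi < 1e-2:                        # reset to keep the estimate well-posed
        phi = 0.5
    u_prev, u = u, u + rho * phi / (lam + phi ** 2) * (y_ref - y_hat)
    y_hat_prev = y_hat

print(f"final output ~ {y:.3f} (reference {y_ref})")
```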

  1. Model-Free Adaptive Control and Parameter Tuning Based on a Second-Order Universal Model

    Institute of Scientific and Technical Information of China (English)

    王晶; 纪超; 曹柳林; 靳其兵

    2012-01-01

    Exploiting the simultaneous-modelling-and-control character of model-free adaptive control (MFAC), an improved MFAC method based on a second-order universal model is derived, together with the iterative formulas for the pseudo partial derivative and the control law. Compared with MFAC based on a first-order universal model, the improved strategy makes the universal model more accurate at each iteration and thereby further improves control precision. For the parameter tuning of the improved MFAC, an optimization-based tuning method is proposed: using the approximately identified model, the controller parameters are optimized for different objective functions, which widens the method's scope of application. Extensive simulation comparisons show that the improved MFAC controller tuned with the Jeu-tr type performance index gives the best dynamic response and requires fewer optimization iterations, so the control performance is significantly improved.

  2. Research on Automatic Train Operation Based on Model-Free Adaptive Control

    Institute of Scientific and Technical Information of China (English)

    石卫师

    2016-01-01

    The precise tracking of ATO (Automatic Train Operation) target speed curves in urban rail transit is key to the safety, efficiency, passenger comfort and energy efficiency of trains. As the ATO system is a nonlinear, time-varying, state-delayed and complex system that is difficult to model and has high robustness requirements, this paper introduces the model-free adaptive control method into the design of the ATO target speed curve tracking controller. Compared with a PID controller, the MFAC (Model Free Adaptive Control) based tracking algorithm for ATO target speed curves demonstrates good tracking performance, small speed error, high stopping precision, high comfort and lower energy consumption.

  3. Lower Limb Prosthesis Movement Control Method Based on a Model-Free Dynamic Matrix

    Institute of Scientific and Technical Information of China (English)

    苟斌; 刘作军; 赵丽娜; 杨鹏

    2015-01-01

    Building on accurate pre-recognition of three typical terrains, level walking, stair ascent and stair descent, which require different control modes, a method is proposed that formulates a corresponding motion control strategy for the prosthetic knee joint according to the motion characteristics of each terrain. This highlights the movement characteristics of the three terrains while circumventing the problem that the knee-joint equations of motion cannot be obtained because of modelling difficulties. In addition, a model-free dynamic matrix approach is used for tracking control of the prosthetic knee trajectory, with compact-format linearization used to simplify the complex relationship between changes in hip and knee joint angles. Experimental results show that imitating the actual motion trajectory with this control strategy is effective: the prosthetic limb follows the healthy limb more naturally, the input and output information of the controlled system is fully exploited, and the motion control method shows good tracking performance and convergence, ensuring the control precision of the lower limb prosthesis.

  4. A Model-Free Adaptive Control Based on Virtual Reference Feedback Tuning (VRFT)

    Institute of Scientific and Technical Information of China (English)

    孙淑杰; 侯立刚; 苏成利; 徐利军

    2012-01-01

    A new model-free adaptive control algorithm is proposed to address the problem that the information in the estimated values of the pseudo partial derivative is not fully used. Following the idea of virtual reference feedback tuning (VRFT), the closed-loop system composed of the compact-format linearized model and the basic model-free adaptive controller is taken as the reference model, and the error between the reference-model output and the desired system output is used as the controller input. In this way the past estimates of the pseudo partial derivative are introduced into the new control law, improving the utilization of the pseudo-partial-derivative estimates. Simulation results show that the improved model-free adaptive control achieves better control performance.

  5. Model-Free Adaptive Thickness Control Based on Recursive Least Squares

    Institute of Scientific and Technical Information of China (English)

    王彬彬; 张飞; 王京

    2015-01-01

    A model-free adaptive control algorithm is introduced to address the low thickness-control precision of titanium-alloy plates caused by short rolling times. The algorithm does not depend on a mathematical model of the controlled object and controls it using only the input and output data of the controlled system. The universal model plays an important role in the model-free adaptive control algorithm, and its pseudo partial derivative φc(k) strongly influences the convergence speed and tracking capability of the algorithm. In this paper the pseudo partial derivative φc(k) of the universal model is identified with the recursive least squares method, which ensures fast tracking of time-varying parameters and a small parameter estimation error. Simulation results show that the model-free adaptive control algorithm performs noticeably better than the traditional PI control algorithm.
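
    A minimal sketch of the scalar recursive least squares identification referred to here, tracking a time-varying pseudo partial derivative φc(k) in the incremental model Δy(k) ≈ φc(k)·Δu(k-1), is shown below; the synthetic data, the noise level and the forgetting factor are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

lam = 0.97          # forgetting factor (assumed) for tracking a time-varying parameter
phi_hat = 0.0       # estimate of the pseudo partial derivative phi_c(k)
P = 100.0           # estimation covariance (scalar case)

for k in range(400):
    phi_true = 1.0 + 0.5 * np.sin(0.02 * k)       # slowly time-varying "true" gain
    du = rng.normal()                              # input increment (excitation)
    dy = phi_true * du + 0.05 * rng.normal()       # measured output increment + noise

    # Scalar RLS update: gain, estimate, covariance
    K = P * du / (lam + du * P * du)
    phi_hat += K * (dy - phi_hat * du)
    P = (P - K * du * P) / lam

print(f"final estimate {phi_hat:.3f}, true {1.0 + 0.5 * np.sin(0.02 * 399):.3f}")
```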

  6. Model-Free CUSUM Methods for Person Fit

    Science.gov (United States)

    Armstrong, Ronald D.; Shi, Min

    2009-01-01

    This article demonstrates the use of a new class of model-free cumulative sum (CUSUM) statistics to detect person fit given the responses to a linear test. The fundamental statistic being accumulated is the likelihood ratio of two probabilities. The detection performance of this CUSUM scheme is compared to other model-free person-fit statistics…
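
    A toy sketch of accumulating a log-likelihood ratio in a one-sided CUSUM for person-fit monitoring is given below; the per-item response probabilities under "fitting" and "aberrant" behaviour are placeholders rather than the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(3)

n_items = 60
# Placeholder probabilities of a correct response under "fitting" (p0)
# and "aberrant" (p1) behaviour for each item on the linear test.
p0 = rng.uniform(0.55, 0.85, n_items)
p1 = np.clip(p0 - 0.30, 0.05, 0.95)

# Simulated examinee who switches to aberrant responding halfway through the test
responses = np.concatenate([rng.random(30) < p0[:30], rng.random(30) < p1[30:]])

cusum, path = 0.0, []
for x, q0, q1 in zip(responses.astype(int), p0, p1):
    # Log-likelihood ratio of the observed response under the two models
    llr = np.log((q1 if x else 1 - q1) / (q0 if x else 1 - q0))
    cusum = max(0.0, cusum + llr)   # one-sided CUSUM accumulation
    path.append(cusum)

print("CUSUM at item 30:", round(path[29], 2), " at item 60:", round(path[-1], 2))
```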

  7. Model-Free Adaptive Sensing and Control for a Piezoelectrically Actuated System

    OpenAIRE

    Jin-Wei Liang; Hung-Yi Chen

    2010-01-01

    Since the piezoelectrically actuated system has nonlinear and time-varying behavior, it is difficult to establish an accurate dynamic model for a model-based sensing and control design. Here, a model-free adaptive sliding controller is proposed to improve the small travel and hysteresis defects of piezoelectrically actuated systems. This sensing and control strategy employs the functional approximation technique (FAT) to establish the unknown function for eliminating the model-based requireme...

  8. Simulation Study of a Model-Free Adaptive Controller Based on Grey Prediction for Tension Control in Aluminum Hot Tandem Rolling

    Institute of Scientific and Technical Information of China (English)

    赵新秋; 杨鹤; 杨景明; 吕金

    2014-01-01

    To address tension fluctuation between the stands of a two-stand aluminum strip hot tandem rolling mill at an aluminum processing plant in Chongqing, the factors causing tension variation were analysed. To increase the control accuracy of the tension system, the traditional PID controller was replaced by a controller designed with a model-free adaptive control algorithm, into which a grey prediction model was introduced; the prediction compensates for the system time delay and parameter time variation and forms a new control law acting on the controlled object. Simulation results show that, compared with traditional PID control, the grey-prediction-based model-free adaptive control has clear advantages in tracking, adaptability and the ability to handle large time delays.
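
    The grey prediction component mentioned here is typically a GM(1,1) model; a small sketch of a one-step-ahead GM(1,1) forecast, of the kind that could stand in for a delayed measurement, is shown below with a synthetic tension-like series. This illustrates grey prediction in general and is not the paper's exact controller.

```python
import numpy as np

def gm11_predict(x0, steps=1):
    """One-step-ahead GM(1,1) grey prediction of a short positive series."""
    x1 = np.cumsum(x0)                                  # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # grey development / control coefficients
    n = len(x0)
    k = np.arange(n + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a   # solution of the whitened equation
    x0_hat = np.diff(x1_hat, prepend=x1_hat[0])         # inverse accumulation
    return x0_hat[n:]                                   # the predicted future values

tension = np.array([10.2, 10.8, 11.1, 11.7, 12.0, 12.6])  # synthetic tension-like series
print("one-step grey prediction:", gm11_predict(tension, steps=1))
```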

  9. Simulation Study on Model-Free Adaptive Control Based on Grey Prediction in Ball Mill Load Control

    Institute of Scientific and Technical Information of China (English)

    程启明; 程尹曼; 汪明媚; 王映斐

    2011-01-01

    The load of a ball mill in a thermal power plant is a complex object with large time delay, slow time-varying behaviour and strong nonlinearity, and it is difficult to obtain satisfactory control performance with conventional control methods. In this paper, a model-free adaptive control method based on grey prediction is introduced for load control. The method combines the adaptive, disturbance-rejecting features of model-free control with the time-delay prediction, overshoot suppression and rapid settling of the grey prediction model: the measured output of the load object is replaced by the grey-model prediction, and model-free adaptive control is then applied in closed loop. Simulation results show that the proposed control method has fast response, small overshoot, good robustness and strong disturbance rejection, and can effectively handle problems of large time delay, nonlinearity and adaptability.

  10. Research on a DC-DC Converter for Medical Electrical Devices Based on an Improved Model-Free Adaptive Control Algorithm

    Institute of Scientific and Technical Information of China (English)

    丁海波; 赵佳洋; 王晓彤

    2014-01-01

    Addressing the limitations of model-based DC-DC converter control methods with respect to model mismatch, a model-free adaptive control (MFAC) method is proposed. MFAC is a controller design approach that requires neither a mathematical model of the controlled system nor related prior knowledge; the controller is designed using only the system's I/O data. A compact-format dynamically linearized universal model is established, and a model-free adaptive controller is designed for the DC-DC converter, which resolves the converter model-mismatch problem. A Simulink simulation model of the DC-DC converter is built by analysing its characteristics, and the simulation results show that, compared with the traditional model-based control method, the proposed MFAC overcomes model mismatch and has better adaptability and robustness.

  11. Study on the thermal decomposition of bornadiene/CO polyketone based on the model-free method

    Institute of Scientific and Technical Information of China (English)

    杨晓琴; 郑云武; 黄元波; 沈华杰; 赵志; 郑志锋

    2016-01-01

    Bornadiene/CO polyketone was synthesized from bornadiene, derived from the renewable and abundant α-pinene, and carbon monoxide (CO) using a palladium acetate/2,2′-bipyridine/copper trifluoromethanesulfonate/nitrobenzene/toluene/methanol catalyst system. Its structure was characterized by Fourier transform infrared spectroscopy (FTIR), gel permeation chromatography (GPC) and elemental analysis, and its thermal decomposition behaviour was studied by thermogravimetric analysis (TGA) at different heating rates. The decomposition kinetics were analysed with the Kissinger, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS) methods. The results show that bornadiene/CO polyketone is a polymer of relatively high molecular weight with a narrow molecular weight distribution that dissolves readily in weakly polar solvents. Its thermal decomposition proceeds in three stages: pre-pyrolysis, rapid pyrolysis and slow pyrolysis of the residue. The activation energies (Ea) calculated with the Kissinger, FWO and KAS methods are 189.05, 197.74 and 198.29 kJ/mol, respectively, indicating that the decomposition is not a single-step reaction. The kinetic parameters were also calculated: K = 1.32×10⁻³ s⁻¹, entropy change ΔS = -27.26 J/(mol·K), enthalpy change ΔH = 183.43 kJ/mol and Gibbs free energy ΔG = 201.87 kJ/mol, indicating that the thermal decomposition process is not spontaneous.

  12. Model-free adaptive control of advanced power plants

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  13. Projectile Base Flow Analysis

    Science.gov (United States)

    2007-11-02

    Performing organization: DCW Industries, Inc., 5354 Palm Drive, La Cañada, CA 91011. Report number: DCW-38-R-05. Sponsoring/monitoring agency: U.S. Army Research Office. Related references: Wilcox, D. C., Turbulence Modeling for CFD, Second Edition, DCW Industries, Inc., La Cañada, CA; Wilcox, D. C. (2001), “Projectile Base Flow Analysis,” DCW

  14. Optimal model-free prediction from multivariate time series.

    Science.gov (United States)

    Runge, Jakob; Donner, Reik V; Kurths, Jürgen

    2015-05-01

    Forecasting a time series from multivariate predictors constitutes a challenging problem, especially using model-free approaches. Most techniques, such as nearest-neighbor prediction, quickly suffer from the curse of dimensionality and overfitting for more than a few predictors which has limited their application mostly to the univariate case. Therefore, selection strategies are needed that harness the available information as efficiently as possible. Since often the right combination of predictors matters, ideally all subsets of possible predictors should be tested for their predictive power, but the exponentially growing number of combinations makes such an approach computationally prohibitive. Here a prediction scheme that overcomes this strong limitation is introduced utilizing a causal preselection step which drastically reduces the number of possible predictors to the most predictive set of causal drivers making a globally optimal search scheme tractable. The information-theoretic optimality is derived and practical selection criteria are discussed. As demonstrated for multivariate nonlinear stochastic delay processes, the optimal scheme can even be less computationally expensive than commonly used suboptimal schemes like forward selection. The method suggests a general framework to apply the optimal model-free approach to select variables and subsequently fit a model to further improve a prediction or learn statistical dependencies. The performance of this framework is illustrated on a climatological index of El Niño Southern Oscillation.
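
    As a loose illustration of the idea of preselecting informative predictors before a model-free forecast, the sketch below ranks candidate predictors by mutual information and then applies nearest-neighbour regression to the selected subset; note that the paper's preselection is causal and information-theoretically optimal, which this simplified stand-in does not implement, and the synthetic data and parameters are assumptions.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Synthetic multivariate data: the target depends nonlinearly on two of ten candidate predictors
n = 2000
X = rng.standard_normal((n, 10))
y = np.sin(X[:, 2]) + 0.5 * X[:, 7] ** 2 + 0.1 * rng.standard_normal(n)

# Preselection: keep the few predictors with the highest mutual information with the target
mi = mutual_info_regression(X[:1500], y[:1500], random_state=0)
keep = np.argsort(mi)[-2:]

# Model-free (nearest-neighbour) prediction restricted to the preselected predictors
knn = KNeighborsRegressor(n_neighbors=10).fit(X[:1500][:, keep], y[:1500])
pred = knn.predict(X[1500:][:, keep])
rmse = float(np.sqrt(np.mean((pred - y[1500:]) ** 2)))
print("selected predictors:", keep, " test RMSE:", round(rmse, 3))
```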

  15. A Model-Free Method for Structural Change Detection in Multivariate Nonlinear Time Series

    Institute of Scientific and Technical Information of China (English)

    孙青华; 张世英; 梁雄健

    2003-01-01

    In this paper, we apply the recursive genetic programming (RGP) approach to the cognition of a system, and then proceed to a detection procedure for structural changes in a system whose components are of long memory. This approach is adaptive and model-free and can simulate the individual activities of the system's participants; it therefore has a strong ability to recognize the operating mechanism of the system. Based on the previous cognition of the system, a test statistic is developed for the detection of structural changes in the system. Furthermore, an example is presented to illustrate the validity and practical value of the proposed method.

  16. Model-Free Trajectory Optimisation for Unmanned Aircraft Serving as Data Ferries for Widespread Sensors

    Directory of Open Access Journals (Sweden)

    Ben Pearre

    2012-10-01

    Full Text Available Given multiple widespread stationary data sources such as ground-based sensors, an unmanned aircraft can fly over the sensors and gather the data via a wireless link. Performance criteria for such a network may incorporate costs such as trajectory length for the aircraft or the energy required by the sensors for radio transmission. Planning is hampered by the complex vehicle and communication dynamics and by uncertainty in the locations of sensors, so we develop a technique based on model-free learning. We present a stochastic optimisation method that allows the data-ferrying aircraft to optimise data collection trajectories through an unknown environment in situ, obviating the need for system identification. We compare two trajectory representations, one that learns near-optimal trajectories at low data requirements but that fails at high requirements, and one that gives up some performance in exchange for a data collection guarantee. With either encoding the ferry is able to learn significantly improved trajectories compared with alternative heuristics. To demonstrate the versatility of the model-free learning approach, we also learn a policy to minimise the radio transmission energy required by the sensor nodes, allowing prolonged network lifetime.

  17. Q-DPM: An Efficient Model-Free Dynamic Power Management Technique

    CERN Document Server

    Li, Min; Yao, Richard; Yan, Xiaolang

    2011-01-01

    When applying Dynamic Power Management (DPM) techniques to pervasively deployed embedded systems, the technique needs to be very efficient so that it is feasible to implement it on low-end processors with tight memory budgets. Furthermore, it should be able to track time-varying behavior rapidly, because time variation is an inherent characteristic of real-world systems. Existing methods, which are usually model-based, may not satisfy these requirements. In this paper, we propose a model-free DPM technique based on Q-learning. Q-DPM is much more efficient because it removes the overhead of the parameter estimator and mode-switch controller. Furthermore, its policy optimization is performed via consecutive online trials, which also leads to very rapid response to time-varying behavior.

  18. Model-free Estimation of Recent Genetic Relatedness

    Science.gov (United States)

    Conomos, Matthew P.; Reiner, Alexander P.; Weir, Bruce S.; Thornton, Timothy A.

    2016-01-01

    Genealogical inference from genetic data is essential for a variety of applications in human genetics. In genome-wide and sequencing association studies, for example, accurate inference on both recent genetic relatedness, such as family structure, and more distant genetic relatedness, such as population structure, is necessary for protection against spurious associations. Distinguishing familial relatedness from population structure with genotype data, however, is difficult because both manifest as genetic similarity through the sharing of alleles. Existing approaches for inference on recent genetic relatedness have limitations in the presence of population structure, where they either (1) make strong and simplifying assumptions about population structure, which are often untenable, or (2) require correct specification of and appropriate reference population panels for the ancestries in the sample, which might be unknown or not well defined. Here, we propose PC-Relate, a model-free approach for estimating commonly used measures of recent genetic relatedness, such as kinship coefficients and IBD sharing probabilities, in the presence of unspecified structure. PC-Relate uses principal components calculated from genome-screen data to partition genetic correlations among sampled individuals due to the sharing of recent ancestors and more distant common ancestry into two separate components, without requiring specification of the ancestral populations or reference population panels. In simulation studies with population structure, including admixture, we demonstrate that PC-Relate provides accurate estimates of genetic relatedness and improved relationship classification over widely used approaches. We further demonstrate the utility of PC-Relate in applications to three ancestrally diverse samples that vary in both size and genealogical complexity. PMID:26748516

  19. Optimisation of NMR dynamic models I. Minimisation algorithms and their performance within the model-free and Brownian rotational diffusion spaces.

    Science.gov (United States)

    d'Auvergne, Edward J; Gooley, Paul R

    2008-02-01

    The key to obtaining the model-free description of the dynamics of a macromolecule is the optimisation of the model-free and Brownian rotational diffusion parameters using the collected R1, R2 and steady-state NOE relaxation data. The problem of optimising the chi-squared value is often assumed to be trivial; however, the long chain of dependencies required for its calculation complicates the model-free chi-squared space. Convolutions are induced by the Lorentzian form of the spectral density functions, the linear recombinations of certain spectral density values to obtain the relaxation rates, the calculation of the NOE using the ratio of two of these rates, and finally the quadratic form of the chi-squared equation itself. Two major topological features of the model-free space complicate optimisation. The first is a long, shallow valley which commences at infinite correlation times and gradually approaches the minimum. The most severe convolution occurs for motions on two timescales, in which the minimum is often located at the end of a long, deep, curved tunnel or multidimensional valley through the space. A large number of optimisation algorithms will be investigated and their performance compared to determine which techniques are suitable for use in model-free analysis. Local optimisation algorithms will be shown to be sufficient for minimisation not only within the model-free space but also for the minimisation of the Brownian rotational diffusion tensor. In addition, the performance of the programs Modelfree and Dasha is investigated. A number of model-free optimisation failures were identified: the inability to slide along the limits, the singular matrix failure of the Levenberg-Marquardt minimisation algorithm, the low precision of both programs, and a bug in Modelfree. Significantly, the singular matrix failure of the Levenberg-Marquardt algorithm occurs when internal correlation times are undefined and is greatly amplified in model-free analysis by
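
    As a stripped-down illustration of the chi-squared minimisation at the heart of model-free analysis, the sketch below fits the two-parameter Lipari-Szabo spectral density J(ω) = (2/5)[S²τm/(1+(ωτm)²) + (1−S²)τ/(1+(ωτ)²)], with 1/τ = 1/τm + 1/τe, to synthetic values using a local optimiser; real analyses fit R1, R2 and NOE rather than J(ω) directly, and the frequencies, noise level and starting values here are invented.

```python
import numpy as np
from scipy.optimize import minimize

def J(omega, S2, te, tm=8e-9):
    """Lipari-Szabo model-free spectral density (simple two-parameter form)."""
    t = te * tm / (te + tm)                      # effective internal correlation time
    return 0.4 * (S2 * tm / (1 + (omega * tm) ** 2)
                  + (1 - S2) * t / (1 + (omega * t) ** 2))

# Illustrative spectral frequencies (rad/s) and noisy synthetic "measurements"
omegas = 2 * np.pi * np.array([0.0, 5e7, 5e8, 5.5e8, 6e8])
rng = np.random.default_rng(1)
data = J(omegas, 0.85, 50e-12) * (1 + 0.02 * rng.standard_normal(omegas.size))
errors = 0.02 * data + 1e-12

def chi2(params):
    S2, te_ps = params                           # te fitted in picoseconds for scaling
    return np.sum(((data - J(omegas, S2, te_ps * 1e-12)) / errors) ** 2)

fit = minimize(chi2, x0=[0.5, 200.0], bounds=[(0.0, 1.0), (0.1, 1e4)], method="L-BFGS-B")
print("S2 =", round(fit.x[0], 3), " te (ps) =", round(fit.x[1], 1))
```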

  20. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    -and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...

  1. Wide-Area Power System Stabilizer Design Based on Improved Model-Free Adaptive Control

    Institute of Scientific and Technical Information of China (English)

    赵艺; 陆超; 韩英铎; 门琨; 涂亮

    2013-01-01

    To address the problem that a fixed-parameter wide-area power system stabilizer (WAPSS) cannot adapt to changes in grid structure and operating conditions, a WAPSS design method based on model-free adaptive control (MFAC) is proposed to realize online adaptive adjustment of the controller. Existing MFAC strategies are analysed and the MFAC algorithm is improved to suit a power-system WAPSS while guaranteeing the stability of the controlled system; combined with the fixed-parameter WAPSS design, a parameter-setting procedure for the improved MFAC-WAPSS is given. The control performance of the MFAC-WAPSS is tested on the four-machine two-area system and an 8-machine 36-bus system. Simulation results show that the MFAC-WAPSS adapts to system changes and effectively damps post-fault inter-area low-frequency oscillations.

  2. Thermal characterization and model free kinetics of aged epoxies and foams using TGA and DSC methods.

    Energy Technology Data Exchange (ETDEWEB)

    Cordaro, Joseph Gabriel; Kruizenga, Alan Michael; Nissen, April

    2013-10-01

    Two classes of materials, poly(methylene diphenyl diisocyanate) or PMDI foam, and cross-linked epoxy resins, were characterized using thermal gravimetric analysis (TGA) and differential scanning calorimetry (DSC), to help understand the effects of aging and "bake-out". The materials were evaluated for mass loss and the onset of decomposition. In some experiments, volatile materials released during heating were analyzed via mass spectroscopy. In all, over twenty materials were evaluated to compare the mass loss and onset temperature for decomposition. Model free kinetic (MFK) measurements, acquired using variable heating rate TGA experiments, were used to calculate the apparent activation energy of thermal decomposition. From these compiled data the effects of aging, bake-out, and sample history on the thermal stability of materials were compared. No significant differences between aged and unaged materials were detected. Bake-out did slightly affect the onset temperature of decomposition but only at the highest bake-out temperatures. Finally, some recommendations for future handling are made.

  3. Model-Free Stochastic Localization of CBRN Releases

    Science.gov (United States)

    2013-01-01

    structures to create complex phenomena (micro-climate effects, urban canyons, etc.). All this complexity can not be captured analytically and, coupled with...the fluid flow solution, QUIC simulates the travel of CBRN particulates via a Lagrangian random walk. Previously, the QUIC codes have been tested and...Lagrangian random walk based dispersion models [21], such as the dispersion modeling used by QUIC. One should expect to see our localization

  4. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  5. Model-free Adaptive Control for Spacecraft Attitude

    Institute of Scientific and Technical Information of China (English)

    Ran Xie; Ting Song; Peng Shi; Yushan Zhao

    2016-01-01

    A model-free adaptive control method is proposed for spacecraft whose dynamical parameters change over time and cannot be acquired accurately. The algorithm is based on full-form dynamic linearization. A dimension reduction matrix is introduced to construct an augmented system with input and output of the same dimension. The design of the controller depends on the system input and output data rather than on knowledge of the controlled plant. The numerical simulation results show that the improved controller can deal with different models with the same set of controller parameters, and that its performance is better than that of a PD controller for the time-varying system with disturbance.

  6. A Model-Free Scheme for Meme Ranking in Social Media.

    Science.gov (United States)

    He, Saike; Zheng, Xiaolong; Zeng, Daniel

    2016-01-01

    The prevalence of social media has greatly catalyzed the dissemination and proliferation of online memes (e.g., ideas, topics, melodies, tags, etc.). However, this information abundance is exceeding the capability of online users to consume it. Ranking memes based on their popularities could promote online advertisement and content distribution. Despite such importance, few existing works solve this problem well: they are either hampered by impractical assumptions or unable to characterize dynamic information. As such, in this paper, we elaborate a model-free scheme to rank online memes in the context of social media. This scheme is able to characterize the nonlinear interactions of online users, which mark the process of meme diffusion. Empirical studies on two large-scale, real-world datasets (one in English and one in Chinese) demonstrate the effectiveness and robustness of the proposed scheme. In addition, due to its fine-grained modeling of user dynamics, this ranking scheme can also be utilized to explain meme popularity through the lens of social influence.

  7. Model-Free Adaptive Fuzzy Sliding Mode Controller Optimized by Particle Swarm for Robot Manipulator

    Directory of Open Access Journals (Sweden)

    Amin Jalali

    2013-05-01

    Full Text Available The main purpose of this paper is to design a suitable control scheme that confronts the uncertainties in a robot. The sliding mode controller (SMC) is one of the most important and powerful nonlinear robust controllers and has been applied to many nonlinear systems. However, this controller has some intrinsic drawbacks, namely the chattering phenomenon, the equivalent dynamic formulation, and sensitivity to noise. This paper focuses on applying artificial intelligence integrated with sliding mode control theory. The proposed adaptive fuzzy sliding mode controller optimized by a particle swarm algorithm (AFSMC-PSO) is a Mamdani error-based fuzzy logic controller with 7 rules, integrated with the sliding mode framework to provide adaptation, eliminate the high-frequency oscillation (chattering) and adjust the slope of the linear sliding surface in the presence of many different disturbances; the best coefficients for the sliding surface were found by offline tuning with Particle Swarm Optimization (PSO). Using another fuzzy logic controller to replace the equivalent-dynamics term is the main idea for making the controller model-free, compensating for the unknown system dynamics parameters and obtaining the desired control performance without exact information about the mathematical formulation of the model.

  8. To control false positives in gene-gene interaction analysis: two novel conditional entropy-based approaches.

    Directory of Open Access Journals (Sweden)

    Xiaoyu Zuo

    Full Text Available Genome-wide analysis of gene-gene interactions has been recognized as a powerful avenue to identify the missing genetic components that cannot be detected by current single-point association analysis. Recently, several model-free methods (e.g., the commonly used information-based metrics and several logistic regression-based metrics) were developed for detecting non-linear dependence between genetic loci, but they are potentially at risk of inflated false-positive error, in particular when the main effects at one or both loci are salient. In this study, we proposed two conditional entropy-based metrics to address this limitation. Extensive simulations demonstrated that the two proposed metrics, provided the disease is rare, could maintain a consistently correct false-positive rate. In scenarios for a common disease, our proposed metrics achieved better or comparable control of false-positive error, compared to four previously proposed model-free metrics. In terms of power, our methods outperformed several competing metrics over a range of common disease models. Furthermore, in real data analyses, both metrics succeeded in detecting interactions and were competitive with the originally reported results or the logistic regression approaches. In conclusion, the proposed conditional entropy-based metrics are promising alternatives to current model-based approaches for detecting genuine epistatic effects.
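
    The record above hinges on entropy-style dependence measures between genotypes and disease status. As a rough illustration only (not the authors' metrics), the sketch below computes a plug-in estimate of the conditional mutual information I(A;B|Y) between two loci given case/control status from a genotype contingency table; the toy counts and the function names are assumptions.

        import numpy as np

        def entropy(p):
            """Shannon entropy (in nats) of a probability array; zero cells are skipped."""
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        def conditional_mutual_information(counts):
            """Plug-in estimate of I(A; B | Y) from a contingency array counts[a, b, y]
            of genotype pairs (a, b) split by phenotype y (0 = control, 1 = case)."""
            p = counts / counts.sum()
            p_y = p.sum(axis=(0, 1))      # P(Y)
            p_ay = p.sum(axis=1)          # P(A, Y)
            p_by = p.sum(axis=0)          # P(B, Y)
            # I(A;B|Y) = H(A,Y) + H(B,Y) - H(A,B,Y) - H(Y)
            return entropy(p_ay) + entropy(p_by) - entropy(p) - entropy(p_y)

        # toy 3 x 3 x 2 genotype table (counts are invented for illustration)
        rng = np.random.default_rng(0)
        counts = rng.integers(5, 50, size=(3, 3, 2)).astype(float)
        print(conditional_mutual_information(counts))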

  9. Video-Based Motion Analysis

    Science.gov (United States)

    French, Paul; Peterson, Joel; Arrighi, Julie

    2005-04-01

    Video-based motion analysis has recently become very popular in introductory physics classes. This paper outlines general recommendations regarding equipment and software; videography issues such as scaling, shutter speed, lighting, background, and camera distance; as well as other methodological aspects. Also described are the measurement and modeling of the gravitational, drag, and Magnus forces on 1) a spherical projectile undergoing one-dimensional motion and 2) a spinning spherical projectile undergoing motion within a plane. Measurement and correction methods are devised for four common, major sources of error: parallax, lens distortion, discretization, and improper scaling.
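
    Since the abstract describes modeling gravitational, drag, and Magnus forces on a spinning projectile moving in a plane, a minimal forward-Euler sketch of that force model is given below; the mass, radius, and drag/lift coefficients are illustrative assumptions, not values from the paper.

        import numpy as np

        # Illustrative parameters for a spinning ball moving in a vertical plane;
        # mass, radius and the drag/lift coefficients are assumptions, not values
        # taken from the paper.
        m, r, rho = 0.145, 0.0366, 1.225             # kg, m, kg/m^3
        A = np.pi * r ** 2
        Cd, Cl = 0.35, 0.20                          # drag and Magnus (lift) coefficients
        g = np.array([0.0, -9.81])

        def acceleration(v, spin_sign=+1):
            """Gravity + quadratic drag + Magnus force for planar motion with
            spin perpendicular to the plane of motion."""
            speed = np.linalg.norm(v)
            if speed == 0.0:
                return g
            drag = -0.5 * rho * Cd * A * speed * v / m
            perp = spin_sign * np.array([-v[1], v[0]]) / speed   # unit vector perpendicular to v
            magnus = 0.5 * rho * Cl * A * speed ** 2 * perp / m
            return g + drag + magnus

        # simple forward-Euler trajectory from an initial throw
        dt = 1e-3
        pos, vel = np.array([0.0, 1.0]), np.array([20.0, 10.0])
        while pos[1] > 0.0:
            vel = vel + acceleration(vel) * dt
            pos = pos + vel * dt
        print("range ~ %.2f m" % pos[0])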

  10. Model-free adaptive control optimization using a chaotic particle swarm approach

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br; Rodrigues Coelho, Antonio Augusto [Department of Automation and Systems, Federal University of Santa Catarina, Box 476, 88040-900 Florianopolis, Santa Catarina (Brazil)], E-mail: aarc@das.ufsc.br

    2009-08-30

    It is well known that conventional control theories are well suited to applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying conventional controller design methodologies. Besides the difficulty in achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, this paper presents the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts, with an optimization procedure by a Particle Swarm Optimization (PSO) approach using a constriction coefficient and Henon chaotic sequences (CPSOH). PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking; it models the exploration of a problem space by a population of particles, each with a randomized velocity that moves it through the problem space. Since chaotic maps exhibit determinism, ergodicity, and stochastic-like behavior, the proposed CPSOH uses chaotic mapping to add flexibility to particle movements at each iteration. The chaotic sequences also favor exploration at early stages and exploitation at later stages of the CPSOH search procedure. The motivation for applying the CPSOH approach is to overcome the limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant has different gains over its operating range and is tuned by trial and error by the user. Numerical results of the MFLAC with
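
    As a hedged illustration of the chaotic ingredient described above, the sketch below drives a constriction-coefficient PSO with Henon-map sequences in place of uniform random numbers; the objective function and all tuning constants are placeholders, not the MFLAC design of the paper.

        import numpy as np

        def henon_sequence(n, a=1.4, b=0.3, x=0.1, y=0.1):
            """Generate n values of the Henon map, rescaled to [0, 1]."""
            xs = np.empty(n)
            for i in range(n):
                x, y = 1.0 - a * x * x + y, b * x
                xs[i] = x
            return (xs - xs.min()) / (xs.max() - xs.min())

        def sphere(p):                        # toy objective; the paper tunes MFLAC gains
            return np.sum(p ** 2)

        dim, n_particles, iters = 4, 20, 200
        chi, c1, c2 = 0.7298, 2.05, 2.05      # constriction-coefficient PSO
        rng = np.random.default_rng(1)
        pos = rng.uniform(-5, 5, (n_particles, dim))
        vel = np.zeros_like(pos)
        pbest, pbest_val = pos.copy(), np.array([sphere(p) for p in pos])
        gbest = pbest[np.argmin(pbest_val)]
        chaos = henon_sequence(iters * n_particles * 2).reshape(iters, n_particles, 2)

        for t in range(iters):
            for i in range(n_particles):
                r1, r2 = chaos[t, i]          # chaotic numbers replace uniform draws
                vel[i] = chi * (vel[i] + c1 * r1 * (pbest[i] - pos[i])
                                        + c2 * r2 * (gbest - pos[i]))
                pos[i] += vel[i]
                val = sphere(pos[i])
                if val < pbest_val[i]:
                    pbest[i], pbest_val[i] = pos[i].copy(), val
            gbest = pbest[np.argmin(pbest_val)]
        print(gbest, sphere(gbest))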

  11. Certification-Based Process Analysis

    Science.gov (United States)

    Knight, Russell L.

    2013-01-01

    Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. Process means any of the standard representations of a production flow; in this case, individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.

  12. A Novel Improved Model-Free Control Against Actuator Saturation

    Institute of Scientific and Technical Information of China (English)

    程志强; 朱纪洪; 袁夏明

    2016-01-01

    Model-free adaptive control (MFAC) is a data-driven control approach. Its advantages lie in low computational complexity, strong robustness, and the absence of any modeling step in its design process. However, actuator saturation is a problem that has not yet been considered in existing MFAC methods. In this paper, a novel improved MFAC method is proposed to deal with actuator constraints. By introducing a constraint condition into the criterion function of the control input, the Hildreth method is used to solve for the control output numerically, which simplifies programming and reduces the computational load. The stability of the closed-loop system is then proved through rigorous analysis. Finally, taking the Wood/Berry distillation column as the plant, a series of comparative simulations is conducted; the results show better performance with the proposed controller than with traditional MFAC methods when actuator saturation exists.
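
    The abstract leans on Hildreth's method to solve the input-constrained criterion function. A minimal sketch of Hildreth's dual coordinate-ascent procedure for a quadratic program of the form min 0.5 u'Eu + u'F subject to Mu <= gamma is given below; the toy saturation example is an assumption, not the paper's Wood/Berry setup.

        import numpy as np

        def hildreth_qp(E, F, M, gamma, iters=100):
            """Solve min 0.5*u'Eu + u'F  subject to  M u <= gamma
            with Hildreth's dual coordinate-ascent procedure."""
            E_inv = np.linalg.inv(E)
            H = M @ E_inv @ M.T
            K = gamma + M @ E_inv @ F
            lam = np.zeros(len(gamma))
            for _ in range(iters):
                for i in range(len(lam)):
                    w = K[i] + H[i] @ lam - H[i, i] * lam[i]
                    lam[i] = max(0.0, -w / H[i, i])
            return -E_inv @ (F + M.T @ lam)

        # toy example: one control move limited to |u| <= 0.5
        E = np.array([[2.0]]); F = np.array([-4.0])      # unconstrained optimum is u = 2
        M = np.array([[1.0], [-1.0]]); gamma = np.array([0.5, 0.5])
        print(hildreth_qp(E, F, M, gamma))               # -> approximately [0.5]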

  13. Model-free kinetics applied to volatilization of Brazilian sunflower oil and its respective biodiesel

    OpenAIRE

    2010-01-01

    Article published in the journal Thermochimica Acta and also available at: www.elsevier.com/locate/tca. Model-free kinetic studies of the volatilization of Brazilian sunflower oil and its respective biodiesel were carried out. The biodiesel was obtained by the methylic route using potassium hydroxide as catalyst. Both the sunflower oil and the biodiesel were characterized by physicochemical analyses, gas chromatography, simulated distillation and thermogravimetry. The physicochemical properties...

  14. Probabilistic Model-Based Safety Analysis

    CERN Document Server

    Güdemann, Matthias; 10.4204/EPTCS.28.8

    2010-01-01

    Model-based safety analysis approaches aim at finding critical failure combinations by analysis of models of the whole system (i.e., software, hardware, failure modes and environment). The advantage of these methods compared to traditional approaches is that the analysis of the whole system gives more precise results. Only a few model-based approaches have been applied to answer quantitative questions in safety analysis, often limited to the analysis of specific failure propagation models or limited types of failure modes, or without system dynamics and behavior, as direct quantitative analysis uses large amounts of computing resources. New achievements in the domain of (probabilistic) model-checking now allow this problem to be overcome. This paper shows how functional models based on synchronous parallel semantics, which can be used for system design, implementation and qualitative safety analysis, can be directly re-used for (model-based) quantitative safety analysis. Accurate modeling of different types of proba...

  15. Extensions in model-based system analysis

    OpenAIRE

    Graham, Matthew R.

    2007-01-01

    Model-based system analysis techniques provide a means for determining desired system performance prior to actual implementation. In addition to specifying desired performance, model-based analysis techniques require mathematical descriptions that characterize relevant behavior of the system. The developments of this dissertation give extended formulations for control-relevant model estimation as well as model-based analysis conditions for performance requirements specified as frequency do...

  16. Model-free adaptive control of supercritical circulating fluidized-bed boilers

    Science.gov (United States)

    Cheng, George Shu-Xing; Mulkey, Steven L

    2014-12-16

    A novel 3-Input-3-Output (3×3) Fuel-Air Ratio Model-Free Adaptive (MFA) controller is introduced, which can effectively control key process variables including Bed Temperature, Excess O2, and Furnace Negative Pressure of combustion processes of advanced boilers. A novel 7-Input-7-Output (7×7) MFA control system is also described for controlling a combined 3-Input-3-Output (3×3) process of Boiler-Turbine-Generator (BTG) units and a 5×5 CFB combustion process of advanced boilers. Those boilers include Circulating Fluidized-Bed (CFB) Boilers and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  17. A unified model-free controller for switching minimum phase, non-minimum phase and time-delay systems

    CERN Document Server

    Michel, Loïc

    2012-01-01

    This preliminary work presents a simple derivation of the standard model-free control in order to control switching minimum phase, non-minimum phase and time-delay systems. The robustness of the proposed method is studied in simulation.

  18. Discrete-time dynamic graphical games: model-free reinforcement learning solution

    Institute of Scientific and Technical Information of China (English)

    Mohammed I ABOUHEAF; Frank L LEWIS; Magdi S MAHMOUD; Dariusz G MIKULSKI

    2015-01-01

    This paper introduces a model-free reinforcement learning technique that is used to solve a class of dynamic games known as dynamic graphical games. The graphical game results from multi-agent dynamical systems, where pinning control is used to make all the agents synchronize to the state of a command generator or a leader agent. Novel coupled Bellman equations and Hamiltonian functions are developed for the dynamic graphical games. The Hamiltonian mechanics are used to derive the necessary conditions for optimality. The solution for the dynamic graphical game is given in terms of the solution to a set of coupled Hamilton-Jacobi-Bellman equations developed herein. The Nash equilibrium solution for the graphical game is given in terms of the solution to the underlying coupled Hamilton-Jacobi-Bellman equations. An online model-free policy iteration algorithm is developed to learn the Nash solution for the dynamic graphical game. This algorithm does not require any knowledge of the agents' dynamics. A proof of convergence for this multi-agent learning algorithm is given under mild assumptions about the inter-connectivity properties of the graph. A gradient descent technique with critic network structures is used to implement the policy iteration algorithm and solve the graphical game online in real time.

  19. Comparison of Uncalibrated Model-free Visual Servoing Methods for Small-amplitude Movements: A Simulation Study

    Directory of Open Access Journals (Sweden)

    Josip Musić

    2014-07-01

    Full Text Available The paper compares the performance of several methods used for the estimation of an image Jacobian matrix in uncalibrated model-free visual servoing. This was achieved for an eye-in-hand configuration with small-amplitude movements with several sets of system parameters. The tested methods included the Broyden algorithm, Kalman and particle filters as well as the recently proposed population-based algorithm. The algorithms were tested in a simulation environment (Peter Corke’s Robotic Toolbox for MATLAB on a PUMA 560 robot. Several application scenarios were considered, including static point and dynamic trajectory tracking, with several characteristic shapes and three different speeds. Based on the obtained results, conclusions were drawn about the strengths and weaknesses of each method both for a particular setup and in general. Algorithm-switching was introduced and explored, since it might be expected to improve overall robot tracking performance with respect to the desired trajectory. Finally, possible future research directions are suggested.
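
    Of the estimators compared above, the Broyden update of the image Jacobian is the simplest to state. The sketch below shows that rank-one update together with a damped resolved-rate servo step; variable names and gains are illustrative, and the simulation environment (robot model, feature extraction) is omitted.

        import numpy as np

        def broyden_update(J, dq, ds, alpha=1.0):
            """Rank-one Broyden update of the image Jacobian estimate J so that
            J maps joint increments dq to observed feature increments ds."""
            dq = dq.reshape(-1, 1)
            ds = ds.reshape(-1, 1)
            denom = float(dq.T @ dq)
            if denom < 1e-12:
                return J
            return J + alpha * (ds - J @ dq) @ dq.T / denom

        def servo_step(J, error, gain=0.5, damping=1e-3):
            """Resolved-rate step: joint increment driving the image error to zero,
            using a damped pseudo-inverse of the current Jacobian estimate."""
            JtJ = J.T @ J + damping * np.eye(J.shape[1])
            return -gain * np.linalg.solve(JtJ, J.T @ error)

        # one servo iteration (shapes: J is 2k x n for k image features, n joints):
        #   dq = servo_step(J, current_error)
        #   ... move the robot by dq, re-measure the features ...
        #   J = broyden_update(J, dq, new_features - old_features)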

  20. Model free isoconversional procedure for evaluating the effective activation energy values of thermally stimulated processes in dinitroimidazoles.

    Science.gov (United States)

    Ravi, P

    2014-05-28

    The decomposition kinetics of 1,4-dinitroimidazole, 2,4-dinitroimidazole, and N-methyl-2,4-dinitroimidazole have been investigated using the thermogravimetry-differential thermal analysis technique under an N2 atmosphere at a flow rate of 100 cm3/min. The Flynn-Wall-Ozawa method and the Friedman method were used to estimate the effective activation energy values. These model-free isoconversional kinetic methods showed variation in the calculated values due to the approximation of the temperature integral used in the derivations of the kinetic equations. The model compounds decompose by multi-step kinetics, as evident from the nonlinear relationship between the effective activation energy values and the conversion. Three different reaction pathways, namely NO2 elimination, NO elimination, and HONO elimination, are expected to play a crucial role in the decomposition of nitroimidazoles. The model dinitroimidazoles exhibit different decomposition kinetics, and the NO2-elimination and NO-elimination pathways compete with each other in the decomposition mechanism. The present study is helpful in understanding the decomposition kinetics and dynamics of substituted nitroimidazoles for fuel and explosive applications.
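
    The Flynn-Wall-Ozawa and Friedman estimates used above both reduce to a linear regression at fixed conversion. A minimal sketch under that reading is given below; the heating rates and temperatures in the usage example are invented for illustration and are not the paper's data.

        import numpy as np

        R = 8.314  # J mol^-1 K^-1

        def ozawa_fwo_activation_energy(betas, T_alpha):
            """Flynn-Wall-Ozawa: regress ln(beta) against 1/T at a fixed conversion.
            betas  : heating rates (K/min) of the runs
            T_alpha: absolute temperatures (K) at which each run reaches that conversion."""
            slope, _ = np.polyfit(1.0 / np.asarray(T_alpha), np.log(betas), 1)
            return -slope * R / 1.052          # J/mol (Doyle approximation)

        def friedman_activation_energy(rates, T_alpha):
            """Friedman: regress ln(dalpha/dt) against 1/T at a fixed conversion."""
            slope, _ = np.polyfit(1.0 / np.asarray(T_alpha), np.log(rates), 1)
            return -slope * R                   # J/mol

        # illustrative numbers only: temperatures at alpha = 0.5 for three heating rates
        betas = [5, 10, 20]                     # K/min
        T_half = [520.0, 533.0, 547.0]          # K
        print(ozawa_fwo_activation_energy(betas, T_half) / 1000, "kJ/mol")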

  1. Thermogravimetric and model-free kinetic studies on CO2 gasification of low-quality, high-sulphur Indian coals

    Science.gov (United States)

    Das, Tonkeswar; Saikia, Ananya; Mahanta, Banashree; Choudhury, Rahul; Saikia, Binoy K.

    2016-10-01

    Coal gasification with CO2 has emerged as a cleaner and more efficient way to produce energy, and it offers the advantages of CO2 mitigation policies through simultaneous CO2 sequestration. In the present investigation, a feasibility study on the gasification of three low-quality, high-sulphur coals from the north-eastern region (NER) of India in a CO2 atmosphere using thermogravimetric analysis (TGA-DTA) has been made in order to gain a better understanding of the physical and chemical characteristics of the coal gasification process. Model-free kinetics was applied to determine the activation energies (E) and pre-exponential factors (A) of the CO2 gasification process of the coals. Multivariate non-linear regression analyses were performed to find the formal mechanisms, kinetic model, and the corresponding kinetic triplets. The results revealed that coal gasification with CO2 mainly occurs in the temperature range of 800-1400 °C, with a maximum at around 1100 °C. The reaction mechanisms responsible for CO2 gasification of the coals were observed to be the 'nth order with autocatalysis (CnB)' and 'nth order (Fn)' mechanisms. The activation energy of the CO2 gasification was found to be in the range 129.07-146.81 kJ/mol.

  2. Thermogravimetric and model-free kinetic studies on CO2 gasification of low-quality, high-sulphur Indian coals

    Indian Academy of Sciences (India)

    Tonkeswar Das; Ananya Saikia; Banashree Mahanta; Rahul Choudhury; Binoy K Saikia

    2016-10-01

    Coal gasification with CO2 has emerged as a cleaner and more efficient way to produce energy, and it offers the advantages of CO2 mitigation policies through simultaneous CO2 sequestration. In the present investigation, a feasibility study on the gasification of three low-quality, high-sulphur coals from the north-eastern region (NER) of India in a CO2 atmosphere using thermogravimetric analysis (TGA-DTA) has been made in order to gain a better understanding of the physical and chemical characteristics of the coal gasification process. Model-free kinetics was applied to determine the activation energies (E) and pre-exponential factors (A) of the CO2 gasification process of the coals. Multivariate non-linear regression analyses were performed to find the formal mechanisms, kinetic model, and the corresponding kinetic triplets. The results revealed that coal gasification with CO2 mainly occurs in the temperature range of 800-1400 °C, with a maximum at around 1100 °C. The reaction mechanisms responsible for CO2 gasification of the coals were observed to be the 'nth order with autocatalysis (CnB)' and 'nth order (Fn)' mechanisms. The activation energy of the CO2 gasification was found to be in the range 129.07-146.81 kJ/mol.

  3. Towards a Judgement-Based Statistical Analysis

    Science.gov (United States)

    Gorard, Stephen

    2006-01-01

    There is a misconception among social scientists that statistical analysis is somehow a technical, essentially objective, process of decision-making, whereas other forms of data analysis are judgement-based, subjective and far from technical. This paper focuses on the former part of the misconception, showing, rather, that statistical analysis…

  4. Modelling free surface aquifers to analyze the interaction between groundwater and sinuous streams

    DEFF Research Database (Denmark)

    Balbarini, Nicola; Boon, W. M.; Bjerg, Poul Løgstrup;

    Several mathematical methods for modelling free surface aquifers are available. Aquifer-stream interaction is an important application of these models and is challenging to simulate because stream interaction is described by a highly variable head boundary, which can cause numerical instabilities and errors. In addition, when streams are sinuous, groundwater flow is truly 3-dimensional, with strong vertical flows and sharp changes in horizontal direction. Here, three different approaches to simulating free surface aquifers are compared for groundwater-stream interaction. The aim of the models was to investigate the effect of meander bends on the spatial and temporal variability of aquifer-stream interaction, and to develop a new 3D conceptual model of groundwater-stream interaction. Three mathematical methods were tested, representing the three main methods available for modeling 3D unconfined aquifers...

  5. Reachability Analysis of Sampling Based Planners

    NARCIS (Netherlands)

    Geraerts, R.J.; Overmars, M.H.

    2005-01-01

    The last decade, sampling based planners like the Probabilistic Roadmap Method have proved to be successful in solving complex motion planning problems. We give a reachability based analysis for these planners which leads to a better understanding of the success of the approach and enhancements of t

  6. Analysis of Enhanced Associativity Based Routing Protocol

    Directory of Open Access Journals (Sweden)

    Said A. Shaar

    2006-01-01

    Full Text Available This study introduces an analysis of the performance of the Enhanced Associativity Based Routing protocol (EABR) based on two factors: operation complexity (OC) and communication complexity (CC). OC can be defined as the number of steps required to perform a protocol operation, while CC can be defined as the number of messages exchanged in performing a protocol operation [1]. The values represent a worst-case analysis. EABR has been analyzed based on CC and OC, and the results have been compared with another routing technique called ABR. The results show that EABR can perform better than ABR in many circumstances during route reconstruction.

  7. Network Analysis of the Shanghai Stock Exchange Based on Partial Mutual Information

    Directory of Open Access Journals (Sweden)

    Tao You

    2015-06-01

    Full Text Available Analyzing social systems, particularly financial markets, using a complex network approach has become one of the most popular fields within econophysics. A similar trend is currently appearing within the econometrics and finance communities, as well. In this study, we present a state-of-the-art method for analyzing the structure and risk within stock markets, treating them as complex networks using model-free, nonlinear dependency measures based on information theory. This study is the first network analysis of the stock market in Shanghai using a nonlinear network methodology. Further, it is often assumed that markets outside the United States and Western Europe are inherently riskier. We find that the Chinese stock market is not structurally risky, contradicting this popular opinion. We use partial mutual information to create filtered networks representing the Shanghai stock exchange, comparing them to networks based on Pearson's correlation. Consequently, we discuss the structure and characteristics of both the presented methods and the Shanghai stock exchange. This paper provides an insight into the cutting-edge methodology designed for analyzing complex financial networks, as well as analyzing the structure of the market in Shanghai and, as such, is of interest to both researchers and financial analysts.

  8. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    B Yegnanarayana; Suryakanth V Gangashetty

    2011-10-01

    Speech analysis is traditionally performed using short-time analysis to extract features in time and frequency domains. The window size for the analysis is fixed somewhat arbitrarily, mainly to account for the time varying vocal tract system during production. However, speech in its primary mode of excitation is produced due to impulse-like excitation in each glottal cycle. Anchoring the speech analysis around the glottal closure instants (epochs) yields significant benefits for speech analysis. Epoch-based analysis of speech helps not only to segment the speech signals based on speech production characteristics, but also helps in accurate analysis of speech. It enables extraction of important acoustic-phonetic features such as glottal vibrations, formants, instantaneous fundamental frequency, etc. Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, and methods to extract the epoch information are reviewed. Applications of epoch extraction for some speech applications are demonstrated.

  9. Identifying Proper Names Based on Association Analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    The issue of proper name recognition in Chinese text was discussed. An automatic approach based on association analysis to extract rules from a corpus was presented. The method tries to discover rules relevant to external evidence by association analysis, without additional manual effort. These rules can be used to recognize the proper nouns in Chinese texts. The experimental result shows that our method is practical in some applications. Moreover, the method is language independent.

  10. Cloud Based Development Issues: A Methodical Analysis

    Directory of Open Access Journals (Sweden)

    Sukhpal Singh

    2012-11-01

    Full Text Available Cloud-based development is a challenging task for various software engineering projects, specifically for those which demand extraordinary quality, reusability and security along with general architecture. In this paper we present a report on a methodical analysis of cloud-based development problems published in major computer science and software engineering journals and conferences by various researchers. Research papers were collected from different scholarly databases using search engines within a particular period of time. A total of 89 research papers were analyzed in this methodical study and categorized into four classes according to the problems addressed by them. The majority of the research papers focused on quality (24 papers) associated with cloud-based development, and 16 papers focused on analysis and design. By considering the areas focused on by existing authors and the remaining gaps, untouched areas of cloud-based development can be identified for future research.

  11. Security Analysis of Discrete Logarithm Based Cryptosystems

    Institute of Scientific and Technical Information of China (English)

    WANG Yuzhu; LIAO Xiaofeng

    2006-01-01

    Discrete logarithm based cryptosystems have subtle problems that make the schemes vulnerable. This paper gives a comprehensive listing of security issues in the systems and analyzes three classes of attacks which are based on mathematical structure of the group which is used in the schemes, the disclosed information of the subgroup and implementation details respectively. The analysis will, in turn, allow us to motivate protocol design and implementation decisions.

  12. Social Network Analysis Based on Network Motifs

    OpenAIRE

    2014-01-01

    Based on the community structure characteristics and the theory and methods of frequent subgraph mining, network motif finding is first introduced into social network analysis; a tendentiousness evaluation function and an importance evaluation function are proposed for effectiveness assessment. Compared with the traditional approach based on node centrality degree, the new approach can be used to analyze the properties of a social network more fully and judge the roles of the nodes effectively. I...

  13. Examples of model-free implant restorations using Cerec inLab 4.0 software.

    Science.gov (United States)

    Reich, S; Schley, J; Kern, T; Fiedler, K; Wolfart, S

    2012-01-01

    This case report demonstrates two ways to fabricate model-free implant restorations with the Cerec inLab 4.0 software. Because the patient, a woman with a history of periodontal disease, did not wish to have a removable partial denture, implant therapy was planned for the restoration of her edentulous areas 14/15 and 24/25. In addition, the restoration was to provide functional relief of the natural maxillary anterior teeth. The two implants for the first quadrant were planned as single-tooth restorations. Each was designed as a full-contour implant suprastructure using the Cerec Biogeneric abutment design technique. After completing the design phase, each restoration proposal was split into two parts: a zirconia abutment and a lithium disilicate crown. For the restoration of the second quadrant, custom 20-degree-angled abutments were individualized and acquired with the Cerec camera. A block crown was then designed, milled in burn-out acrylic resin, and fabricated from a lithium disilicate glass-ceramic ingot according to the press ceramic technique. Additionally, methods of provisional restoration are discussed.

  14. Kinetic modelling of RDF pyrolysis: Model-fitting and model-free approaches.

    Science.gov (United States)

    Çepelioğullar, Özge; Haykırı-Açma, Hanzade; Yaman, Serdar

    2016-02-01

    In this study, refuse derived fuel (RDF) was selected as the solid fuel and was pyrolyzed in a thermal analyzer from room temperature to 900°C at heating rates of 5, 10, 20, and 50°C/min in an N2 atmosphere. The obtained thermal data were used to calculate the kinetic parameters using the Coats-Redfern, Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS) methods. Based on the Coats-Redfern model, the decomposition process was assumed to consist of four independent reactions with different reaction orders. On the other hand, the model-free methods demonstrated that the activation energy trend had similarities over the conversion ranges 0.1, 0.2-0.7 and 0.8-0.9. The average activation energies were found to be between 73 and 161 kJ/mol, and the FWO and KAS models produced results closer to the average activation energies than the Friedman model. The experimental studies showed that RDF may be a sustainable and promising feedstock for alternative processes in terms of waste management strategies.

  15. Abstraction based Analysis and Arbiter Synthesis

    DEFF Research Database (Denmark)

    Ernits, Juhan-Peep; Yi, Wang

    2004-01-01

    The work focuses on the analysis of an example of synchronous systems containing FIFO buffers, registers and memory interconnected by several private and shared busses. The example used in this work is based on a Terma radar system memory interface case study from the IST AMETIST project....

  16. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and the study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels have been known to correlate poorly. On the other hand, MS-based proteomics tends to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomics profiles. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.

  17. Workflow-based approaches to neuroimaging analysis.

    Science.gov (United States)

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  18. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge; Nielsen, Mads; Lo, Pechin Chien Pau

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based on measured lung function instead of on manually annotated regions of interest (ROIs). A quantitative measure of COPD is obtained by fusing COPD probabilities computed in ROIs within the lung fields, where the individual ROI probabilities are computed using a k nearest neighbor (kNN) classifier. The distance ... and subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density...

  19. Web-based pre-Analysis Tools

    CERN Document Server

    Moskalets, Tetiana

    2014-01-01

    The project consists of the initial development of web-based and cloud computing services to allow students and researchers to perform fast and very useful cut-based pre-analysis in a browser, using real data and official Monte-Carlo simulations (MC). Several tools are considered: a ROOT file filter, a JavaScript multivariable cross-filter, a JavaScript ROOT browser and JavaScript scatter-matrix libraries. Preliminary but satisfactory results have been deployed online for testing and future upgrades.

  20. Measuring Class Cohesion Based on Dependence Analysis

    Institute of Scientific and Technical Information of China (English)

    Zhen-Qiang Chen; Bao-Wen Xu; Yu-Ming Zhou

    2004-01-01

    Classes are the basic modules in object-oriented (OO) software, and they consist of attributes and methods. Thus, in an OO environment, cohesion is mainly about the tightness of the attributes and methods of classes. This paper discusses the relationships between attributes and attributes, attributes and methods, and methods and methods of a class based on dependence analysis. The paper then presents methods to compute these dependencies. Based on these, the paper proposes a method to measure class cohesion which satisfies the properties that a good measurement should have. The approach overcomes the limitations of previous class cohesion measures, which consider only one or two of the three relationships in a class.

  1. TEST COVERAGE ANALYSIS BASED ON PROGRAM SLICING

    Institute of Scientific and Technical Information of China (English)

    Chen Zhenqiang; Xu Baowen; Guanjie

    2003-01-01

    Coverage analysis is a structural testing technique that helps to eliminate gaps in a test suite and determines when to stop testing. To compute test coverage, this letter proposes a new concept, coverage about variables, based on program slicing. By adding powers according to their importance, the users can focus on the important variables to obtain higher test coverage. The letter presents methods to compute basic coverage based on program structure graphs. In most cases, the coverage obtained in the letter is bigger than that obtained by a traditional measure, because the coverage about a variable takes only the related code into account.

  2. Musical Structural Analysis Database Based on GTTM

    OpenAIRE

    Hamanaka, Masatoshi; Hirata, Keiji; Tojo, Satoshi

    2014-01-01

    In this paper, we present the publication of our analysis data and analysis tool based on the generative theory of tonal music (GTTM). Musical databases such as score databases, instrument sound databases, and musical pieces with standard MIDI files and annotated data are key to advancements in the field of music information technology. We started implementing the GTTM on a computer in 2004 and have ever since collected and publicized test data by musicologists in a step-by-step manner. In our ...

  3. Blind source separation for groundwater pressure analysis based on nonnegative matrix factorization

    Science.gov (United States)

    Alexandrov, Boian S.; Vesselinov, Velimir V.

    2014-09-01

    The identification of the physical sources causing spatial and temporal fluctuations of aquifer water levels is a challenging, yet a very important hydrogeological task. The fluctuations can be caused by variations in natural and anthropogenic sources such as pumping, recharge, barometric pressures, etc. The source identification can be crucial for conceptualization of the hydrogeological conditions and characterization of aquifer properties. We propose a new computational framework for model-free inverse analysis of pressure transients based on Nonnegative Matrix Factorization (NMF) method for Blind Source Separation (BSS) coupled with k-means clustering algorithm, which we call NMFk. NMFk is capable of identifying a set of unique sources from a set of experimentally measured mixed signals, without any information about the sources, their transients, and the physical mechanisms and properties controlling the signal propagation through the subsurface flow medium. Our analysis only requires information about pressure transients at a number of observation points, m, where m≥r, and r is the number of unknown unique sources causing the observed fluctuations. We apply this new analysis on a data set from the Los Alamos National Laboratory site. We demonstrate that the sources identified by NMFk have real physical origins: barometric pressure and water-supply pumping effects. We also estimate the barometric pressure efficiency of the monitoring wells. The possible applications of the NMFk algorithm are not limited to hydrogeology problems; NMFk can be applied to any problem where temporal system behavior is observed at multiple locations and an unknown number of physical sources are causing these fluctuations.
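
    A compact sketch of the NMFk idea described above, pairing non-negative matrix factorization restarts with k-means clustering of the recovered source signatures, is given below using scikit-learn; the robustness score (mean silhouette width) and the candidate range of source numbers are illustrative choices, and the input matrix is assumed to be non-negative (e.g., shifted pressure records).

        import numpy as np
        from sklearn.decomposition import NMF
        from sklearn.cluster import KMeans
        from sklearn.metrics import silhouette_score

        def nmfk_estimate(X, r_range=range(2, 6), n_restarts=20, random_state=0):
            """For each candidate number of sources r, run NMF from many random starts,
            cluster the normalized source signatures with k-means, and use the mean
            silhouette width as a robustness score; a high score suggests r real sources."""
            rng = np.random.default_rng(random_state)
            scores = {}
            for r in r_range:
                signatures = []
                for _ in range(n_restarts):
                    model = NMF(n_components=r, init="random",
                                random_state=int(rng.integers(1_000_000)), max_iter=500)
                    W = model.fit_transform(X)          # mixing matrix (wells x sources)
                    H = model.components_               # source transients (sources x time)
                    H = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-12)
                    signatures.append(H)
                S = np.vstack(signatures)               # (n_restarts * r) x time
                labels = KMeans(n_clusters=r, n_init=10,
                                random_state=0).fit_predict(S)
                scores[r] = silhouette_score(S, labels)
            return scores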

  4. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  5. STELLAR LOCI II. A MODEL-FREE ESTIMATE OF THE BINARY FRACTION FOR FIELD FGK STARS

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Haibo; Liu, Xiaowei [Kavli Institute for Astronomy and Astrophysics, Peking University, Beijing 100871 (China); Xiang, Maosheng; Huang, Yang; Chen, Bingqiu [Department of Astronomy, Peking University, Beijing 100871 (China); Wu, Yue [Key Laboratory of Optical Astronomy, National Astronomical Observatories, Chinese Academy of Sciences, Beijing 100012 (China); Hou, Yonghui; Zhang, Yong, E-mail: yuanhb4861@pku.edu.cn, E-mail: x.liu@pku.edu.cn [Nanjing Institute of Astronomical Optics and Technology, National Astronomical Observatories, Chinese Academy of Sciences, Nanjing 210042 (China)

    2015-02-01

    We propose a stellar locus outlier (SLOT) method to determine the binary fraction of main-sequence stars statistically. The method is sensitive to neither the period nor mass ratio distributions of binaries and is able to provide model-free estimates of binary fraction for large numbers of stars of different populations in large survey volumes. We have applied the SLOT method to two samples of stars from the Sloan Digital Sky Survey (SDSS) Stripe 82, constructed by combining the recalibrated SDSS photometric data with the spectroscopic information from the SDSS and LAMOST surveys. For the SDSS spectroscopic sample, we find an average binary fraction for field FGK stars of 41% ± 2%. The fractions decrease toward late spectral types and are 44% ± 5%, 43% ± 3%, 35% ± 5%, and 28% ± 6% for stars with g – i colors in the range 0.3-0.6 mag, 0.6-0.9 mag, 0.9-1.2 mag, and 1.2-1.6 mag, respectively. A modest metallicity dependence is also found. The fraction decreases with increasing metallicity. For stars with [Fe/H] between –0.5 and 0.0 dex, –1.0 and –0.5 dex, –1.5 and –1.0 dex, and –2.0 and –1.5 dex, the inferred binary fractions are 37% ± 3%, 39% ± 3%, 50% ± 9%, and 53% ± 20%, respectively. We have further divided the sample into stars from the thin disk, the thick disk, the transition zone between them, and the halo. The results suggest that the Galactic thin and thick disks have comparable binary fractions, whereas the Galactic halo contains a significantly larger fraction of binaries. Applying the method to the LAMOST spectroscopic sample yields consistent results. Finally, other potential applications and future work with the method are discussed.

  6. Model-free reconstruction of excitatory neuronal connectivity from calcium imaging signals.

    Directory of Open Access Journals (Sweden)

    Olav Stetter

    Full Text Available A systematic assessment of global neural network connectivity through direct electrophysiological assays has remained technically infeasible, even in simpler systems like dissociated neuronal cultures. We introduce an improved algorithmic approach based on Transfer Entropy to reconstruct structural connectivity from network activity monitored through calcium imaging. We focus in this study on the inference of excitatory synaptic links. Based on information theory, our method requires no prior assumptions on the statistics of neuronal firing and neuronal connections. The performance of our algorithm is benchmarked on surrogate time series of calcium fluorescence generated by the simulated dynamics of a network with known ground-truth topology. We find that the functional network topology revealed by Transfer Entropy depends qualitatively on the time-dependent dynamic state of the network (bursting or non-bursting. Thus by conditioning with respect to the global mean activity, we improve the performance of our method. This allows us to focus the analysis to specific dynamical regimes of the network in which the inferred functional connectivity is shaped by monosynaptic excitatory connections, rather than by collective synchrony. Our method can discriminate between actual causal influences between neurons and spurious non-causal correlations due to light scattering artifacts, which inherently affect the quality of fluorescence imaging. Compared to other reconstruction strategies such as cross-correlation or Granger Causality methods, our method based on improved Transfer Entropy is remarkably more accurate. In particular, it provides a good estimation of the excitatory network clustering coefficient, allowing for discrimination between weakly and strongly clustered topologies. Finally, we demonstrate the applicability of our method to analyses of real recordings of in vitro disinhibited cortical cultures where we suggest that excitatory connections
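
    A minimal plug-in estimator of transfer entropy for quantized signals, in the spirit of the method above (though without its conditioning on network state or the fluorescence forward model), might look like the following sketch; the bin count and the function name are assumptions.

        import numpy as np
        from collections import Counter

        def transfer_entropy(x, y, bins=3):
            """Plug-in estimate of transfer entropy TE(X -> Y) (in nats) for two
            1-D signals, after quantizing each into `bins` amplitude levels."""
            xq = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
            yq = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
            triples = Counter(zip(yq[1:], yq[:-1], xq[:-1]))   # (y_next, y_now, x_now)
            pairs_yy = Counter(zip(yq[1:], yq[:-1]))           # (y_next, y_now)
            pairs_yx = Counter(zip(yq[:-1], xq[:-1]))          # (y_now, x_now)
            singles_y = Counter(yq[:-1])
            n = sum(triples.values())
            te = 0.0
            for (y1, y0, x0), c in triples.items():
                p_joint = c / n
                p_y1_given_y0x0 = c / pairs_yx[(y0, x0)]
                p_y1_given_y0 = pairs_yy[(y1, y0)] / singles_y[y0]
                te += p_joint * np.log(p_y1_given_y0x0 / p_y1_given_y0)
            return te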

  7. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint- or iris-based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. Using a background subtraction algorithm, the moving silhouette can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. Principal component analysis is used to reduce the dimensionality of the features. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.

  8. Distributed Learning, Extremum Seeking, and Model-Free Optimization for the Resilient Coordination of Multi-Agent Adversarial Groups

    Science.gov (United States)

    2016-09-07

    AFRL-AFOSR-VA-TR-2016-0314: Distributed learning, extremum seeking, and model-free optimization for the resilient coordination of multi-agent adversarial groups.

  9. Electric Equipment Diagnosis based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Stavitsky Sergey A.

    2016-01-01

    Full Text Available As electrical equipment develops and becomes more complex, precise and intensive diagnostics are necessary. Nowadays there are two basic approaches to diagnosis: analog signal processing and digital signal processing. The latter is preferable. Alongside the basic methods of digital signal processing (the Fourier transform and the fast Fourier transform), one of the modern methods is based on the wavelet transform. This research is dedicated to analyzing the characteristic features and advantages of the wavelet transform. This article shows ways of using wavelet analysis and the process of converting a test signal. To carry out this analysis, the Mathcad software was used and a 2D wavelet spectrum for a complex function was created.
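
    As a small companion to the discussion above, the sketch below decomposes a synthetic test signal with the PyWavelets package, using both a discrete wavelet decomposition and a continuous wavelet transform; the signal, wavelet choices, and levels are illustrative, and the original article used Mathcad rather than Python.

        import numpy as np
        import pywt

        # Test signal: two tones plus a short transient spike, loosely in the spirit
        # of a diagnostic signal (all values are illustrative).
        fs = 1000.0
        t = np.arange(0, 1.0, 1.0 / fs)
        signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
        signal[400:410] += 3.0                     # fault-like transient

        # Discrete wavelet decomposition (Daubechies-4, 4 levels)
        coeffs = pywt.wavedec(signal, "db4", level=4)
        for i, c in enumerate(coeffs):
            label = "approximation" if i == 0 else f"detail level {len(coeffs) - i}"
            print(f"{label}: {len(c)} coefficients, energy {np.sum(c**2):.2f}")

        # Continuous wavelet transform for a time-frequency picture
        scales = np.arange(1, 64)
        cwt_coeffs, freqs = pywt.cwt(signal, scales, "morl", sampling_period=1.0 / fs)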

  10. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  11. Constructing storyboards based on hierarchical clustering analysis

    Science.gov (United States)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There is a growing need for quick previews of video content, both to improve the accessibility of video archives and to reduce network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent use of extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
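
    A minimal sketch of the keyframe-selection step described above, hierarchical (Ward) clustering of per-frame feature vectors followed by picking the frame nearest each cluster centroid, is given below; the random feature vectors in the usage lines stand in for the wavelet-based features of the paper.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        def select_keyframes(features, n_keyframes):
            """Pick one representative frame per cluster from per-frame feature
            vectors (e.g. coarse wavelet coefficients), using Ward linkage."""
            Z = linkage(features, method="ward")
            labels = fcluster(Z, t=n_keyframes, criterion="maxclust")
            keyframes = []
            for k in range(1, n_keyframes + 1):
                idx = np.where(labels == k)[0]
                centroid = features[idx].mean(axis=0)
                # the frame closest to its cluster centroid represents the cluster
                keyframes.append(idx[np.argmin(np.linalg.norm(features[idx] - centroid, axis=1))])
            return sorted(keyframes)

        # toy usage with random "feature vectors" for 300 frames
        feats = np.random.default_rng(0).normal(size=(300, 16))
        print(select_keyframes(feats, n_keyframes=8))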

  12. Arabic Interface Analysis Based on Cultural Markers

    CERN Document Server

    Khanum, Mohammadi Akheela; Chaurasia, Mousmi A

    2012-01-01

    This study examines the Arabic interface design elements that are largely influenced by cultural values. Cultural markers are examined in websites from the educational, business, and media sectors. The cultural values analysis is based on Geert Hofstede's cultural dimensions. The findings show that there are cultural markers which are largely influenced by the culture and that Hofstede's scores for Arab countries are partially supported by the website design components examined in this study. Moderate support was also found for long-term orientation, for which Hofstede has no score.

  13. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, in which the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well-known block matching methods (full search, 2D-log, and methods based on the conventional cross-correlation (CC) function or the phase correlation (PC) function) for the application of crowd motion estimation are also presented.

  14. Model-free Adaptive Control with Tight Format of a Linear Motor

    Institute of Scientific and Technical Information of China (English)

    李萍; 曹健

    2014-01-01

    The model-free adaptive control (MFAC) approach for nonlinear systems based on tight-format linearization was applied to the control of a linear motor. Using the concepts of pseudo partial derivative and pseudo order, the nonlinear system model of the linear motor is replaced with a tight-format dynamic linear time-varying model, and the pseudo partial derivative is estimated online from the input and output data of the linear motor motion model. The simulation results show that the tight-format model-free controller has strong adaptability, disturbance rejection, stability, and robustness for a motor with imprecisely known dynamics, and that it solves the control problem posed by the nonlinearity and uncertainty of the linear motor.
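
    For readers unfamiliar with tight-format (compact-form) MFAC, the sketch below shows the standard pseudo-partial-derivative estimator and control law applied to a toy single-input single-output plant; the plant and all gains are illustrative assumptions and do not reproduce the linear-motor model of the paper.

        import numpy as np

        def cfdl_mfac(plant, y_ref, phi0=1.0, eta=0.5, mu=1.0, rho=0.6, lam=1.0, eps=1e-5):
            """Compact-form (tight-format) MFAC for a SISO plant known only through
            its input/output data; plant(u) advances the unknown system one step and
            returns its output. All gains are illustrative tuning values."""
            n = len(y_ref)
            u = np.zeros(n)
            y = np.zeros(n)
            phi = np.full(n, phi0)
            y[0] = plant(u[0])
            for k in range(1, n):
                if k == 1:
                    du, dy = u[0], y[0]
                else:
                    du, dy = u[k - 1] - u[k - 2], y[k - 1] - y[k - 2]
                # pseudo-partial-derivative estimation from past I/O increments
                phi[k] = phi[k - 1] + eta * du * (dy - phi[k - 1] * du) / (mu + du * du)
                if abs(phi[k]) < eps or np.sign(phi[k]) != np.sign(phi0):
                    phi[k] = phi0                          # reset mechanism
                # control law driven only by the tracking error
                u[k] = u[k - 1] + rho * phi[k] * (y_ref[k] - y[k - 1]) / (lam + phi[k] ** 2)
                y[k] = plant(u[k])
            return u, y

        # toy usage: unknown first-order nonlinear plant, unit step reference
        state = {"y": 0.0}
        def toy_plant(u):
            state["y"] = 0.7 * state["y"] + 0.4 * u + 0.1 * np.sin(state["y"])
            return state["y"]
        u, y = cfdl_mfac(toy_plant, y_ref=np.ones(100))
        print(y[-5:])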

  15. Visibility Graph Based Time Series Analysis

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and at the same time a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks. PMID:26571115
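
    The natural visibility criterion behind the method above is easy to state: two samples are linked if no intermediate sample rises above the straight line joining them. A direct (O(n^3)) sketch is given below; the random-walk usage example is illustrative only.

        import numpy as np

        def natural_visibility_graph(series):
            """Adjacency matrix of the natural visibility graph of a 1-D series:
            samples a and b 'see' each other if no intermediate sample rises above
            the straight line joining them."""
            y = np.asarray(series, dtype=float)
            n = len(y)
            adj = np.zeros((n, n), dtype=bool)
            for a in range(n - 1):
                for b in range(a + 1, n):
                    visible = True
                    for c in range(a + 1, b):
                        if y[c] >= y[b] + (y[a] - y[b]) * (b - c) / (b - a):
                            visible = False
                            break
                    if visible:
                        adj[a, b] = adj[b, a] = True
            return adj

        # toy usage: degree sequence of the graph built from a short random walk
        walk = np.cumsum(np.random.default_rng(2).normal(size=200))
        print(natural_visibility_graph(walk).sum(axis=0)[:10])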

  16. Monitoring composite curing using nonlinear model-free estimators with infrared spectroscopy data

    Science.gov (United States)

    Tung, Sanhuang

    Composite materials (e.g., fiber-reinforced epoxy resin) have many advantages over conventional materials. However, costs associated with product scrap and post-process inspections make them too expensive to be widely used. The key to lowering the cost is knowledge of online product information, which is very difficult to estimate with mathematical models due to the complexity of simultaneous competing curing reactions. Real-time process monitoring techniques using nondestructive evaluation (NDE) sensors offer a more feasible approach to obtaining this information. This dissertation is a study of methods for predicting mix ratio and degree-of-cure in epoxy/amine curing processes. In situ IR spectroscopy was the NDE technique used, and all data were collected from small-scale experiments simulating the resin transfer molding (RTM) process at three curing temperatures and four mix ratios. The RTM process can be described as two major phases, i.e., mixing/injection and curing, and these two phases lead to the two problems we studied. The first problem is to predict the composition of epoxy/amine mixtures. The second, and more important, problem is to predict degree-of-cure during epoxy curing. In situ IR spectroscopy data were used as the predicting variables for the quality properties of interest. We studied the issues of dealing with complex IR spectra, which in our case have 1751 wavenumbers, and developed new nonlinear estimators that make more accurate predictions than current estimators. Dimension reduction and nonlinear mapping are two important steps in making accurate predictions from complex IR spectroscopy data. We evaluated current techniques for reducing dimensionality by extracting features from correlated data and removing irrelevant data. We also evaluated linear and nonlinear mapping techniques from both projection-based and kernel-based methods. Our research came up with better estimators with respect to prediction

  17. Model-free functional connectivity and impulsivity correlates of alcohol dependence: a resting-state study.

    Science.gov (United States)

    Zhu, Xi; Cortes, Carlos R; Mathur, Karan; Tomasi, Dardo; Momenan, Reza

    2017-01-01

    Alcohol dependence is characterized by impulsiveness toward consumption despite negative consequences. Although neuro-imaging studies have implicated some regions underlying this disorder, there is little information regarding its large-scale connectivity pattern. This study investigated the within- and between-network functional connectivity (FC) in alcohol dependence and examined its relationship with clinical impulsivity measures. Using probabilistic independent component analysis on resting-state functional magnetic resonance imaging (rs-fMRI) data from 25 alcohol-dependent (AD) and 26 healthy control (HC) participants, we compared the within- and between-network FC between AD and HC. Then, the relationship between FC and impulsiveness as measured by the Barratt Impulsiveness Scale (BIS-11), the UPPS-P Impulsive Scale and the delay discounting task (DDT), was explored. Compared with HC, AD exhibited increased within-network FC in salience (SN), default mode (DMN), orbitofrontal cortex (OFCN), left executive control (LECN) and amygdala-striatum (ASN) networks. Increased between-network FC was found among LECN, ASN and SN. Between-network FC correlations were significantly negative between Negative-Urgency and OFCN pairs with right executive control network (RECN), anterior DMN (a-DMN) and posterior DMN (p-DMN) in AD. DDT was significantly correlated with the between-network FC among the LECN, a-DMN and SN in AD. These findings add evidence to the concept of altered within-network FC and also highlight the role of between-network FC in the pathophysiology of AD. Additionally, this study suggests differential neurobiological bases for different clinical measures of impulsivity that may be used as a systems-level biomarker for alcohol dependence severity and treatment efficacy.

  18. Watermark Resistance Analysis Based On Linear Transformation

    Directory of Open Access Journals (Sweden)

    N.Karthika Devi

    2012-06-01

    Full Text Available Generally, a digital watermark can be embedded in any copyrighted image, provided the watermark is no larger than the cover image. Watermarking schemes can be classified into two categories: spatial domain approaches and transform domain approaches. Previous work has shown that transform domain schemes are typically more robust to noise, common image processing, and compression than spatial domain schemes. Improvements in the performance of watermarking schemes can be obtained by exploiting the characteristics of the human visual system (HVS) in the watermarking process. We propose a linear transformation based watermarking algorithm. The watermark bits are embedded into a cover image to produce the watermarked image. The robustness of the watermark is checked using pre-defined attacks. Attack resistance analysis is done using BER (Bit Error Rate) calculation. Finally, the quality of the watermarked image is assessed.
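
    As a concrete illustration of the BER metric mentioned above, the sketch below compares an embedded watermark bit stream with the bits recovered after an attack. The bit pattern and the 5% corruption rate are made up for demonstration and do not reproduce the paper's attack suite.

```python
# Bit error rate (BER) between embedded and extracted watermark bits; a generic
# robustness metric, not the authors' exact attack pipeline.
import numpy as np

def bit_error_rate(embedded_bits, extracted_bits):
    """Fraction of watermark bits flipped by an attack."""
    embedded = np.asarray(embedded_bits, dtype=int)
    extracted = np.asarray(extracted_bits, dtype=int)
    return np.mean(embedded != extracted)

watermark = np.random.randint(0, 2, size=1024)   # embedded watermark bits
recovered = watermark.copy()
flip = np.random.rand(1024) < 0.05               # simulate a 5% corrupting attack
recovered[flip] ^= 1
print(f"BER after attack: {bit_error_rate(watermark, recovered):.3f}")
```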

  19. Web Template Extraction Based on Hyperlink Analysis

    Directory of Open Access Journals (Sweden)

    Julián Alarte

    2015-01-01

    Full Text Available Web templates are one of the main development resources for website engineers. Templates allow them to increase productivity by plugging content into already formatted and prepared pagelets. For the final user, templates are also useful because they provide uniformity and a common look and feel for all webpages. However, from the point of view of crawlers and indexers, templates are an important problem, because templates usually contain irrelevant information such as advertisements, menus, and banners. Processing and storing this information is likely to lead to a waste of resources (storage space, bandwidth, etc.). It has been measured that templates represent between 40% and 50% of the data on the Web. Therefore, identifying templates is essential for indexing tasks. In this work we propose a novel method for automatic template extraction that is based on similarity analysis between the DOM trees of a collection of webpages that are detected using menu information. Our implementation and experiments demonstrate the usefulness of the technique, as shown in the sketch below.
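
    To make the DOM-similarity idea concrete, here is a minimal sketch that compares two pages by the overlap of their root-to-node tag paths. The Jaccard measure and the toy pages are illustrative stand-ins for the paper's DOM-tree comparison, not its actual algorithm.

```python
# Rough DOM-similarity sketch for template detection: pages built from the same
# template tend to share most of their tag paths. Standard library only.
from html.parser import HTMLParser

class TagPathCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack, self.paths = [], set()
    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)
        self.paths.add("/".join(self.stack))
    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()

def tag_paths(html):
    collector = TagPathCollector()
    collector.feed(html)
    return collector.paths

def similarity(html_a, html_b):
    a, b = tag_paths(html_a), tag_paths(html_b)
    return len(a & b) / len(a | b) if a | b else 1.0

page1 = "<html><body><div class='menu'><ul><li>Home</li></ul></div><p>News A</p></body></html>"
page2 = "<html><body><div class='menu'><ul><li>Home</li></ul></div><p>News B</p></body></html>"
print(f"template similarity: {similarity(page1, page2):.2f}")
```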

  20. Visual Similarity Based Document Layout Analysis

    Institute of Scientific and Technical Information of China (English)

    Di Wen; Xiao-Qing Ding

    2006-01-01

    In this paper, a visual similarity based document layout analysis (DLA) scheme is proposed which, by using a clustering strategy, can adaptively deal with documents in different languages, with different layout structures and skew angles. Aiming at a robust and adaptive DLA approach, the authors first find a set of representative filters and statistics to characterize typical texture patterns in document images through a visual similarity testing process. Texture features are then extracted from these filters and passed into a dynamic clustering procedure, which is called visual similarity clustering. Finally, text contents are located from the clustered results. Benefiting from this scheme, the algorithm demonstrates strong robustness and adaptability over a wide variety of documents, which previous traditional DLA approaches do not possess.

  1. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images and that do not take advantage of its primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To
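
    The voxel-based processing described above can be illustrated with a minimal point-cloud binning step. The grid resolution and the synthetic scene below are placeholders; the dissertation's voxel model additionally carries radiometric and transmission attributes recovered from the sensor geometry.

```python
# Minimal voxelization of a LIDAR-style point cloud: bin 3D points into a
# regular grid and count the returns that fall in each voxel.
import numpy as np

def voxelize(points, voxel_size):
    """points: (N, 3) array of x, y, z; returns dict {(i, j, k): return count}."""
    origin = points.min(axis=0)
    idx = np.floor((points - origin) / voxel_size).astype(int)
    voxels = {}
    for key in map(tuple, idx):
        voxels[key] = voxels.get(key, 0) + 1
    return voxels

cloud = np.random.uniform(0, 10, size=(5000, 3))   # synthetic 10 m x 10 m x 10 m scene
grid = voxelize(cloud, voxel_size=0.5)
print(f"{len(grid)} occupied voxels, max returns per voxel: {max(grid.values())}")
```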

  2. Interactive analysis of geodata based intelligence

    Science.gov (United States)

    Wagner, Boris; Eck, Ralf; Unmüessig, Gabriel; Peinsipp-Byma, Elisabeth

    2016-05-01

    When a spatiotemporal event happens, multi-source intelligence data are gathered to understand the problem, and strategies for solving the problem are investigated. The difficulties arising from handling spatial and temporal intelligence data represent the main problem. The map may be the bridge to visualize the data and to provide the most understandable model for all stakeholders. For the analysis of geodata based intelligence data, a software environment was developed that combines geodata with optimized ergonomics, which essentially facilitates interaction with the common operational picture (COP). The composition of the COP is based on geodata services, which are normalized by international standards of the Open Geospatial Consortium (OGC). The basic geodata are combined with intelligence data from images (IMINT) and humans (HUMINT), stored in a NATO Coalition Shared Data Server (CSD). These intelligence data can be combined with further information sources, i.e., live sensors. As a result, a COP is generated and an interaction suitable for the specific workspace is added. This allows the users to work interactively with the COP, i.e., searching with an on-board CSD client for suitable intelligence data and integrating them into the COP. Furthermore, users can enrich the scenario with findings from the data of interactive live sensors and add data from other sources. This allows intelligence services to contribute effectively to the process by which military and disaster management are organized.

  3. Operating cost analysis of anaesthesia: Activity based costing (ABC analysis

    Directory of Open Access Journals (Sweden)

    Majstorović Branislava M.

    2011-01-01

    Full Text Available Introduction. Costs of anaesthesiology represent defined measures to determine a precise profile of expenditure estimation for surgical treatment, which is important for planning healthcare activities, prices and budget. Objective. In order to determine the actual value of anaesthesiological services, we performed an activity based costing (ABC) analysis. Methods. Retrospectively, in 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supply materials and other costs: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anaesthetized patients of both sexes and all ages. We compared direct costs with direct expenditure, “each cost object (service or unit)”, of the Republican Health-care Insurance. Summary data of the Departments of Anaesthesia documented in the database of the Clinical Centre of Serbia were used; the numerical data were processed and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Health-care Insurance. Results. Of the direct costs, 40% were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Health-care Insurance. Conclusion. Anaesthesia increases the cost of a patient's surgical treatment by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. Fixed elements of direct costs provide the possibility of rationalization of resources in anaesthesia.

  4. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and causes billions of dollars in damage each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. In regard to this new analysis framework, a series of analysis techniques for automatic malware analysis

  5. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon;

    2014-01-01

    The integration of species distributions and evolutionary relationships is one of the most rapidly moving research fields today and has led to considerable advances in our understanding of the processes underlying biogeographical patterns. Here, we develop a set of metrics, the specific overrepresentation score ..., illustrated with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set of intuitively interpretable patterns that are consistent with current biogeographical knowledge. Importantly, the results are statistically tractable, opening many possibilities for their use in analyses of evolutionary, historical and spatial patterns of species diversity. The method is implemented ...

  6. A Translation Case Analysis Based on Skopos Theory

    Institute of Scientific and Technical Information of China (English)

    盖孟姣

    2015-01-01

    This paper is a translation case analysis based on Skopos Theory. It chooses President Xi's 2015 New Year congratulations as the text for analysis and gives a case analysis, focusing on translating the text based on Skopos Theory.

  7. Using Willie's Acid-Base Box for Blood Gas Analysis

    Science.gov (United States)

    Dietz, John R.

    2011-01-01

    In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO[subscript 2].

  8. ANALYSIS OF CIRCUIT TOLERANCE BASED ON RANDOM SET THEORY

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Monte Carlo analysis has been an accepted method for circuit tolerance analysis, but its heavy computational burden has always limited its application. Based on random set theory, this paper presents a simple and flexible tolerance analysis method to estimate circuit yield. It is an alternative to Monte Carlo analysis that reduces the number of calculations dramatically.
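
    For contrast with the random-set approach, the baseline Monte Carlo yield estimate the record refers to can be sketched as below. The RC low-pass filter, the component tolerances and the specification window are invented for illustration.

```python
# Conventional Monte Carlo yield estimate for a toy RC low-pass filter; this is
# the computation-heavy baseline, not the random-set method of the record.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
R = rng.normal(1e3, 0.05 * 1e3, n)         # 1 kOhm resistor, +/- 5% tolerance
C = rng.normal(100e-9, 0.10 * 100e-9, n)   # 100 nF capacitor, +/- 10% tolerance
fc = 1.0 / (2 * np.pi * R * C)             # cutoff frequency of each sampled circuit

in_spec = (fc > 1400.0) & (fc < 1800.0)    # hypothetical spec window in Hz
print(f"estimated yield: {in_spec.mean():.3f}")
```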

  9. Model-Free Adaptive Control for the Ball-Joint Robot Driven by PMA Group

    Institute of Scientific and Technical Information of China (English)

    刘昱; 王涛; 范伟; 王渝

    2013-01-01

    A triaxial ball-joint robot driven by a PMA (pneumatic muscle actuator) group is designed, and its inverse kinematics is analyzed. On the basis of the inverse kinematics analysis, a control strategy for the PMA group based on nonlinear feedback is proposed. The PMA system is a strongly nonlinear, time-varying, self-balancing system. Because it is difficult to overcome the conflict among regulating speed, stability and control accuracy with traditional control methods, an improved model-free adaptive controller is introduced. Experiments show that, while ensuring stability, the controller improves the regulating speed and control accuracy of the system, keeping the steady-state error below 0.3° and achieving a good control result.
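
    For readers unfamiliar with model-free adaptive control, the sketch below shows the generic compact-form SISO scheme: online estimation of a pseudo partial derivative followed by the control update. It follows the standard textbook formulation rather than the improved controller of this record, and the plant, gains and set-point are invented for demonstration.

```python
# Generic compact-form model-free adaptive control (MFAC) loop: the pseudo
# partial derivative (PPD) phi is estimated online and drives the control update.
import numpy as np

eta, mu, rho, lam, eps = 0.5, 1.0, 0.6, 1.0, 1e-5
phi0 = 1.0

def plant(y, u):
    # hypothetical nonlinear SISO plant used only to exercise the controller
    return 0.6 * y + 0.3 * np.tanh(u) + 0.1 * u

N = 200
y = np.zeros(N); u = np.zeros(N); phi = np.full(N, phi0)
y_ref = 1.0  # constant set-point

for k in range(1, N - 1):
    du, dy = u[k - 1] - u[k - 2], y[k] - y[k - 1]
    # PPD estimation (projection algorithm)
    phi[k] = phi[k - 1] + eta * du / (mu + du**2) * (dy - phi[k - 1] * du)
    if abs(phi[k]) <= eps or abs(du) <= eps:
        phi[k] = phi0                      # reset to keep the estimate well-posed
    # control update using the estimated PPD
    u[k] = u[k - 1] + rho * phi[k] * (y_ref - y[k]) / (lam + phi[k]**2)
    y[k + 1] = plant(y[k], u[k])

print(f"final output: {y[-1]:.3f} (set-point {y_ref})")
```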

  10. Transfer entropy--a model-free measure of effective connectivity for the neurosciences.

    Science.gov (United States)

    Vicente, Raul; Wibral, Michael; Lindner, Michael; Pipa, Gordon

    2011-02-01

    Understanding causal relationships, or effective connectivity, between parts of the brain is of utmost importance because a large part of the brain's activity is thought to be internally generated and, hence, quantifying stimulus-response relationships alone does not fully describe brain dynamics. Past efforts to determine effective connectivity mostly relied on model-based approaches such as Granger causality or dynamic causal modeling. Transfer entropy (TE) is an alternative measure of effective connectivity based on information theory. TE does not require a model of the interaction and is inherently non-linear. We investigated the applicability of TE as a metric in a test for effective connectivity on electrophysiological data, based on simulations and magnetoencephalography (MEG) recordings in a simple motor task. In particular, we demonstrate that TE improved the detectability of effective connectivity for non-linear interactions, and for sensor-level MEG signals where linear methods are hampered by signal cross-talk due to volume conduction.
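
    A minimal plug-in estimate of transfer entropy for discrete-valued series is sketched below, to make the definition TE(X→Y) = Σ p(y_{t+1}, y_t, x_t) log[p(y_{t+1}|y_t, x_t) / p(y_{t+1}|y_t)] concrete. Real MEG analyses require state-space embedding, continuous estimators and bias correction, none of which are attempted here.

```python
# Histogram (plug-in) estimate of transfer entropy TE(X -> Y) for discrete series.
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    triples = Counter(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    pairs_yy = Counter(zip(y[1:], y[:-1]))          # (y_{t+1}, y_t)
    pairs_yx = Counter(zip(y[:-1], x[:-1]))         # (y_t, x_t)
    singles_y = Counter(y[:-1])                     # y_t
    n = len(y) - 1
    te = 0.0
    for (y_next, y_now, x_now), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y_now, x_now)]
        p_cond_self = pairs_yy[(y_next, y_now)] / singles_y[y_now]
        te += p_joint * np.log2(p_cond_full / p_cond_self)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 5000)
y = np.roll(x, 1)            # y copies x with a one-step lag -> strong X -> Y transfer
print(f"TE(X->Y) = {transfer_entropy(x, y):.3f} bits")
```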

  11. Image based performance analysis of thermal imagers

    Science.gov (United States)

    Wegner, D.; Repasi, E.

    2016-05-01

    Due to advances in technology, modern thermal imagers resemble sophisticated image processing systems in functionality. Advanced signal and image processing tools enclosed in the camera body extend the basic image capturing capability of thermal cameras, in order to enhance the display presentation of the captured scene or of specific scene details. Usually, the implemented methods are proprietary company expertise, distributed without extensive documentation. This makes the comparison of thermal imagers, especially from different companies, a difficult task (or at least a very time-consuming/expensive task, e.g., requiring the execution of a field trial and/or an observer trial). For example, a thermal camera equipped with turbulence mitigation capability represents such a closed system. The Fraunhofer IOSB has started to build up a system for testing thermal imagers by image-based methods in the lab environment. This will extend our capability of measuring the classical IR-system parameters (e.g., MTF, MTDP, etc.) in the lab. The system is set up around the IR scene projector, which is necessary for the thermal display (projection) of an image sequence for the IR camera under test. The same set of thermal test sequences can be presented to every unit under test; for turbulence mitigation tests, this could be, e.g., the same turbulence sequence. During system tests, gradual variation of input parameters (e.g., thermal contrast) can be applied. First ideas on test scene selection and on how to assemble an imaging suite (a set of image sequences) for the analysis of thermal imaging systems containing such black boxes in the image-forming path are discussed.

  12. Design Intelligent Model-free Hybrid Guidance Controller for Three Dimension Motor

    Directory of Open Access Journals (Sweden)

    Abdol Majid Mirshekaran

    2014-10-01

    Full Text Available A minimum rule base Proportional Integral Derivative (PID) fuzzy hybrid guidance controller for a three-dimensional spherical motor is presented in this research. A three-dimensional spherical motor is well equipped with conventional control techniques and, in particular, various PID controllers which demonstrate good performance and successfully solve different guidance problems. Guidance control in a three-dimensional spherical motor is performed by PID controllers producing the control signals which are applied to the system's torque. The necessary reference inputs for a PID controller are usually supplied by the system's sensors based on different data. The popularity of the PID fuzzy hybrid guidance controller can be attributed to its robust performance in a wide range of operating conditions and partly to its functional simplicity. The PID methodology has three inputs; if each input is described by seven linguistic values and each rule has three conditions, 343 rules are needed, which is too many to write by hand. In this research the PID-like fuzzy controller is therefore constructed as a parallel structure of a PD-like fuzzy controller and a conventional PI controller to obtain the minimum rule base. A linear-type PID controller is used to modify the PID fuzzy logic design into a hybrid guidance methodology. This research aims to reduce or eliminate the problems of fuzzy and conventional PID controllers using minimum rule base fuzzy logic theory modified by the PID method, to control the spherical motor system, and to test the quality of process control in the MATLAB/SIMULINK simulation environment.

  13. A Requirements Analysis Model Based on QFD

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi-wei; Nelson K.H.Tang

    2004-01-01

    The enterprise resource planning (ERP) system has emerged to offer an integrated IT solution, and more and more enterprises are adopting this system and regarding it as an important innovation. However, there is already evidence of high failure risks in ERP project implementation; one major reason is poor analysis of the requirements for system implementation. In this paper, the importance of requirements analysis for ERP project implementation is highlighted, and a requirements analysis model applying quality function deployment (QFD) is presented, which supports conducting requirements analysis for ERP projects.

  14. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analysis of caches which are hard ...

  15. Web Based Distributed Coastal Image Analysis System Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web-based distributed image analysis system processing Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  16. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...

  17. A Model-Free Diagnosis Approach for Intake Leakage Detection and Characterization in Diesel Engines

    Directory of Open Access Journals (Sweden)

    Ghaleb Hoblos

    2015-07-01

    Full Text Available Feature selection is an essential step for data classification used in fault detection and diagnosis processes. In this work, a new approach is proposed which combines a feature selection algorithm and a neural network tool for leak detection and characterization tasks in diesel engine air paths. The Chi-square classifier is used as the feature selection algorithm, and a neural network based on Levenberg-Marquardt is used for system behavior modeling. The obtained neural network is used for leak detection and characterization. The model is learned and validated using data generated by xMOD, and this tool is used again for testing. The effectiveness of the proposed approach is illustrated in simulation when the system operates at low speed/load and the leak affecting the air path is very small.

  18. Decision making based on data analysis methods

    OpenAIRE

    Sirola, Miki; Sulkava, Mika

    2016-01-01

    This technical report is based on four of our recent articles: "Data fusion of pre-election gallups and polls for improved support estimates", "Analyzing parliamentary elections based on voting advice application data", "The Finnish car rejection reasons shown in an interactive SOM visualization tool", and "Network visualization of car inspection data using graph layout". Neural methods are applied in political and technical decision making. We introduce decision support schemes based on Self-Organizing...

  19. Model-free information-theoretic approach to infer leadership in pairs of zebrafish

    Science.gov (United States)

    Butail, Sachit; Mwaffo, Violet; Porfiri, Maurizio

    2016-04-01

    Collective behavior affords several advantages to fish in avoiding predators, foraging, mating, and swimming. Although fish schools have been traditionally considered egalitarian superorganisms, a number of empirical observations suggest the emergence of leadership in gregarious groups. Detecting and classifying leader-follower relationships is central to elucidate the behavioral and physiological causes of leadership and understand its consequences. Here, we demonstrate an information-theoretic approach to infer leadership from positional data of fish swimming. In this framework, we measure social interactions between fish pairs through the mathematical construct of transfer entropy, which quantifies the predictive power of a time series to anticipate another, possibly coupled, time series. We focus on the zebrafish model organism, which is rapidly emerging as a species of choice in preclinical research for its genetic similarity to humans and reduced neurobiological complexity with respect to mammals. To overcome experimental confounds and generate test data sets on which we can thoroughly assess our approach, we adapt and calibrate a data-driven stochastic model of zebrafish motion for the simulation of a coupled dynamical system of zebrafish pairs. In this synthetic data set, the extent and direction of the coupling between the fish are systematically varied across a wide parameter range to demonstrate the accuracy and reliability of transfer entropy in inferring leadership. Our approach is expected to aid in the analysis of collective behavior, providing a data-driven perspective to understand social interactions.

  20. Pathway-Based Functional Analysis of Metagenomes

    Science.gov (United States)

    Bercovici, Sivan; Sharon, Itai; Pinter, Ron Y.; Shlomi, Tomer

    Metagenomic data enable the study of microbes and viruses through their DNA as retrieved directly from the environment in which they live. Functional analysis of metagenomes explores the abundance of gene families, pathways, and systems, rather than their taxonomy. Through such analysis researchers are able to identify the functional capabilities most important to organisms in the examined environment. Recently, a statistical framework for the functional analysis of metagenomes was described that focuses on gene families. Here we describe two pathway-level computational models for functional analysis that take into account important, yet unaddressed issues such as pathway size, gene length and overlap in gene content among pathways. We test our models on carefully designed simulated data and propose novel approaches for performance evaluation. Our models significantly improve over the current approach with respect to pathway ranking and the computation of relative abundance of pathways in environments.

  1. Analysis of Task-based Syllabus

    Institute of Scientific and Technical Information of China (English)

    马进胜

    2011-01-01

    Task-based language teaching is very popular in modern English teaching. It is based on the Task-based Syllabus, which focuses on learners' communicative competence and stresses learning by doing. Starting from the theoretical assumptions and definitions of the task, the paper analyzes the components of the task and then points out the merits and demerits of the syllabus. By this means the paper may give some tips to teachers and students when they use task-based language teaching.

  2. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  3. Model-free Adaptive Variable Pitch Control for Wind Turbine System Constant Power

    Institute of Scientific and Technical Information of China (English)

    姜萍; 赵振家

    2015-01-01

    A wind turbine system is nonlinear and strongly coupled, and the disturbance from the wind is very strong. A model-free adaptive control based on dynamically estimating the system's characteristic value is proposed; by optimizing internal parameters it allows the system to cope with time variation and nonlinearity. Its disturbance rejection outperforms that of PID control, and it is better suited to nonlinear, strongly coupled systems. Finally, the control performance of the model-free controller and the PID controller is compared under step and random wind speeds in simulation; the results show that model-free control keeps the generator power constant better.

  4. Prostate cancer detection from model-free T1-weighted time series and diffusion imaging

    Science.gov (United States)

    Haq, Nandinee F.; Kozlowski, Piotr; Jones, Edward C.; Chang, Silvia D.; Goldenberg, S. Larry; Moradi, Mehdi

    2015-03-01

    The combination of Dynamic Contrast Enhanced (DCE) images with diffusion MRI has shown great potential in prostate cancer detection. The parameterization of DCE images to generate cancer markers is traditionally performed based on pharmacokinetic modeling. However, pharmacokinetic models make simplistic assumptions about the tissue perfusion process, require the knowledge of contrast agent concentration in a major artery, and the modeling process is sensitive to noise and fitting instabilities. We address this issue by extracting features directly from the DCE T1-weighted time course without modeling. In this work, we employed a set of data-driven features generated by mapping the DCE T1 time course to its principal component space, along with diffusion MRI features to detect prostate cancer. The optimal set of DCE features is extracted with sparse regularized regression through a Least Absolute Shrinkage and Selection Operator (LASSO) model. We show that when our proposed features are used within the multiparametric MRI protocol to replace the pharmacokinetic parameters, the area under ROC curve is 0.91 for peripheral zone classification and 0.87 for whole gland classification. We were able to correctly classify 32 out of 35 peripheral tumor areas identified in the data when the proposed features were used with support vector machine classification. The proposed feature set was used to generate cancer likelihood maps for the prostate gland.
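
    A rough sketch of the model-free feature route described above is given below: project the DCE T1 time courses onto principal components and let an L1-penalized regression pick the informative scores. The arrays are synthetic placeholders, and the LASSO settings are not those of the study.

```python
# Sketch of the data-driven DCE feature pipeline: PCA scores of the T1 time
# course, followed by LASSO-based selection of the useful components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n_voxels, n_timepoints = 300, 40
time_courses = rng.normal(size=(n_voxels, n_timepoints))   # DCE T1 signal per voxel (synthetic)
labels = rng.integers(0, 2, n_voxels)                      # 1 = tumour, 0 = normal (placeholder)

pc_scores = PCA(n_components=10).fit_transform(time_courses)
lasso = LassoCV(cv=5).fit(pc_scores, labels)
selected = np.flatnonzero(lasso.coef_)
print("principal-component features retained by LASSO:", selected)
```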

  5. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes.

    Directory of Open Access Journals (Sweden)

    Elena Daskalaki

    Full Text Available Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI.

  6. Model free audit methodology for bias evaluation of tumour progression in oncology.

    Science.gov (United States)

    Stone, Andrew; Macpherson, Euan; Smith, Ann; Jennison, Christopher

    2015-01-01

    Many oncology studies incorporate a blinded independent central review (BICR) to make an assessment of the integrity of the primary endpoint, progression free survival. Recently, it has been suggested that, in order to assess the potential for bias amongst investigators, a BICR amongst only a sample of patients could be performed; if evidence of bias is detected, according to a predefined threshold, the BICR is then assessed in all patients, otherwise, it is concluded that the sample was sufficient to rule out meaningful levels of bias. In this paper, we present an approach that adapts a method originally created for defining futility bounds in group sequential designs. The hazard ratio ratio, the ratio of the hazard ratio (HR) for the treatment effect estimated from the BICR to the corresponding HR for the investigator assessments, is used as the metric to define bias. The approach is simple to implement and ensures a high probability that a substantial true bias will be detected. In the absence of bias, there is a high probability of accepting the accuracy of local evaluations based on the sample, in which case an expensive BICR of all patients is avoided. The properties of the approach are demonstrated by retrospective application to a completed Phase III trial in colorectal cancer. The same approach could easily be adapted for other disease settings, and for test statistics other than the hazard ratio.

  7. FUZZY PRINCIPAL COMPONENT ANALYSIS AND ITS KERNEL BASED MODEL

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Principal Component Analysis (PCA) is one of the most important feature extraction methods, and Kernel Principal Component Analysis (KPCA) is a nonlinear extension of PCA based on kernel methods. In the real world, an input sample may not be fully assigned to one class and may partially belong to other classes. Based on the theory of fuzzy sets, this paper presents Fuzzy Principal Component Analysis (FPCA) and its nonlinear extension model, i.e., Kernel-based Fuzzy Principal Component Analysis (KFPCA). The experimental results indicate that the proposed algorithms have good performance.
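
    The standard (non-fuzzy) kernel PCA that the record extends can be sketched with scikit-learn as below. The two-circles data set simply illustrates why a nonlinear kernel is needed; the fuzzy variants FPCA/KFPCA proposed in the paper are not reproduced here.

```python
# Kernel PCA vs. linear PCA on a data set that linear projections cannot separate.
from sklearn.datasets import make_circles
from sklearn.decomposition import PCA, KernelPCA

X, _ = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)

linear_scores = PCA(n_components=2).fit_transform(X)
kernel_scores = KernelPCA(n_components=2, kernel="rbf", gamma=10.0).fit_transform(X)

# With the RBF kernel the two circles become separable along the first component,
# which is not possible with the linear projection.
print(f"variance of first linear PC: {linear_scores[:, 0].var():.3f}")
print(f"variance of first kernel PC: {kernel_scores[:, 0].var():.3f}")
```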

  8. Transect based analysis versus area based analysis to quantify shoreline displacement: spatial resolution issues.

    Science.gov (United States)

    Anfuso, Giorgio; Bowman, Dan; Danese, Chiara; Pranzini, Enzo

    2016-10-01

    Field surveys, aerial photographs, and satellite images are the most commonly employed sources of data to analyze shoreline position, which are further compared by area-based analysis (ABA) or transect-based analysis (TBA) methods. The former is performed by computing the mean shoreline displacement for the identified coastal segments, i.e., dividing the beach area variation by the segment length; the latter is based on measuring the distance between shorelines at set points along transects. The present study compares, by means of GIS tools, the ABA and TBA methods by computing shoreline displacements recorded on two stretches of the Tuscany coast (Italy): the beach of Punta Ala, a linear coast without shore protection structures, and the one at Follonica, which is irregular due to the presence of groins and detached breakwaters. Surveys were carried out using a differential global positioning system (DGPS) in RTK mode. For each site, a 4800-m-long coastal segment was analyzed and divided into ninety-six 50-m-long sectors for which changes were computed using both the ABA and TBA methods. Sectors were progressively joined to lengths of 100, 200, 400, and 800 m to examine how this influenced the results. ABA and TBA results are highly correlated for transect distances and sector lengths up to 100 m at both investigated locations. If longer transects are considered, the two methods still produce well-correlated data on the smooth shoreline (i.e., at Punta Ala), but the correlation becomes significantly lower on the irregular shoreline (i.e., at Follonica).
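
    The difference between the two measures can be illustrated with a toy pair of surveyed shorelines: TBA averages the offsets measured at fixed transects, while ABA divides the area enclosed between the two shorelines by the sector length. The synthetic cross-shore positions and the 800 m sector below are invented; real computations would run on DGPS coordinates in a GIS.

```python
# Toy comparison of transect-based (TBA) and area-based (ABA) shoreline displacement.
import numpy as np

alongshore = np.linspace(0.0, 800.0, 161)            # 5 m spacing along an 800 m sector
shore_t1 = 10.0 + 2.0 * np.sin(alongshore / 90.0)    # cross-shore position, survey 1 (m)
shore_t2 = shore_t1 + 3.0 + np.random.default_rng(0).normal(0, 0.5, alongshore.size)

# TBA: mean of the per-transect offsets.
tba = np.mean(shore_t2 - shore_t1)

# ABA: area enclosed between the two shorelines (trapezoidal rule) / sector length.
widths = shore_t2 - shore_t1
area = np.sum(0.5 * (widths[:-1] + widths[1:]) * np.diff(alongshore))
aba = area / 800.0
print(f"TBA displacement: {tba:.2f} m, ABA displacement: {aba:.2f} m")
```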

  9. Description-based and experience-based decisions: individual analysis

    Directory of Open Access Journals (Sweden)

    Andrey Kudryavtsev

    2012-05-01

    Full Text Available We analyze behavior in two basic classes of decision tasks: description-based and experience-based. In particular, we compare the prediction power of a number of decision learning models in both kinds of tasks. Unlike most previous studies, we focus on individual, rather than aggregate, behavioral characteristics. We carry out an experiment involving a battery of both description- and experience-based choices between two mixed binary prospects made by each of the participants, and employ a number of formal models for explaining and predicting participants' choices: Prospect theory (PT) (Kahneman and Tversky, 1979); the Expectancy-Valence model (EVL) (Busemeyer and Stout, 2002); and three combinations of these well-established models. We document that the PT and the EVL models are best for predicting people's decisions in description- and experience-based tasks, respectively, which is not surprising as these two models are designed specially for these kinds of tasks. Furthermore, we find that models involving linear weighting of gains and losses perform better in both kinds of tasks, from the point of view of generalizability and individual parameter consistency. We therefore conclude that, overall, when both prospects are mixed, the assumption of diminishing sensitivity does not improve the models' prediction power for individual decision-makers. Finally, for some of the models' parameters, we document consistency at the individual level between description- and experience-based tasks.
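
    As a reference point for the prospect-theory component mentioned above, its value and probability-weighting functions can be written down directly. The parameter values below are the commonly cited Tversky-Kahneman estimates, not the parameters fitted in this study, and the example prospect is invented.

```python
# Prospect-theory building blocks used when fitting description-based choices.
import numpy as np

def pt_value(x, alpha=0.88, lam=2.25):
    """Value function: concave for gains, convex and steeper for losses."""
    x = np.asarray(x, dtype=float)
    return np.where(x >= 0, np.abs(x)**alpha, -lam * np.abs(x)**alpha)

def pt_weight(p, gamma=0.61):
    """Inverse-S probability weighting function."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

# Subjective value of a mixed prospect: win 100 with p = 0.5, lose 50 otherwise.
prospect = pt_weight(0.5) * pt_value(100.0) + pt_weight(0.5) * pt_value(-50.0)
print(f"prospect-theory value: {float(prospect):.2f}")
```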

  10. Performance Analysis Based on Timing Simulation

    DEFF Research Database (Denmark)

    Nielsen, Christian Dalsgaard; Kishinevsky, Michael

    1994-01-01

    Determining the cycle time and a critical cycle is a fundamental problem in the analysis of concurrent systems. We solve this problem using timing simulation of an underlying Signal Graph (an extension of Marked Graphs). For a Signal Graph with n vertices and m arcs our algorithm has the polynomial time complexity O(b²m), where b is the number of vertices with initially marked in-arcs (typically b≪n). The algorithm has a clear semantic and a low descriptive complexity. We illustrate the use of the algorithm by applying it to performance analysis of asynchronous circuits.

  11. Pathway-based analysis tools for complex diseases: a review.

    Science.gov (United States)

    Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

    2014-10-01

    Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods-the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.

  12. Pathway-based Analysis Tools for Complex Diseases: A Review

    Directory of Open Access Journals (Sweden)

    Lv Jin

    2014-10-01

    Full Text Available Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique, which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications for pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which has rapidly accumulated in biomedical fields. This article is a comprehensive review of the pathway-based analysis methods—the powerful methods with the potential to uncover the biological depths of the complex diseases. The general concepts and procedures for the pathway-based analysis methods are introduced and then, a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases.

  13. Improving Cluster Analysis with Automatic Variable Selection Based on Trees

    Science.gov (United States)

    2014-12-01

    IMPROVING CLUSTER ANALYSIS WITH AUTOMATIC VARIABLE SELECTION BASED ON TREES, Master's thesis by Anton D. Orr, December 2014; thesis advisor: Samuel E. Buttrey. ... 2006 based on classification and regression trees to address problems with determining dissimilarity. Current algorithms do not simultaneously address

  14. Encounter-based worms: Analysis and Defense

    CERN Document Server

    Tanachaiwiwat, Sapon

    2007-01-01

    An encounter-based network is a frequently disconnected wireless ad-hoc network requiring immediate neighbors to store and forward aggregated data for information dissemination. Using traditional approaches such as gateways or firewalls for deterring worm propagation in encounter-based networks is inappropriate. We propose the worm interaction approach, which relies upon automated beneficial worm generation to alleviate the problems of worm propagation in such networks. To understand the dynamics of worm interactions and their performance, we mathematically model worm interactions based on major worm interaction factors, including worm interaction types, network characteristics, and node characteristics, using ordinary differential equations, and analyze their effects on our proposed metrics. We validate our proposed model using extensive synthetic and trace-driven simulations. We find that all worm interaction factors significantly affect the pattern of worm propagation. For example, immunization linearly decreases...

  15. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios with quantitative aspects to enable a precise analysis of security designs. Our framework enables evaluating the risks of an insider attack quantitatively. The framework first identifies an insider's intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using probabilistic model checking. We provide prototype tool support using Matlab for Bayesian networks and PRISM for the analysis of Markov decision processes, and validate the framework with case studies.

  16. Structural Analysis of Plate Based Tensegrity Structures

    DEFF Research Database (Denmark)

    Hald, Frederik; Kirkegaard, Poul Henning; Damkilde, Lars

    2013-01-01

    Plate tensegrity structures combine tension cables with a cross-laminated timber plate and can then form, e.g., a roof structure. The topology of plate tensegrity structures is investigated through a parametric investigation, and a method for determining the structure's pre-stresses is used. The parametric investigation is performed to determine a more optimized form of the plate-based tensegrity structure. Conclusions on the use of plate-based tensegrity in civil engineering and further research areas are discussed.

  17. Symbolic Analysis of OTRAs-Based Circuits

    Directory of Open Access Journals (Sweden)

    C. Sánchez-López

    2011-04-01

    Full Text Available A new nullor-based model to describe the behavior of Operational Transresistance Amplifiers (OTRAs) is introduced. The new model is composed of four nullors and three grounded resistors. As a consequence, standard nodal analysis can be applied to compute fully-symbolic small-signal characteristics of OTRA-based analog circuits, and the nullor-based OTRA model can be used in CAD tools. In this manner, the fully-symbolic transfer functions of several application circuits, such as filters and oscillators, can easily be approximated.

  18. Crime prevention: more evidence-based analysis.

    Science.gov (United States)

    Garrido Genovés, Vicente; Farrington, David P; Welsh, Brandon C

    2008-02-01

    This paper introduces a new section of Psicothema dedicated to the evidence-based approach to crime prevention. Along with an original sexual-offender-treatment programme implemented in Spain, this section presents four systematic reviews of important subjects in the criminological arena, such as sexual offender treatment, the well-known programme, the effectiveness of custodial versus non-custodial sanctions in reoffending and the fight against terrorism. We also highlight some of the focal points that scientists, practitioners and governments should take into account in order to support this evidence-based viewpoint of crime prevention.

  19. Thanatophoric dysplasia: case-based bioethical analysis

    Directory of Open Access Journals (Sweden)

    Edgar Abarca López

    2014-04-01

    Full Text Available This paper presents a case report of thanatophoric dysplasia diagnosed in the prenatal period using ultrasound standards. The course of the case pregnancy, birth process, and postnatal period is described. This report invites bioethical analysis using its principles, appealing to human dignity, diversity and otherness, particularly in the mother-child dyad and their family. An early diagnosis allows parental support as they face the course of this condition and its potentially fatal outcome.

  20. Movement Pattern Analysis Based on Sequence Signatures

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Chavoshi

    2015-09-01

    Full Text Available Increased affordability and deployment of advanced tracking technologies have led researchers from various domains to analyze the resulting spatio-temporal movement data sets for the purpose of knowledge discovery. Two different approaches can be considered in the analysis of moving objects: quantitative analysis and qualitative analysis. This research focuses on the latter and uses the qualitative trajectory calculus (QTC), a type of calculus that represents qualitative data on moving point objects (MPOs), and establishes a framework to analyze the relative movement of multiple MPOs. A visualization technique called sequence signature (SESI) is used, which makes it possible to map QTC patterns into a 2D indexed rasterized space in order to evaluate the similarity of relative movement patterns of multiple MPOs. The applicability of the proposed methodology is illustrated by means of two practical examples of interacting MPOs: cars on a highway and body parts of a samba dancer. The results show that the proposed method can be effectively used to analyze interactions of multiple MPOs in different domains.

  1. The robustness of model-free adaptive control with disturbance suppression

    Institute of Scientific and Technical Information of China (English)

    卜旭辉; 侯忠生; 金尚泰

    2011-01-01

    For a class of single-input single-output (SISO) nonlinear discrete-time systems with measurement disturbance, the effect of the disturbance on model-free adaptive control is considered. The relationship between the output error and the measurement disturbance is given, and the influence of the measurement disturbance on control performance is analyzed. Then, an improved model-free adaptive control algorithm with a filter is proposed, which suppresses the measurement disturbance more effectively than the conventional model-free adaptive control algorithm. Simulation results verify the correctness of the theoretical analysis.

  2. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.

    2008-01-01

    This review with 239 references has as its aim to give the reader an introduction to the kinds of methods used for developing microchip based electrode systems as well as to cover the existing literature on electroanalytical systems where microchips play a crucial role for 'nondestructive...

  3. Analysis of Cloud-Based Database Systems

    Science.gov (United States)

    2015-06-01

    Submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Science. ... The average time for a query to complete on the production system was 136,746 microseconds; on our cloud-based system, the average was 198,875 microseconds.

  4. Web-Based Statistical Sampling and Analysis

    Science.gov (United States)

    Quinn, Anne; Larson, Karen

    2016-01-01

    Consistent with the Common Core State Standards for Mathematics (CCSSI 2010), the authors write that they have asked students to do statistics projects with real data. To obtain real data, their students use the free Web-based app, Census at School, created by the American Statistical Association (ASA) to help promote civic awareness among school…

  5. Model-free adaptive control with dead zone

    Institute of Scientific and Technical Information of China (English)

    顾志国; 孙阳阳; 钟祎勍; 姚国鹏

    2015-01-01

    According to such characteristics of the controlled object as large inertia, large delay and nonlinearity, an MFAC-PID cascade control system with dead zone was designed, building on the strong disturbance rejection of PID control and the strong adaptability of model-free adaptive control (MFAC) to inertia and delay systems. Taking the selective non-catalytic reduction (SNCR) flue gas denitration system in Midong Thermal Power Plant as the object, a system simulation model was established and MATLAB was used to compare MFAC-PID with PID control. The results show that, under step disturbance and model mismatch, the MFAC-PID cascade control system with dead zone has smaller overshoot and shorter settling time than the conventional PID cascade control system; under unit load disturbance, it fluctuates very little. Therefore, the MFAC-PID cascade control system with dead zone adapts better to changes of the object parameters and has strong robustness and adaptability.

  6. An SQL-based approach to physics analysis

    Science.gov (United States)

    Limper, Maaike, Dr

    2014-06-01

    As part of the CERN openlab collaboration a study was made into the possibility of performing analysis of the data collected by the experiments at the Large Hadron Collider (LHC) through SQL-queries on data stored in a relational database. Currently LHC physics analysis is done using data stored in centrally produced "ROOT-ntuple" files that are distributed through the LHC computing grid. The SQL-based approach to LHC physics analysis presented in this paper allows calculations in the analysis to be done at the database and can make use of the database's in-built parallelism features. Using this approach it was possible to reproduce results for several physics analysis benchmarks. The study shows the capability of the database to handle complex analysis tasks but also illustrates the limits of using row-based storage for storing physics analysis data, as performance was limited by the I/O read speed of the system.
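
    To make the SQL-on-rows idea tangible, here is a toy selection-and-aggregation query of the kind a cut-based analysis would express in the database. The table, columns and values are invented and bear no relation to the actual CERN openlab schema or the Oracle setup used in the study.

```python
# Illustration of expressing a physics cut-and-aggregate step in SQL;
# schema and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE muons (event_id INTEGER, pt REAL, eta REAL)")
conn.executemany(
    "INSERT INTO muons VALUES (?, ?, ?)",
    [(1, 28.4, 0.3), (1, 35.1, -1.2), (2, 12.0, 2.1), (3, 41.7, 0.8)],
)

# Select events with at least two muons above a 20 GeV transverse-momentum cut.
query = """
    SELECT event_id, COUNT(*) AS n_muons, MAX(pt) AS leading_pt
    FROM muons
    WHERE pt > 20.0
    GROUP BY event_id
    HAVING COUNT(*) >= 2
"""
for row in conn.execute(query):
    print(row)
```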

  7. Desiccant-Based Preconditioning Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J.

    2001-01-11

    A number of important conclusions can be drawn as a result of this broad, first-phase market evaluation. The more important conclusions include the following: (1) A very significant market opportunity will exist for specialized outdoor air-handling units (SOAHUs) as more construction and renovation projects are designed to incorporate the recommendations made by the ASHRAE 62-1989 standard. Based on this investigation, the total potential market is currently $725,000,000 annually (see Table 6, Sect. 3). Based on the market evaluations completed, it is estimated that approximately $398,000,000 (55%) of this total market could be served by DBC systems if they were made cost-effective through mass production. Approximately $306,000,000 (42%) of the total can be served by a non-regenerated, desiccant-based total recovery approach, based on the information provided by this investigation. Approximately $92,000,000 (13%) can be served by a regenerated desiccant-based cooling approach (see Table 7, Sect. 3). (2) A projection of the market selling price of various desiccant-based SOAHU systems was prepared using prices provided by Trane for central-station, air-handling modules currently manufactured. The wheel-component pricing was added to these components by SEMCO. This resulted in projected pricing for these systems that is significantly less than that currently offered by custom suppliers (see Table 4, Sect. 2). Estimated payback periods for all SOAHU approaches were quite short when compared with conventional over-cooling and reheat systems. Actual paybacks may vary significantly depending on site-specific considerations. (3) In comparing cost vs benefit of each SOAHU approach, it is critical that the total system design be evaluated. For example, the cost premium of a DBC system is very significant when compared to a conventional air handling system, yet the reduced chiller, boiler, cooling tower, and other expense often equals or exceeds this premium, resulting in a

  8. Google glass based immunochromatographic diagnostic test analysis

    Science.gov (United States)

    Feng, Steve; Caire, Romain; Cortazar, Bingen; Turan, Mehmet; Wong, Andrew; Ozcan, Aydogan

    2015-03-01

    Integration of optical imagers and sensors into recently emerging wearable computational devices allows for simpler and more intuitive methods of integrating biomedical imaging and medical diagnostics tasks into existing infrastructures. Here we demonstrate the ability of one such device, the Google Glass, to perform qualitative and quantitative analysis of immunochromatographic rapid diagnostic tests (RDTs) using a voice-commandable hands-free software-only interface, as an alternative to larger and more bulky desktop or handheld units. Using the built-in camera of Glass to image one or more RDTs (labeled with Quick Response (QR) codes), our Glass software application uploads the captured image and related information (e.g., user name, GPS, etc.) to our servers for remote analysis and storage. After digital analysis of the RDT images, the results are transmitted back to the originating Glass device, and made available through a website in geospatial and tabular representations. We tested this system on qualitative human immunodeficiency virus (HIV) and quantitative prostate-specific antigen (PSA) RDTs. For qualitative HIV tests, we demonstrate successful detection and labeling (i.e., yes/no decisions) for up to 6-fold dilution of HIV samples. For quantitative measurements, we activated and imaged PSA concentrations ranging from 0 to 200 ng/mL and generated calibration curves relating the RDT line intensity values to PSA concentration. By providing automated digitization of both qualitative and quantitative test results, this wearable colorimetric diagnostic test reader platform on Google Glass can reduce operator errors caused by poor training, provide real-time spatiotemporal mapping of test results, and assist with remote monitoring of various biomedical conditions.

  9. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  10. Discontinuous deformation analysis based on complementary theory

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The contact between blocks is treated by the open-close iteration in the conventional discontinuous deformation analysis (DDA), which needs to introduce spurious springs between two blocks in contact and to assume the normal stiffness and the tangential stiffness (the penalty factors). Unreasonable values of stiffness would result in numerical problems. To avoid the penalty factors and the open-close iteration, we reformulate the DDA as a mixed complementarity problem (MiCP) and then choose the path Newton method (PNM) to solve the problem. Some examples, including those originally designed by Shi, are reanalyzed, which proves the feasibility of the proposed procedure.

  11. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoehndorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework to describe phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes are now applied in biomedical research.

  12. Dynamic Analysis of Multilayers Based MEMS Resonators

    Directory of Open Access Journals (Sweden)

    Hassen M. Ouakad

    2017-01-01

    Full Text Available The dynamic behavior of a microelectromechanical system (MEMS) resonator based on two parallel, electrically coupled microbeam layers is investigated. Two numerical methods were used to solve the dynamical problem: reduced-order modeling (ROM) and the perturbation method. The ROM was derived using the so-called Galerkin expansion, taking the linear undamped mode shapes of a straight beam as the basis functions. The perturbation solution was generated using the method of multiple scales by direct attack of the equations of motion. Dynamic analyses using the above two methods were performed, and a comparison of the results showed good agreement. Finally, a parametric study was performed with the perturbation method on different parameters, and the results revealed several interesting features, which hopefully can be useful for some MEMS-based applications.

  13. Nondestructive Damage Detection Based on Modal Analysis

    Directory of Open Access Journals (Sweden)

    T. Plachý

    2004-01-01

    Full Text Available Three studies of damage identification and localization based on methods using experimentally estimated modal characteristics are presented. The results of an experimental investigation of simple structural elements (three RC beams and three RC slabs) obtained in the laboratory are compared with the results obtained in situ on a real structure (a composite bridge – a concrete deck supported by steel girders).

  14. Risk-Based Explosive Safety Analysis

    Science.gov (United States)

    2016-11-30

    Risk-based analyses can also be used for risk management purposes and comparative studies when evaluating test programs that utilize energetic liquids or propellants.

  15. Model-based methods for linkage analysis.

    Science.gov (United States)

    Rice, John P; Saccone, Nancy L; Corbett, Jonathan

    2008-01-01

    The logarithm of an odds ratio (LOD) score method originated in a seminal article by Newton Morton in 1955. The method is broadly concerned with issues of power and the posterior probability of linkage, ensuring that a reported linkage has a high probability of being a true linkage. In addition, the method is sequential so that pedigrees or LOD curves may be combined from published reports to pool data for analysis. This approach has been remarkably successful for 50 years in identifying disease genes for Mendelian disorders. After discussing these issues, we consider the situation for complex disorders where the maximum LOD score statistic shares some of the advantages of the traditional LOD score approach, but is limited by unknown power and the lack of sharing of the primary data needed to optimally combine analytic results. We may still learn from the LOD score method as we explore new methods in molecular biology and genetic analysis to utilize the complete human DNA sequence and the cataloging of all human genes.
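
    As a minimal illustration of the two-point LOD score idea discussed above, the sketch below computes LOD(theta) = log10[L(theta)/L(0.5)] for phase-known, fully informative meioses with counted recombinants and non-recombinants. This is an assumption-laden toy case; real pedigree likelihoods require substantially more machinery.

```python
# Minimal two-point LOD score sketch for phase-known, fully informative meioses:
# L(theta) = theta^R * (1 - theta)^NR for R recombinants and NR non-recombinants.
import numpy as np

def lod(theta, n_recomb, n_nonrecomb):
    l_theta = theta ** n_recomb * (1.0 - theta) ** n_nonrecomb
    l_null = 0.5 ** (n_recomb + n_nonrecomb)
    return np.log10(l_theta / l_null)

thetas = np.linspace(0.01, 0.5, 50)
scores = [lod(t, n_recomb=2, n_nonrecomb=18) for t in thetas]
best = thetas[int(np.argmax(scores))]
print(f"max LOD = {max(scores):.2f} at theta = {best:.2f}")  # LOD > 3 suggests linkage
```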

  16. Simulation of Model-free Adaptive Control with Strict Format in Linear Motor

    Institute of Scientific and Technical Information of China (English)

    李萍; 赵铭扬

    2014-01-01

    The model-free adaptive control (MFAC) approach for nonlinear systems based on tight-format (compact-form) dynamic linearization is applied to the motion control of a linear motor. A tight-format dynamic linear time-varying model replaces the nonlinear system model of the linear motor, and the pseudo partial derivative is estimated online from the input and output data of the motor's motion model. Simulation results show that the tight-format model-free controller provides strong adaptability, disturbance rejection, stability and robustness for the motor, a nonlinear system whose dynamics are only vaguely known.
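
    The compact-form (tight-format) dynamic-linearization MFAC scheme referred to above can be sketched as follows. The plant is an invented toy nonlinear system standing in for the linear motor, and all estimator and controller constants are assumed values, so this illustrates only the pseudo-partial-derivative estimation and control law, not the authors' implementation.

```python
import numpy as np

eta, mu = 0.8, 1.0      # PPD estimator step size and weight (assumed)
rho, lam = 0.6, 1.0     # controller step size and weight (assumed)
eps, phi0 = 1e-5, 1.0   # reset threshold and initial PPD estimate

def plant(y_prev, u_prev):
    """Invented toy nonlinear plant standing in for the linear motor."""
    return 0.6 * y_prev + 0.4 * u_prev + 0.1 * np.sin(y_prev)

N = 200
y = np.zeros(N + 1)
u = np.zeros(N)
phi = phi0
y_ref = np.where(np.arange(N + 1) < 100, 1.0, 0.5)   # step reference trajectory

for k in range(1, N):
    du = u[k - 1] - (u[k - 2] if k >= 2 else 0.0)
    dy = y[k] - y[k - 1]
    # Online pseudo-partial-derivative estimation from input/output increments.
    phi = phi + eta * du / (mu + du ** 2) * (dy - phi * du)
    if abs(phi) < eps or abs(du) < eps or np.sign(phi) != np.sign(phi0):
        phi = phi0   # standard reset keeps the estimate well-posed
    # Tight-format MFAC control law driven by the one-step tracking error.
    u[k] = u[k - 1] + rho * phi / (lam + phi ** 2) * (y_ref[k + 1] - y[k])
    y[k + 1] = plant(y[k], u[k])

print("final tracking error:", round(abs(float(y_ref[-1] - y[-1])), 4))
```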

  17. Decadal shifts of East Asian summer monsoon in a climate model free of explicit GHGs and aerosols

    Science.gov (United States)

    Lin, Renping; Zhu, Jiang; Zheng, Fei

    2016-12-01

    The East Asian summer monsoon (EASM) experienced decadal transitions over the past few decades, and the associated "wetter-South-drier-North" shifts in rainfall patterns over China significantly affected the country's social and economic development. Two viewpoints stand out to explain these decadal shifts, regarding them as either a result of internal variability of the climate system or a response to external forcings (e.g. greenhouse gases (GHGs) and anthropogenic aerosols). However, most climate models, for example the Atmospheric Model Intercomparison Project (AMIP)-type simulations and the Coupled Model Intercomparison Project (CMIP)-type simulations, fail to simulate the variation patterns, leaving the mechanisms responsible for these shifts open to dispute. In this study, we conducted a successful simulation of these decadal transitions in a coupled model by applying ocean data assimilation in a model free of explicit aerosol and GHG forcing. The associated decadal shifts of the three-dimensional spatial structure in the 1990s, including the eastward retreat and northward shift of the western Pacific subtropical high (WPSH) and the south-cool-north-warm pattern of upper-tropospheric temperature, were all well captured. Our simulation supports the argument that variations of the oceanic fields are the dominant factor responsible for the EASM decadal transitions.

  18. Face Recognition Based on Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Ali Javed

    2013-02-01

    Full Text Available The purpose of the proposed research work is to develop a computer system that can recognize a person by comparing the characteristics of the face to those of known individuals. The main focus is on frontal two-dimensional images taken in a controlled environment, i.e. with constant illumination and background. Other methods of identification and verification, such as iris or fingerprint scans, require high-quality and costly equipment, whereas face recognition requires only a normal camera providing a 2-D frontal image of the person to be recognized. The Principal Component Analysis technique has been used in the proposed face recognition system. The purpose is to compare the results of the technique under different conditions and to find the most efficient approach for developing a facial recognition system.
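
    A minimal eigenfaces-style sketch of the PCA-based recognition pipeline described above is given below. The random arrays merely stand in for flattened grayscale frontal face images, so the reported accuracy is meaningless; the point is the projection onto principal components followed by nearest-neighbour matching.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_people, imgs_per_person, img_size = 10, 8, 32 * 32
X = rng.normal(size=(n_people * imgs_per_person, img_size))   # placeholder "images"
y = np.repeat(np.arange(n_people), imgs_per_person)           # person labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Project faces onto the leading principal components ("eigenfaces") and
# classify a probe face by its nearest neighbour in that subspace.
pca = PCA(n_components=20, whiten=True).fit(X_train)
clf = KNeighborsClassifier(n_neighbors=1).fit(pca.transform(X_train), y_train)
print("accuracy (meaningless on random placeholders):",
      clf.score(pca.transform(X_test), y_test))
```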

  19. Mental EEG Analysis Based on Infomax Algorithm

    Institute of Scientific and Technical Information of China (English)

    WU Xiao-pei; GUO Xiao-jing; ZANG Dao-xin; SHEN Qian

    2004-01-01

    The patterns of EEG change with the mental tasks performed by the subject. In the field of EEG signal analysis and application, extracting the patterns of mental EEG and then using them to classify mental tasks has significant scientific meaning and great application value. However, because of the various artifacts present in EEG, pattern detection of EEG under normal mental states is a very difficult problem. In this paper, Independent Component Analysis is applied to EEG signals collected while subjects performed different mental tasks. The experimental results show that when a subject performs a single mental task in different trials, the independent components of the EEG are very similar. This means that the independent components can be used as mental EEG patterns to classify the different mental tasks.
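
    The paper above uses an Infomax-based decomposition; as a hedged stand-in, the sketch below uses scikit-learn's FastICA to unmix synthetic "EEG channels" built as mixtures of independent sources, illustrating how independent components would be extracted from multichannel recordings.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
t = np.linspace(0, 8, 2000)
sources = np.c_[np.sin(7 * t),               # oscillatory "rhythm"
                np.sign(np.sin(3 * t)),       # square-wave artifact
                rng.laplace(size=t.size)]     # noise-like component
mixing = rng.normal(size=(5, 3))              # 5 scalp channels, 3 sources
eeg = sources @ mixing.T                      # shape (n_samples, n_channels)

ica = FastICA(n_components=3, random_state=0)
components = ica.fit_transform(eeg)           # estimated independent components
print(components.shape)                       # (2000, 3)
```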

  20. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have recently been applied to analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
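
    The wavelet side of the approach above can be sketched as follows: decompose one traffic feature signal with a discrete wavelet transform and flag time windows whose detail-coefficient energy deviates strongly from the median. The signal, window size and threshold are invented, and the system-identification stage of the paper is not reproduced.

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)
traffic = rng.poisson(100, size=4096).astype(float)   # e.g. flows per second
traffic[2500:2560] += 400                              # injected burst (stand-in attack)

window = 256
scores = []
for start in range(0, traffic.size, window):
    seg = traffic[start:start + window]
    coeffs = pywt.wavedec(seg, "db4", level=3)
    detail_energy = sum(float(np.sum(d ** 2)) for d in coeffs[1:])
    scores.append(detail_energy)

scores = np.array(scores)
threshold = np.median(scores) + 5 * np.median(np.abs(scores - np.median(scores)))
print("anomalous windows:", np.flatnonzero(scores > threshold))
```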

  1. Wavelet Based Fractal Analysis of Airborne Pollen

    CERN Document Server

    Degaudenzi, M E

    1999-01-01

    The most abundant biological particles in the atmosphere are pollen grains and spores. Self-protection against pollen allergy is possible through information about future pollen content in the air. In spite of the importance of airborne pollen concentration forecasting, it has not been possible to predict pollen concentrations with great accuracy, and about 25% of daily pollen forecasts have resulted in failures. Previous analysis of the dynamic characteristics of atmospheric pollen time series indicates that the system can be described by a low-dimensional chaotic map. We apply the wavelet transform to study the multifractal characteristics of an airborne pollen time series. We find persistence behaviour associated with low pollen concentration values and with the rarest events of highest pollen concentration values. The information and correlation dimensions correspond to a chaotic system showing loss of information with time evolution.

  2. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  3. LEARNING DIFFICULTIES: AN ANALYSIS BASED ON VIGOTSKY

    Directory of Open Access Journals (Sweden)

    Adriane Cenci

    2010-06-01

    Full Text Available Throughout this text, we aim to reflect on learning difficulties from the perspective of Socio-Historical Theory, relating what is observed in schools to what has been discussed about learning difficulties and to the theory proposed by Vygotsky in the early twentieth century. We understand that children enter school carrying experiences and knowledge from their cultural group, and that school very often ignores such knowledge. It is in this disengagement that what we have come to call learning difficulties emerges. One cannot forget to see the child as a whole: a student is a social being constituted by culture, language and specific values to which one must be attentive.

  4. Building Extraction from LIDAR Based Semantic Analysis

    Institute of Scientific and Technical Information of China (English)

    YU Jie; YANG Haiquan; TAN Ming; ZHANG Guoning

    2006-01-01

    Extraction of buildings from LIDAR data has been an active research field in recent years. A scheme for building detection and reconstruction from LIDAR data is presented with an object-oriented method which is based on the buildings' semantic rules. Two key steps are discussed: how to group the discrete LIDAR points into single objects and how to establish the buildings' semantic rules. In the end, the buildings are reconstructed in 3D form and three common parametric building models (flat, gabled, hipped) are implemented.

  5. Advanced overlay analysis through design based metrology

    Science.gov (United States)

    Ji, Sunkeun; Yoo, Gyun; Jo, Gyoyeon; Kang, Hyunwoo; Park, Minwoo; Kim, Jungchan; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Maruyama, Kotaro; Park, Byungjun; Yamamoto, Masahiro

    2015-03-01

    As design rules shrink, overlay has become a critical factor in semiconductor manufacturing. However, the overlay error determined by a conventional measurement with an overlay mark based on IBO or DBO often does not represent the physical placement error in the cell area. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion caused by etching or CMP can also be a source of the mismatch. In 2014, we demonstrated that measuring overlay in the cell area with a DBM (Design Based Metrology) tool gives more accurate overlay values than the conventional method using an overlay mark. We verified the reproducibility by measuring repeatable patterns in the cell area, and also demonstrated the reliability by comparing with CD-SEM data. Until now we have focused on overlay mismatch between the overlay mark and the cell area; here we are further concerned with cell areas having different pattern density and etch loading. A phenomenon appears in which overlay values differ across cells with diverse patterning environments. In this paper, the overlay error is investigated from cell edge to cell center. For this experiment, we examined several critical layers in DRAM using an improved (better resolution and speed) DBM tool, the NGR3520.

  6. The Route Analysis Based On Flight Plan

    Science.gov (United States)

    Feriyanto, Nur; Saleh, Chairul; Fauzi, Achmad; Rachman Dzakiyullah, Nur; Riza Iwaputra, Kahfi

    2016-02-01

    Economic development affects the use of air transportation, since business activity has increased in every aspect. Many people these days prefer traveling by airplane because it saves time and money. This situation also affects flight routes, and many airlines offer new routes to deal with competition. Managing flight routes is one of the problems that must be faced in order to find efficient and effective routes. This paper investigates the best routes based on flight performance by determining the amount of block fuel for the Jakarta-Denpasar flight route. Moreover, this work compares two kinds of aircraft and two tracks by calculating flight distance, flight time and block fuel. The results show that Track II of the Jakarta-Denpasar route has the most effective and efficient block fuel, achieved with the Airbus 320-200 aircraft. This study can contribute to practice by supporting effective decisions, especially helping a company's executive management select the appropriate aircraft and track in the flight plan based on block fuel consumption for business operation.

  7. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, Markov Random Fields (MRF), watershed segmentation and merging techniques is presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmented result is obtained based on K-means clustering and the minimum distance. The region process is then modeled by an MRF to obtain an image that contains regions of different intensity. The gradient values are calculated and the watershed technique is applied. The DIS value is calculated for each pixel to define all the edges (weak or strong) in the image, yielding the DIS map. This serves as prior knowledge about possible region boundaries for the next step (MRF), which gives an image that contains all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the MRF-segmented image are applied and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.
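
    A minimal sketch of the gradient/watershed stage of the pipeline described above, using scikit-image on a synthetic two-region image, is shown below. The DIS map, the K-means initialisation and the MRF relaxation of the paper are omitted; only the marker-based flooding of an edge-strength map is illustrated.

```python
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

rng = np.random.default_rng(3)
image = np.full((128, 128), 50.0)
image[32:96, 32:96] = 180.0                        # bright square region
image += rng.normal(scale=5.0, size=image.shape)   # mild noise

gradient = sobel(image)                            # edge-strength map

# Seed markers from confident foreground/background intensities, then flood
# the gradient surface so region boundaries settle on strong edges.
markers = np.zeros_like(image, dtype=np.int32)
markers[image < 80] = 1
markers[image > 150] = 2
labels = watershed(gradient, markers)
print("region sizes:", np.bincount(labels.ravel())[1:])
```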

  8. Choice-Based Conjoint Analysis: Classification vs. Discrete Choice Models

    Science.gov (United States)

    Giesen, Joachim; Mueller, Klaus; Taneva, Bilyana; Zolliker, Peter

    Conjoint analysis is a family of techniques that originated in psychology and later became popular in market research. The main objective of conjoint analysis is to measure an individual's or a population's preferences on a class of options that can be described by parameters and their levels. We consider preference data obtained in choice-based conjoint analysis studies, where one observes test persons' choices on small subsets of the options. There are many ways to analyze choice-based conjoint analysis data. Here we discuss the intuition behind a classification based approach, and compare this approach to one based on statistical assumptions (discrete choice models) and to a regression approach. Our comparison on real and synthetic data indicates that the classification approach outperforms the discrete choice models.

  9. Adding trend data to Depletion-Based Stock Reduction Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A Bayesian model of Depletion-Based Stock Reduction Analysis (DB-SRA), informed by a time series of abundance indexes, was developed, using the Sampling Importance...

  10. Method for detecting software anomalies based on recurrence plot analysis

    OpenAIRE

    Michał Mosdorf

    2012-01-01

    This paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate - RR or determinism - DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).

  11. Method for detecting software anomalies based on recurrence plot analysis

    Directory of Open Access Journals (Sweden)

    Michał Mosdorf

    2012-03-01

    Full Text Available This paper evaluates a method for detecting software anomalies based on recurrence plot analysis of the trace log generated by software execution. The described method is based on windowed recurrence quantification analysis for selected measures (e.g. recurrence rate - RR or determinism - DET). Initial results show that the proposed method is useful in detecting silent software anomalies that do not result in typical crashes (e.g. exceptions).
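
    The windowed recurrence quantification idea above can be sketched as follows: build a recurrence matrix for each window of a (here synthetic) trace-derived signal and report the recurrence rate (RR). The embedding dimension, radius and window size are assumed, and determinism (DET) and the trace-log preprocessing are not reproduced.

```python
import numpy as np

def recurrence_rate(x, dim=3, delay=1, radius=0.2):
    """RR of a time-delay-embedded scalar series (radius scaled by the signal std)."""
    n = x.size - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return float((dists < radius * np.std(x)).mean())

rng = np.random.default_rng(4)
signal = np.sin(np.linspace(0, 60, 3000)) + 0.05 * rng.normal(size=3000)
signal[2000:2200] = rng.normal(size=200)           # injected anomalous segment

window = 300
rr = [recurrence_rate(signal[s:s + window]) for s in range(0, signal.size, window)]
print(np.round(rr, 3))   # windows covering the anomaly typically show a lower RR
```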

  12. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant analysis based on tree-structured graphical models is a simple nonlinear method competitive with, and sometimes superior to, other well-known linear methods like those assuming mutual independence between variables and linear logistic regression.

  13. Sentiment Analysis of Document Based on Annotation

    CERN Document Server

    Shukla, Archana

    2011-01-01

    I present a tool which assesses the quality or usefulness of a document based on its annotations. Annotations may include comments, notes, observations, highlights, underlining, explanations, questions, requests for help, etc. Comments are used for evaluative purposes, while the others are used for summarization or expansion. Furthermore, these comments may be attached to another annotation; such annotations are referred to as meta-annotations. Not all annotations receive equal weight. My tool considers highlights and underlining as well as comments to infer the collective sentiment of the annotators, which is classified as positive, negative or objective. The tool computes the collective sentiment of annotations in two ways: it counts all the annotations present on the document, and it also computes sentiment scores of all annotations, including comments, to obtain the collective sentiment about the document and to judge its quality. I demonstrate the use of the tool on a research paper.

  14. Analysis of Vehicle-Based Security Operations

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Jason M [ORNL]; Paul, Nate R [ORNL]

    2015-01-01

    Vehicle-to-vehicle (V2V) communications promises to increase roadway safety by providing each vehicle with 360 degree situational awareness of other vehicles in proximity, and by complementing onboard sensors such as radar or camera in detecting imminent crash scenarios. In the United States, approximately three hundred million automobiles could participate in a fully deployed V2V system if Dedicated Short-Range Communication (DSRC) device use becomes mandatory. The system's reliance on continuous communication, however, provides a potential means for unscrupulous persons to transmit false data in an attempt to cause crashes, create traffic congestion, or simply render the system useless. V2V communications must be highly scalable while retaining robust security and privacy preserving features to meet the intra-vehicle and vehicle-to-infrastructure communication requirements for a growing vehicle population. Oak Ridge National Laboratory is investigating a Vehicle-Based Security System (VBSS) to provide security and privacy for a fully deployed V2V and V2I system. In the VBSS, an On-board Unit (OBU) generates short-term certificates and signs Basic Safety Messages (BSM) to preserve privacy and enhance security. This work outlines a potential VBSS structure and its operational concepts; it examines how a vehicle-based system might feasibly provide security and privacy, highlights remaining challenges, and explores potential mitigations to address those challenges. Certificate management alternatives that attempt to meet V2V security and privacy requirements have been examined previously by the research community, including privacy-preserving group certificates, shared certificates, and functional encryption. Due to real-world operational constraints, adopting one of these approaches for VBSS V2V communication is difficult. Timely misbehavior detection and revocation are still open problems for any V2V system. We explore the alternative approaches that may be

  15. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  16. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  17. PYTHON-based Physics Analysis Environment for LHCb

    CERN Document Server

    Belyaev, I; Mato, P; Barrand, G; Tsaregorodtsev, A; de Oliveira, E

    2004-01-01

    BENDER is the PYTHON based physics analysis application for LHCb. It combines the best features of the underlying GAUDI software architecture with the flexibility of the PYTHON scripting language and provides end-users with a friendly physics analysis oriented environment.

  18. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.; Queiroz Feitosa, R.; van der Meer, F.D.; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction.

  19. Web-Based Trainer for Electrical Circuit Analysis

    Science.gov (United States)

    Weyten, L.; Rombouts, P.; De Maeyer, J.

    2009-01-01

    A Web-based system for training electric circuit analysis is presented in this paper. It is centered on symbolic analysis techniques and it not only verifies the student's final answer, but it also tracks and coaches him/her through all steps of his/her reasoning path. The system mimics homework assignments, enhanced by immediate personalized…

  20. Electric Submersible Motor Modeling and Model Free Adaptive Control Applications

    Institute of Scientific and Technical Information of China (English)

    王通; 赖浩喆; 高宪文

    2015-01-01

    Electric submersible motor systems based on open-loop V/F (constant volts-per-hertz) control are widely used. As traditional extraction methods shift toward refined production, the existing production systems need to be upgraded to meet new technical requirements. A mathematical model of the submersible motor based on constant rotor-flux (rotor-field-oriented) control is established and combined with the model-free adaptive control method to build a closed-loop system, achieving high-performance control of the submersible motor's constant V/F speed-regulation system. Without replacing the existing equipment, this reduces the impact of stator and rotor resistance varying with temperature on system performance. The experimental results demonstrate the feasibility of the model-free control method, showing that the closed-loop system is controlled stably and that speed-regulation performance is greatly improved.

  1. Surveillance data bases, analysis, and standardization program

    Energy Technology Data Exchange (ETDEWEB)

    Kam, F.B.K.

    1990-09-26

    The traveler presented a paper at the Seventh ASTM-EURATOM Symposium on Reactor Dosimetry and co-chaired an oral session on Computer Codes and Methods. Papers of considerable interest to the NRC Surveillance Dosimetry Program involved statistically based adjustment procedures and uncertainties. The information exchange meetings with Czechoslovakia and Hungary were very enlightening. Lack of large computers has hindered their surveillance programs. They depended very heavily on information from their measurement programs, which were somewhat limited because of the lack of sophisticated electronics. The Nuclear Research Institute at Rez had to rely on expensive mockups of power reactor configurations to test their fluence exposures. Computers, computer codes, and updated nuclear data would advance their technology rapidly, and they were not hesitant to admit this fact. Both eastern-bloc countries said that IBM is providing an IBM 3090 for educational purposes but that research and development studies would have very limited access. They were very apologetic that their currencies are not convertible; any exchange means that they could provide services or pay for US scientists in their respective countries, but funding for their scientists in the United States, or expenses that involve payment in dollars, must come from us.

  2. Preprocessing and Analysis of LC-MS-Based Proteomic Data.

    Science.gov (United States)

    Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W

    2016-01-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS based proteomics. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from LC-MS based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed.

  3. An Evidence-Based Videotaped Running Biomechanics Analysis.

    Science.gov (United States)

    Souza, Richard B

    2016-02-01

    Running biomechanics play an important role in the development of injuries. Performing a running biomechanics analysis on injured runners can help to develop treatment strategies. This article provides a framework for a systematic video-based running biomechanics analysis plan based on the current evidence on running injuries, using 2-dimensional (2D) video and readily available tools. Fourteen measurements are proposed in this analysis plan from lateral and posterior video. Identifying simple 2D surrogates for 3D biomechanical variables of interest allows for widespread translation of best practices and offers the best opportunity to impact the highly prevalent problem of the injured runner.

  4. A Contrastive Analysis on Web-based Intercultural Peer Feedback

    Directory of Open Access Journals (Sweden)

    Qin Wang

    2013-11-01

    Full Text Available This paper presents a contrastive analysis of peer feedback generated by Chinese EFL learners and native speakers of English who participated in a web-based Cross-Pacific Writing Exchange program. The analysis mainly focused on differences in commenting size, nature and function; the pragmatic differences between the two groups of learners were investigated as well. The present study affords lessons for peer review training programs and provides pedagogical implications for L2 writing. Keywords: peer feedback; web-based; intercultural; contrastive analysis

  5. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard;

    2011-01-01

    This paper presents a probabilistic approach for structural robustness assessment for a timber structure built a few years ago. The robustness analysis is based on a structural reliability based framework for robustness and a simplified mechanical system modelling of a timber truss system. A comp...

  6. Differential Regulatory Analysis Based on Coexpression Network in Cancer Research

    Directory of Open Access Journals (Sweden)

    Junyi Li

    2016-01-01

    Full Text Available With the rapid development of high-throughput techniques and the accumulation of big transcriptomic data, plenty of computational methods and algorithms, such as differential analysis and network analysis, have been proposed to explore genome-wide gene expression characteristics. These efforts aim to transform underlying genomic information into valuable knowledge in biological and medical research. Recently, numerous integrative research methods have been dedicated to interpreting the development and progression of neoplastic diseases, and differential regulatory analysis (DRA) based on gene coexpression networks (GCN) increasingly serves as a robust complement to regular differential expression analysis in revealing regulatory functions of cancer-related genes, such as evading growth suppressors and resisting cell death. Differential regulatory analysis based on GCN is promising and plays an essential role in discovering the system-level properties of carcinogenesis. Here we briefly review the paradigm of differential regulatory analysis based on GCN. We also focus on the applications of differential regulatory analysis based on GCN in cancer research and point out that DRA is necessary and extraordinarily useful for revealing the underlying molecular mechanisms in large-scale carcinogenesis studies.

  7. Vehicle passes detector based on multi-sensor analysis

    Science.gov (United States)

    Bocharov, D.; Sidorchuk, D.; Konovalenko, I.; Koptelov, I.

    2015-02-01

    This study deals with a new approach to the problem of detecting vehicle passes in a vision-based automatic vehicle classification system. Essential non-affinity image variations and signals from an induction loop are the events that can be considered as indicators of an object's presence. We propose several vehicle detection techniques based on image processing and induction loop signal analysis. We also suggest a combined method based on multi-sensor analysis to improve vehicle detection performance. Experimental results in complex outdoor environments show that the proposed multi-sensor algorithm is effective for vehicle detection.

  8. Online Fault Diagnosis Method Based on Nonlinear Spectral Analysis

    Institute of Scientific and Technical Information of China (English)

    WEI Rui-xuan; WU Li-xun; WANG Yong-chang; HAN Chong-zhao

    2005-01-01

    Fault diagnosis based on nonlinear spectral analysis is a new technique for nonlinear fault diagnosis, but its online application can be limited by the enormous computational requirements of estimating general frequency response functions. Based on the fully decoupled Volterra identification algorithm, a new online fault diagnosis method based on nonlinear spectral analysis is presented, which can effectively reduce the online computational requirements for general frequency response functions. The composition and working principle of the method are described, test experiments were performed on the damping spring of a vehicle suspension system using the new method, and the results indicate that the method is efficient.

  9. Kernel-Based Nonlinear Discriminant Analysis for Face Recognition

    Institute of Scientific and Technical Information of China (English)

    LIU QingShan (刘青山); HUANG Rui (黄锐); LU HanQing (卢汉清); MA SongDe (马颂德)

    2003-01-01

    Linear subspace analysis methods have been successfully applied to extract features for face recognition. But they are inadequate to represent the complex and nonlinear variations of real face images, such as illumination, facial expression and pose variations, because of their linear properties. In this paper, a nonlinear subspace analysis method, Kernel-based Nonlinear Discriminant Analysis (KNDA), is presented for face recognition, which combines the nonlinear kernel trick with the linear subspace analysis method - Fisher Linear Discriminant Analysis (FLDA). First, the kernel trick is used to project the input data into an implicit feature space, then FLDA is performed in this feature space. Thus nonlinear discriminant features of the input data are yielded. In addition, in order to reduce the computational complexity, a geometry-based feature vector selection scheme is adopted. Another similar nonlinear subspace analysis method is Kernel-based Principal Component Analysis (KPCA), which combines the kernel trick with linear Principal Component Analysis (PCA). Experiments are performed with the polynomial kernel, and KNDA is compared with KPCA and FLDA. Extensive experimental results show that KNDA can give a higher recognition rate than KPCA and FLDA.
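
    KNDA as described above is not available off the shelf in common libraries; the hedged sketch below approximates the same idea by mapping data with a polynomial-kernel Kernel PCA and then applying Fisher LDA in that feature space. Synthetic class blobs stand in for face-image feature vectors.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline

X, y = make_classification(n_samples=400, n_features=40, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = make_pipeline(
    KernelPCA(n_components=30, kernel="poly", degree=2),   # nonlinear kernel mapping
    LinearDiscriminantAnalysis())                          # Fisher LDA on mapped data
model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```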

  10. Spectral Efficiency Analysis for Multicarrier Based 4G Systems

    DEFF Research Database (Denmark)

    Silva, Nuno; Rahman, Muhammad Imadur; Frederiksen, Flemming Bjerge;

    2006-01-01

    In this paper, a spectral efficiency definition is proposed. Spectral efficiency for multicarrier based multiaccess techniques, such as OFDMA, MC-CDMA and OFDMA-CDM, is analyzed. Simulations for different indoor and outdoor scenarios are carried out. Based on the simulations, we discuss how different wireless channel conditions affect the performance of a system in terms of spectral efficiency. Based on our analysis, we also recommend different access techniques for different scenarios.

  11. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    OpenAIRE

    T. Subramani; J. Jothi

    2014-01-01

    This paper presents an overview of the present state of base isolation techniques, with special emphasis on base isolation and a brief account of other techniques developed worldwide for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base-isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. Simple case s...

  12. Abnormal traffic flow data detection based on wavelet analysis

    Directory of Open Access Journals (Sweden)

    Xiao Qian

    2016-01-01

    Full Text Available Because traffic flow data are non-stationary, abnormal data are difficult to detect. This paper proposes an abnormal traffic flow data detection method based on wavelet analysis and the least squares method. Wavelet analysis is first used to separate the traffic flow data into high-frequency and low-frequency components, and the least squares method is then combined with it to find abnormal points in the reconstructed signal data. Simulation results show that detecting abnormal traffic flow data with wavelet analysis and least squares effectively reduces both the misjudgment (false positive) rate and the false negative rate of the detection results.
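
    One way to sketch the combination above is to reconstruct the low-frequency component of a (synthetic) traffic flow series from a wavelet decomposition and flag points whose residual from that reconstruction is unusually large in the least-squares sense. The signal, wavelet and threshold are assumed for illustration only.

```python
import numpy as np
import pywt

rng = np.random.default_rng(5)
t = np.arange(1440)                                    # one day of per-minute counts
flow = 300 + 150 * np.sin(2 * np.pi * t / 1440) + rng.normal(scale=15, size=t.size)
flow[[400, 900, 901]] += [250, -220, -210]             # injected abnormal points

# Keep only the approximation (low-frequency) part and reconstruct the trend.
coeffs = pywt.wavedec(flow, "db4", level=5)
coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(coeffs, "db4")[: flow.size]

residual = flow - trend
sigma = np.sqrt(np.mean(residual ** 2))                # least-squares residual scale
outliers = np.flatnonzero(np.abs(residual) > 4 * sigma)
print("flagged indices:", outliers)
```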

  13. Adaptive Fourier Decomposition Based Time-Frequency Analysis

    Institute of Scientific and Technical Information of China (English)

    Li-Ming Zhang

    2014-01-01

    The attempt to represent a signal simultaneously in the time and frequency domains is full of challenges. The recently proposed adaptive Fourier decomposition (AFD) offers a practical approach to this problem. This paper presents the principles of AFD-based time-frequency analysis in three aspects: instantaneous frequency analysis, frequency spectrum analysis, and spectrogram analysis. An experiment is conducted and compared with the Fourier transform in terms of convergence rate and with the short-time Fourier transform in terms of time-frequency distribution. The proposed approach performs better than both the Fourier transform and the short-time Fourier transform.

  14. Noise analysis for sensitivity-based structural damage detection

    Institute of Scientific and Technical Information of China (English)

    YIN Tao; ZHU Hong-ping; YU Ling

    2007-01-01

    As vibration-based structural damage detection methods are easily affected by environmental noise, a new statistic-based noise analysis method is proposed together with the Monte Carlo technique to investigate the influence of experimental noise in modal data on sensitivity-based damage detection methods. Different from the commonly used random perturbation technique, the proposed technique is deduced directly from the Moore-Penrose generalized inverse of the sensitivity matrix, which not only makes the analysis process more efficient but also allows the influence of noise on both frequencies and mode shapes to be analyzed in a similar way for three commonly used sensitivity-based damage detection methods. A one-story portal frame is adopted to evaluate the efficiency of the proposed noise analysis technique.

  15. Analysis of security protocols based on challenge-response

    Institute of Scientific and Technical Information of China (English)

    LUO JunZhou; YANG Ming

    2007-01-01

    A security protocol is specified as a challenge-response procedure that uses applied cryptography to confirm the existence of other principals and to fulfill some data negotiation such as session keys. Most existing analysis methods, which adopt either theorem-proving techniques such as state exploration or logic-reasoning techniques such as authentication logic, face a conflict between analysis power and operability. To solve the problem, a new efficient method is proposed that provides an SSM-semantics-based definition of secrecy and authentication goals and applies authentication logic as the fundamental analysis technique, in which secrecy analysis is split into two parts, Explicit-Information-Leakage and Implicit-Information-Leakage, and correspondence analysis is formulated as the analysis of the existence relationship of Strands and the agreement of Strand parameters. This new method has both the power of the Strand Space Model and the concision of authentication logic.

  16. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes the input in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.

  17. A Framework for Web-Based Mechanical Design and Analysis

    Institute of Scientific and Technical Information of China (English)

    Chiaming Yen; Wujeng Li

    2002-01-01

    In this paper, a Web-based Mechanical Design and Analysis Framework (WMDAF) is proposed. This WMDAF allows designers to develop web-based computer-aided programs in a systematic way during the collaborative mechanical system design and analysis process. The system is based on an emerging web-based Content Management System (CMS) called the eXtended Object Oriented Portal System (XOOPS). Due to the Open Source status of the XOOPS CMS, programs developed with this framework can be further customized to ...

  18. Tikhonov regularization-based operational transfer path analysis

    Science.gov (United States)

    Cheng, Wei; Lu, Yingying; Zhang, Zhousuo

    2016-06-01

    To overcome ill-posed problems in operational transfer path analysis (OTPA) and improve the stability of solutions, this paper proposes a novel OTPA based on Tikhonov regularization, which considers both the fitting degree and the stability of solutions. Firstly, the fundamental theory of Tikhonov regularization-based OTPA is presented, and comparative studies are provided to validate the effectiveness on ill-posed problems. Secondly, transfer path analysis and source contribution evaluations for numerical case studies on spherically radiating acoustical sources are comparatively studied. Finally, transfer path analysis and source contribution evaluations for experimental case studies on a test bed with thin shell structures are provided. This study provides more accurate transfer path analysis for mechanical systems, which can benefit vibration reduction through structural path optimization. Furthermore, with accurate evaluation of source contributions, vibration monitoring and control by actively controlling vibration sources can be effectively carried out.
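
    The regularized inversion at the heart of Tikhonov-based OTPA can be sketched with plain numpy: estimate the transmissibility matrix T from operational reference spectra X and response spectra Y by solving min ||XT - Y||^2 + lambda ||T||^2, then split the response into per-path contributions. The data and the regularization parameter below are synthetic assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(6)
n_blocks, n_refs = 60, 4                        # measurement blocks, reference DOFs
X = rng.normal(size=(n_blocks, n_refs)) @ np.diag([1.0, 1.0, 0.95, 0.05])
X[:, 3] += 0.9 * X[:, 2]                        # nearly collinear references -> ill-posed
T_true = np.array([[1.0], [0.5], [0.2], [0.1]])
Y = X @ T_true + 0.05 * rng.normal(size=(n_blocks, 1))

lam = 0.1                                       # Tikhonov regularization parameter (assumed)
T_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_refs), X.T @ Y)

contributions = X * T_hat.T                     # per-path contributions to the response
print("estimated transmissibilities:", np.round(T_hat.ravel(), 3))
print("per-path share of response power:", np.round(np.mean(contributions ** 2, axis=0), 3))
```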

  19. Matrix-based introduction to multivariate data analysis

    CERN Document Server

    Adachi, Kohei

    2016-01-01

    This book enables readers who may not be familiar with matrices to understand a variety of multivariate analysis procedures in matrix forms. Another feature of the book is that it emphasizes what model underlies a procedure and what objective function is optimized for fitting the model to data. The author believes that the matrix-based learning of such models and objective functions is the fastest way to comprehend multivariate data analysis. The text is arranged so that readers can intuitively capture the purposes for which multivariate analysis procedures are utilized: plain explanations of the purposes with numerical examples precede mathematical descriptions in almost every chapter. This volume is appropriate for undergraduate students who already have studied introductory statistics. Graduate students and researchers who are not familiar with matrix-intensive formulations of multivariate data analysis will also find the book useful, as it is based on modern matrix formulations with a special emphasis on ...

  20. Data Warehouse Requirements Analysis Framework: Business-Object Based Approach

    Directory of Open Access Journals (Sweden)

    Anirban Sarkar

    2012-01-01

    Full Text Available Detailed requirements analysis plays a key role in the design of a successful Data Warehouse (DW) system. The requirements analysis specifications are used as the prime input for the construction of the conceptual-level multidimensional data model. This paper proposes a business-object-based requirements analysis framework for DW systems which is supported by an abstraction mechanism and reuse capability. It also facilitates the stepwise mapping of requirements descriptions into the high-level design components of a graph-semantic-based, conceptual-level, object-oriented multidimensional data model. The proposed framework starts with the identification of the analytical requirements using a business-process-driven approach and finally refines the requirements in further detail to map them into the conceptual-level DW design model using either a demand-driven or a mixed-driven approach for DW requirements analysis.

  1. NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    In industrial process settings, principal component analysis (PCA) is a general method for data reconciliation. However, PCA is sometimes unsuitable for nonlinear feature analysis and is limited in its application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA and can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that the original data are first mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in that feature space. Nonlinear feature analysis is then implemented and the data are reconstructed by using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process and that the reconciled data represent the true information of the nonlinear process.
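
    A hedged sketch of the core idea above: fit Kernel PCA on (synthetic) nonlinear process data and use its inverse transform (pre-image reconstruction) as the reconciled, noise-filtered estimate of the measurements. The data, kernel and component count are assumed and do not come from the distillation column study.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(7)
feed = rng.uniform(0.5, 1.5, size=300)
clean = np.column_stack([feed, feed ** 2, np.sqrt(feed)])      # nonlinear relations
measured = clean + rng.normal(scale=0.05, size=clean.shape)    # noisy measurements

kpca = KernelPCA(n_components=2, kernel="rbf", gamma=2.0,
                 fit_inverse_transform=True, alpha=1e-3)
scores = kpca.fit_transform(measured)
reconciled = kpca.inverse_transform(scores)                    # reconstructed estimate

print("rms error before:", np.sqrt(np.mean((measured - clean) ** 2)))
print("rms error after: ", np.sqrt(np.mean((reconciled - clean) ** 2)))
```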

  2. Basic gait analysis based on continuous wave radar.

    Science.gov (United States)

    Zhang, Jun

    2012-09-01

    A gait analysis method based on continuous wave (CW) radar is proposed in this paper. Time-frequency analysis is used to analyze the radar micro-Doppler echo from walking humans, and the relationships between the time-frequency spectrogram and human gait are discussed. The methods for extracting the gait parameters from the spectrogram are studied in depth, and experiments on more than twenty subjects have been performed to acquire the radar gait data. The gait parameters are calculated and compared, and the gait differences between men and women are presented based on the experimental data and extracted features. Gait analysis based on CW radar will provide a new method for clinical diagnosis and therapy.
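
    The time-frequency step described above can be sketched by simulating a CW-radar-like return with a steady torso Doppler shift plus a periodic limb-swing micro-Doppler modulation and computing its spectrogram. All parameters (sample rate, Doppler frequencies, cadence) are invented for illustration.

```python
import numpy as np
from scipy.signal import spectrogram

fs = 2000.0                                        # sample rate, Hz
t = np.arange(0, 4.0, 1.0 / fs)
torso_doppler = 80.0                               # Hz, steady walking component
limb_swing = 60.0 * np.sin(2 * np.pi * 1.8 * t)    # Hz deviation at ~1.8 steps/s
phase = 2 * np.pi * np.cumsum(torso_doppler + limb_swing) / fs
echo = np.cos(phase) + 0.1 * np.random.default_rng(8).normal(size=t.size)

f, tau, Sxx = spectrogram(echo, fs=fs, nperseg=256, noverlap=192)
# Track the strongest frequency in each time slice; its mean relates to gait
# speed and its oscillation period to the cadence (stride rate).
ridge = f[np.argmax(Sxx, axis=0)]
print("mean Doppler (Hz):", round(float(ridge.mean()), 1))
```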

  3. Intelligent Hybrid Cluster Based Classification Algorithm for Social Network Analysis

    Directory of Open Access Journals (Sweden)

    S. Muthurajkumar

    2014-05-01

    Full Text Available In this paper, we propose a hybrid clustering-based classification algorithm based on a mean approach to effectively classify and mine the ordered sequences (paths) from weblog data in order to perform social network analysis. In the system proposed in this work for social pattern analysis, the sequences of human activities are typically analyzed by switching behaviors, which are likely to produce overlapping clusters. A robust modified boosting algorithm is proposed for the hybrid clustering-based classification used to cluster the data. This work is useful for providing a connection between the aggregated features from the network data and the traditional indices used in social network analysis. Experimental results show that the proposed algorithm improves the decision results from data clustering when combined with the proposed classification algorithm, and hence it provides better classification accuracy when tested with the weblog dataset. In addition, the algorithm improves predictive performance, especially for multiclass datasets, which increases the accuracy.

  4. Study of engine noise based on independent component analysis

    Institute of Scientific and Technical Information of China (English)

    HAO Zhi-yong; JIN Yan; YANG Chen

    2007-01-01

    Independent component analysis was applied to analyze the acoustic signals from a diesel engine. First, the basic principle of independent component analysis (ICA) was reviewed. The diesel engine acoustic signal was decomposed into several independent components (ICs); the Fourier transform and the continuous wavelet transform (CWT) were applied to analyze the independent components. Different noise sources of the diesel engine were separated based on the characteristics of the different components in the time-frequency domain.

  5. 13C-based metabolic flux analysis: fundamentals and practice.

    Science.gov (United States)

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.

  6. Rare variant detection using family-based sequencing analysis.

    Science.gov (United States)

    Peng, Gang; Fan, Yu; Palculict, Timothy B; Shen, Peidong; Ruteshouser, E Cristy; Chi, Aung-Kyaw; Davis, Ronald W; Huff, Vicki; Scharfe, Curt; Wang, Wenyi

    2013-03-05

    Next-generation sequencing is revolutionizing genomic analysis, but this analysis can be compromised by high rates of missing true variants. To develop a robust statistical method capable of identifying variants that would otherwise not be called, we conducted sequence data simulations and both whole-genome and targeted sequencing data analysis of 28 families. Our method (Family-Based Sequencing Program, FamSeq) integrates Mendelian transmission information and raw sequencing reads. Sequence analysis using FamSeq reduced the number of false negative variants by 14-33% as assessed by HapMap sample genotype confirmation. In a large family affected with Wilms tumor, 84% of variants uniquely identified by FamSeq were confirmed by Sanger sequencing. In children with early-onset neurodevelopmental disorders from 26 families, de novo variant calls in disease candidate genes were corrected by FamSeq as mendelian variants, and the number of uniquely identified variants in affected individuals increased proportionally as additional family members were included in the analysis. To gain insight into maximizing variant detection, we studied factors impacting actual improvements of family-based calling, including pedigree structure, allele frequency (common vs. rare variants), prior settings of minor allele frequency, sequence signal-to-noise ratio, and coverage depth (∼20× to >200×). These data will help guide the design, analysis, and interpretation of family-based sequencing studies to improve the ability to identify new disease-associated genes.

  7. Using the DOE Knowledge Base for Special Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g., mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  8. Segmentation of Stick Text Based on Sub Connected Area Analysis

    Institute of Scientific and Technical Information of China (English)

    高静波; 李新友; 等

    1998-01-01

    A new stick text segmentation method based on sub connected area analysis is introduced in this paper. The foundation of this method is the sub connected area representation of the text image, which can represent all connected areas in an image efficiently. The method consists mainly of four steps: sub connected area classification, finding the initial boundary following point, finding the optimal segmentation point by boundary tracing, and text segmentation. This method is similar to the boundary analysis method but is more efficient.

  9. AN HMM BASED ANALYSIS FRAMEWORK FOR SEMANTIC VIDEO EVENTS

    Institute of Scientific and Technical Information of China (English)

    You Junyong; Liu Guizhong; Zhang Yaxin

    2007-01-01

    Semantic video analysis plays an important role in the field of machine intelligence and pattern recognition. In this paper, based on the Hidden Markov Model (HMM), a semantic recognition framework for compressed videos is proposed to analyze video events according to six low-level features. After a detailed analysis of video events, the pattern of global motion and five foreground features--the principal parts of the videos--are employed as the observations of the Hidden Markov Model to classify events in videos. Applications of the proposed framework to several video event detection tasks demonstrate its promise for semantic video analysis.
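
    A minimal sketch of the generic pattern described above (one HMM per event class, trained on per-frame feature vectors, with classification by maximum likelihood) is shown below using the hmmlearn package. The feature extraction from compressed video is not shown, and the state count and feature dimension are assumptions rather than the paper's settings.

    ```python
    import numpy as np
    from hmmlearn import hmm   # assumed dependency: pip install hmmlearn

    def train_event_models(sequences_by_event, n_states=4):
        """Train one Gaussian HMM per event class.
        sequences_by_event: dict mapping event name -> list of (T_i, 6) arrays,
        i.e. six low-level features per frame."""
        models = {}
        for event, seqs in sequences_by_event.items():
            X = np.vstack(seqs)
            lengths = [len(s) for s in seqs]
            m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
            m.fit(X, lengths)
            models[event] = m
        return models

    def classify(models, observation_seq):
        """Label a new feature sequence with the event whose HMM scores it highest."""
        return max(models, key=lambda e: models[e].score(observation_seq))
    ```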

  10. RULE-BASED SENTIMENT ANALYSIS OF UKRAINIAN REVIEWS

    Directory of Open Access Journals (Sweden)

    Mariana Romanyshyn

    2013-07-01

    Full Text Available The last decade has witnessed a great deal of research in the field of sentiment analysis. Understanding the attitude and the emotions that people express in written text has proved to be important and helpful in sociology, political science, psychology, market research and, of course, artificial intelligence. This paper demonstrates a rule-based approach to clause-level sentiment analysis of reviews in Ukrainian. The general architecture of the implemented sentiment analysis system is presented, the current stage of research is described and further work is outlined. The main emphasis is placed on the design of rules for computing sentiments.

  11. Stability analysis of underground engineering based on multidisciplinary design optimization

    Institute of Scientific and Technical Information of China (English)

    MA Rong; ZHOU Ke-ping; GAO Feng

    2008-01-01

    Aiming at the characteristics of underground engineering, this paper analyzes the feasibility of Multidisciplinary Design Optimization (MDO) for underground engineering, puts forward a modularization-based MDO method and the idea of applying MDO to resolve problems in stability analysis, and proves the validity and feasibility of using MDO in underground engineering. The uncertainty, complexity and nonlinearity of underground engineering remain bottlenecks for carrying out stability analysis by MDO. Therefore, the application of MDO to underground engineering stability analysis is still at an exploratory stage and needs further research.

  13. Application of Model-Free Adaptive Control Technology to Aircraft Anti-skid Brake Systems

    Institute of Scientific and Technical Information of China (English)

    时伟; 刘文胜; 陈建群

    2012-01-01

    Considering the complexity and nonlinearity of the aircraft anti-skid braking system, and based on an analysis of the working principle of the slip-ratio-controlled anti-skid braking system, an aircraft anti-skid brake control algorithm based on model-free adaptive control is presented. Without needing an accurate dynamic model, the method achieves optimal slip ratio control using only the input and output information of the braking system. Simulation results show that the control scheme based on model-free adaptive control reaches a stable slip ratio within five seconds and provides a new way of improving the efficiency of aircraft braking.
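
    For readers unfamiliar with model-free adaptive control, the following sketch shows the widely used compact-form dynamic linearization (CFDL) MFAC update, which relies only on measured input/output data. It is a generic illustration under assumed tuning constants, not the controller of the paper; plant_step stands for whatever returns the next measured slip ratio.

    ```python
    import numpy as np

    def mfac_cfdl(y_ref, plant_step, n_steps, eta=0.5, mu=1.0, rho=0.6, lam=1.0):
        """Compact-form MFAC sketch: y_ref is the target slip ratio,
        plant_step(u) returns the next measured slip ratio."""
        y = np.zeros(n_steps + 1)
        u = np.zeros(n_steps + 1)
        phi = np.ones(n_steps + 1)            # pseudo-partial-derivative estimate
        for k in range(1, n_steps):
            du = u[k - 1] - u[k - 2] if k >= 2 else u[k - 1]
            dy = y[k] - y[k - 1]
            # Update the pseudo partial derivative from measured I/O data only
            phi[k] = phi[k - 1] + eta * du / (mu + du ** 2) * (dy - phi[k - 1] * du)
            # Control update: no plant model is required
            u[k] = u[k - 1] + rho * phi[k] / (lam + phi[k] ** 2) * (y_ref - y[k])
            y[k + 1] = plant_step(u[k])
        return u, y
    ```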

  14. Open access for ALICE analysis based on virtualization technology

    CERN Document Server

    Buncic, P; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modi...

  15. Research of second harmonic generation images based on texture analysis

    Science.gov (United States)

    Liu, Yao; Li, Yan; Gong, Haiming; Zhu, Xiaoqin; Huang, Zufang; Chen, Guannan

    2014-09-01

    Texture analysis plays a crucial role in identifying objects or regions of interest in an image. It has been applied to a variety of medical image processing tasks, ranging from the detection of disease and the segmentation of specific anatomical structures to differentiation between healthy and pathological tissues. Second harmonic generation (SHG) microscopy, a potential noninvasive tool for imaging biological tissues, has been widely used in medicine, with reduced phototoxicity and photobleaching. In this paper, we clarified the principles of texture analysis, including statistical, transform, structural and model-based methods, and gave examples of its applications, reviewing studies of the technique. Moreover, we applied texture analysis to SHG images for the differentiation of human skin scar tissues. A texture analysis method based on local binary patterns (LBP) and the wavelet transform was used to extract texture features of SHG images from collagen in normal and abnormal scars, and the scar SHG images were then classified as normal or abnormal. Compared with other texture analysis methods with respect to receiver operating characteristic analysis, LBP combined with the wavelet transform was demonstrated to achieve higher accuracy. It can provide a new way for clinical diagnosis of scar types. Finally, future developments of texture analysis for SHG images are discussed.
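
    The sketch below illustrates one plausible way to combine an LBP histogram with wavelet sub-band energies into a single feature vector for an SHG image, using scikit-image and PyWavelets; the LBP parameters, wavelet family and decomposition level are assumptions, not the values used in the study.

    ```python
    import numpy as np
    import pywt
    from skimage.feature import local_binary_pattern

    def texture_features(image, P=8, R=1, wavelet="db2", level=2):
        """image: 2D grayscale SHG array. Returns LBP histogram + wavelet energies."""
        lbp = local_binary_pattern(image, P, R, method="uniform")
        hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
        coeffs = pywt.wavedec2(image, wavelet, level=level)
        # mean energy of each detail sub-band (horizontal, vertical, diagonal per level)
        energies = [np.mean(np.square(c)) for detail in coeffs[1:] for c in detail]
        return np.concatenate([hist, energies])
    ```

    Such feature vectors could then be fed to any standard classifier (e.g., LDA or an SVM) to separate normal from abnormal scar images.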

  16. Data-Driven Control for Interlinked AC/DC Microgrids via Model-Free Adaptive Control and Dual-Droop Control

    OpenAIRE

    Zhang, Huaguang; Zhou, Jianguo; Sun, Qiuye; Guerrero, Josep M.; Ma, Dazhong

    2016-01-01

    This paper investigates the coordinated power sharing issues of interlinked ac/dc microgrids. An appropriate control strategy is developed to control the interlinking converter (IC) to realize proportional power sharing between ac and dc microgrids. The proposed strategy mainly includes two parts: the primary outer-loop dual-droop control method along with secondary control; the inner-loop data-driven model-free adaptive voltage control. Using the proposed scheme, the interlinking converter, ...

  17. Gray Prediction Model-free Adaptive Control Strategy of Ball Mill

    Institute of Scientific and Technical Information of China (English)

    马光; 李栋; 杨晓冬

    2016-01-01

    The ball mill load control system is a complex system featuring large time delays, time-varying behavior, strong nonlinearity, multiple variables and strong coupling. To better overcome the large lag and uncertainty of such a load control system, a gray prediction model-free adaptive control (MFAC) scheme is proposed. For this control system, PID, gray prediction PID, MFAC and gray prediction MFAC are each applied in simulation analysis. Practical application verifies the feasibility and rationality of the strategy and indicates that gray prediction model-free adaptive control has good control performance and practical value.
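
    The grey-prediction half of such a scheme is typically a GM(1,1) forecaster that compensates the loop's large lag by predicting the next load value. The plain NumPy sketch below is a generic GM(1,1) implementation under assumed data and settings, not the authors' code; the MFAC half could follow the compact-form update sketched after record 13 above.

    ```python
    import numpy as np

    def gm11_predict(x, horizon=1):
        """GM(1,1) grey prediction: fit a short history x (e.g. recent ball-mill load
        measurements) and forecast `horizon` future values."""
        x = np.asarray(x, dtype=float)
        x1 = np.cumsum(x)                                   # accumulated generating sequence
        z = 0.5 * (x1[1:] + x1[:-1])                        # mean generated sequence
        B = np.column_stack([-z, np.ones_like(z)])
        a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]     # development coeff., grey input
        k = np.arange(1, len(x) + horizon)
        x1_hat = (x[0] - b / a) * np.exp(-a * k) + b / a    # fitted accumulated sequence
        x_hat = np.diff(np.concatenate([[x[0]], x1_hat]))   # restore by differencing
        return x_hat[-horizon:]

    print(gm11_predict([82.0, 84.5, 86.1, 88.0, 89.6], horizon=2))  # toy load series
    ```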

  18. Modeling and Grid impedance Variation Analysis of Parallel Connected Grid Connected Inverter based on Impedance Based Harmonic Analysis

    DEFF Research Database (Denmark)

    Kwon, JunBum; Wang, Xiongfei; Bak, Claus Leth;

    2014-01-01

    This paper addresses the harmonic compensation error problem that arises with parallel connected inverters under the same grid interface conditions by means of impedance-based analysis and modeling. Unlike the single grid connected inverter, it is found that multiple parallel connected inverters and grid...... impedance can influence each other if they each have a harmonic compensation function. The analysis method proposed in this paper is based on the relationship between the overall output impedance and input impedance of the parallel connected inverters, where a controller gain design method, which can

  19. A Genre-based Analysis of English Refusal Letters

    Institute of Scientific and Technical Information of China (English)

    费伟

    2014-01-01

    The present study analyzes ten English refusal letters of two subcategories based on Swales' genre analysis model and finds that differences exist in the generic features of the two subcategorized types. Teachers should reveal the underlying rationale behind the linguistic features of a specific genre so that students can not only identify the genre but also apply it appropriately.

  20. Linear feature selection in texture analysis - A PLS based method

    DEFF Research Database (Denmark)

    Marques, Joselene; Igel, Christian; Lillholm, Martin

    2013-01-01

    We present a texture analysis methodology that combines uncommitted machine-learning techniques and partial least squares (PLS) in a fully automatic framework. Our approach introduces a robust PLS-based dimensionality reduction (DR) step to specifically address outliers and high-dimensional featur
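
    As a generic illustration of PLS used for supervised dimensionality reduction ahead of a simple classifier (not the paper's framework), the following scikit-learn sketch projects a high-dimensional texture feature matrix onto a few PLS latent components; the data shapes, component count and classifier are assumptions.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(120, 500))       # 500 texture features per sample (placeholder)
    y = rng.integers(0, 2, size=120)      # binary label, e.g. progressor / non-progressor

    pls = PLSRegression(n_components=5)
    pls.fit(X, y)
    X_reduced = pls.transform(X)          # latent PLS scores used as low-dimensional features
    clf = LogisticRegression().fit(X_reduced, y)
    print("training accuracy:", clf.score(X_reduced, y))
    ```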

  1. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  2. Graph- versus Vector-Based Analysis of a Consensus Protocol

    NARCIS (Netherlands)

    Delzanno, Giorgio; Rensink, Arend; Traverso, Riccardo; Bošnački, Dragan; Edelkamp, Stefan; Lluch Lafuente, Alberto; Wijs, Anton

    2014-01-01

    The Paxos distributed consensus algorithm is a challenging case-study for standard, vector-based model checking techniques. Due to asynchronous communication, exhaustive analysis may generate very large state spaces already for small model instances. In this paper, we show the advantages of graph tr

  3. A Corpus-based Analysis of English Noun Suffixes

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper provides a brief analysis of English noun suffixes. First, the English noun suffixes are classified etymologically; then, the frequency of each noun suffix is obtained from the sub-corpora FR88 and WSJ88; finally, a conclusion is drawn from the statistics, namely that the origins of words have visible influences on the English vocabulary.

  4. Spinoza II: Conceptual Case-Based Natural Language Analysis.

    Science.gov (United States)

    Schank, Roger C.; And Others

    This paper presents the theoretical changes that have developed in Conceptual Dependency Theory and their ramifications in computer analysis of natural language. The major items of concern are: the elimination of reliance on "grammar rules" for parsing with the emphasis given to conceptual rule based parsing; the development of a conceptual case…

  5. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  6. LES based POD analysis of Jet in Cross Flow

    DEFF Research Database (Denmark)

    Cavar, Dalibor; Meyer, Knud Erik; Jakirlic, S.

    2010-01-01

    The paper presents results of a POD investigation of the LES based numerical simulation of the jet-in-crossflow (JICF) flowfield. LES results are firstly compared to the pointwise LDA measurements. 2D POD analysis is then used as a comparison basis for PIV measurements and LES, and finally 3D POD...
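
    One standard way to compute POD modes from LES (or PIV) data is the snapshot method via a singular value decomposition of the fluctuation matrix. The sketch below is a generic illustration of that technique, not the specific 2D/3D POD procedure of the paper; the data layout is an assumption.

    ```python
    import numpy as np

    def pod_modes(snapshots, n_modes=5):
        """snapshots: array of shape (n_points, n_snapshots), e.g. one velocity
        component sampled at n_points locations over n_snapshots time steps.
        Returns the leading spatial modes and their relative energy content."""
        mean = snapshots.mean(axis=1, keepdims=True)
        fluct = snapshots - mean                        # analyse fluctuations about the mean
        U, s, _ = np.linalg.svd(fluct, full_matrices=False)
        energy = s**2 / np.sum(s**2)                    # relative energy per mode
        return U[:, :n_modes], energy[:n_modes]
    ```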

  7. Project-Based Language Learning: An Activity Theory Analysis

    Science.gov (United States)

    Gibbes, Marina; Carson, Lorna

    2014-01-01

    This paper reports on an investigation of project-based language learning (PBLL) in a university language programme. Learner reflections of project work were analysed through Activity Theory, where tool-mediated activity is understood as the central unit of analysis for human interaction. Data were categorised according to the components of human…

  8. Advancing School-Based Interventions through Economic Analysis

    Science.gov (United States)

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  9. Situational Analysis: A Framework for Evidence-Based Practice

    Science.gov (United States)

    Annan, Jean

    2005-01-01

    Situational analysis is a framework for professional practice and research in educational psychology. The process is guided by a set of practice principles requiring that psychologists' work is evidence-based, ecological, collaborative and constructive. The framework is designed to provide direction for psychologists who wish to tailor their…

  10. Utilizing Problem-Based Learning in Qualitative Analysis Lab Experiments

    Science.gov (United States)

    Hicks, Randall W.; Bevsek, Holly M.

    2012-01-01

    A series of qualitative analysis (QA) laboratory experiments utilizing a problem-based learning (PBL) module has been designed and implemented. The module guided students through the experiments under the guise of cleaning up a potentially contaminated water site as employees of an environmental chemistry laboratory. The main goal was the…

  11. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phase, duration of the double support phase, the magnitudes of reaction forces in extremes of measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
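
    To make the kind of variables listed above concrete, here is a small sketch (not the authors' MATLAB script) that computes a few temporal, force and impulse quantities from one stance phase, plus a common symmetry index; the sampling rate and the sign convention for braking are assumptions.

    ```python
    import numpy as np

    def gait_variables(fy, fz, fs=1000.0):
        """fy: anteroposterior GRF, fz: vertical GRF (both in newtons) for one stance
        phase; fs is an assumed sampling rate in Hz."""
        braking = fy < 0                      # negative anteroposterior force = braking
        return {
            "Fz_max": float(fz.max()),
            "Fy_braking_min": float(fy.min()),
            "Fy_propulsive_max": float(fy.max()),
            "braking_duration_s": float(braking.sum() / fs),
            "braking_impulse_Ns": float(fy[braking].sum() / fs),
            "propulsive_impulse_Ns": float(fy[~braking].sum() / fs),
        }

    def symmetry_index(left, right):
        """One common symmetry index between corresponding left/right variables (%)."""
        return 100.0 * (left - right) / (0.5 * (left + right))
    ```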

  12. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    Science.gov (United States)

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality.
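
    A rough illustration of the classification side of such a workflow (not the study's actual analysis) is sketched below: PCA and Ward hierarchical clustering applied to a batches-by-peaks fingerprint matrix, using the record's 19 batches, 18 peaks and 3 groups as dimensions; the random matrix merely stands in for measured peak areas.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    peaks = np.random.rand(19, 18)                     # placeholder fingerprint matrix
    X = StandardScaler().fit_transform(peaks)          # standardize peak areas

    scores = PCA(n_components=2).fit_transform(X)      # PCA view of batch similarity
    groups = fcluster(linkage(X, method="ward"),       # HCA cut into 3 geographic groups
                      t=3, criterion="maxclust")
    print(groups)
    ```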

  13. Teaching-Learning Activity Modeling Based on Data Analysis

    Directory of Open Access Journals (Sweden)

    Kyungrog Kim

    2015-03-01

    Full Text Available Numerous studies are currently being carried out on personalized services based on data analysis to find and provide valuable information about information overload. Furthermore, the number of studies on data analysis of teaching-learning activities for personalized services in the field of teaching-learning is increasing, too. This paper proposes a learning style recency-frequency-durability (LS-RFD) model for quantified analysis on the level of activities of learners, to provide the elements of teaching-learning activities according to the learning style of the learner among various parameters for personalized service. This is to measure preferences as to teaching-learning activity according to recency, frequency and durability of such activities. Based on the results, user characteristics can be classified into groups for teaching-learning activity by categorizing the level of preference and activity of the learner.

  14. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.

  15. Flow cytometry-based DNA hybridization and polymorphism analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Kommander, K.; White, P.S.; Nolan, J.P.

    1998-07-01

    Functional analysis of the human genome, including the quantification of differential gene expression and the identification of polymorphic sites and disease genes, is an important element of the Human Genome Project. Current methods of analysis are mainly gel-based assays that are not well-suited to rapid genome-scale analyses. To analyze DNA sequence on a large scale, robust and high throughput assays are needed. The authors are developing a suite of microsphere-based approaches employing fluorescence detection to screen and analyze genomic sequence. The approaches include competitive DNA hybridization to measure DNA or RNA targets in unknown samples, and oligo ligation or extension assays to analyze single-nucleotide polymorphisms. Apart from the advantages of sensitivity, simplicity, and low sample consumption, these flow cytometric approaches have the potential for high throughput multiplexed analysis using multicolored microspheres and automated sample handling.

  16. [Electroencephalogram Feature Selection Based on Correlation Coefficient Analysis].

    Science.gov (United States)

    Zhou, Jinzhi; Tang, Xiaofang

    2015-08-01

    In order to improve the accuracy of classification with a small amount of motor imagery training data in the development of brain-computer interface (BCI) systems, we proposed a method to automatically select characteristic parameters based on correlation coefficient analysis. Using the five sample data sets of dataset IVa from the 2005 BCI Competition, we utilized the short-time Fourier transform (STFT) and correlation coefficient calculation to reduce the dimensionality of the raw electroencephalogram, then introduced feature extraction based on common spatial patterns (CSP) and classified by linear discriminant analysis (LDA). Simulation results showed that the average classification accuracy could be improved by using the correlation coefficient feature selection method compared with not using it. Compared with a support vector machine (SVM) feature optimization algorithm, correlation coefficient analysis can select better parameters and improve the accuracy of classification.
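
    A simplified stand-in for the correlation-based selection step (not the paper's exact procedure, which also involves STFT and CSP) is sketched below: keep the features most correlated with the class label and train an LDA classifier on them; the feature matrix layout and the number of retained features are assumptions.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def select_by_correlation(features, labels, n_keep=10):
        """features: (n_trials, n_features); labels in {0, 1}. Returns indices of the
        n_keep features with the largest absolute Pearson correlation with the label."""
        y = labels - labels.mean()
        x = features - features.mean(axis=0)
        r = (x * y[:, None]).sum(axis=0) / (
            np.sqrt((x**2).sum(axis=0)) * np.sqrt((y**2).sum()) + 1e-12)
        return np.argsort(np.abs(r))[::-1][:n_keep]

    # Usage sketch (hypothetical arrays):
    # idx = select_by_correlation(features, labels)
    # clf = LinearDiscriminantAnalysis().fit(features[:, idx], labels)
    ```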

  17. Unified HMM-based layout analysis framework and algorithm

    Institute of Scientific and Technical Information of China (English)

    陈明; 丁晓青; 吴佑寿

    2003-01-01

    To handle the layout analysis problem for complex or irregular document images, a unified HMM-based layout analysis framework is presented in this paper. Based on the multi-resolution wavelet analysis results of the document image, we use an HMM method in both the inner-scale image model and the trans-scale context model to classify pixel region properties such as text, picture or background. In each scale, an HMM direct segmentation method is used to obtain a better inner-scale classification result. Another HMM method is then used to fuse the inner-scale results of each scale and obtain a better final segmentation result. The optimized algorithm uses a stop rule in the coarse-to-fine multi-scale segmentation process, so the speed is improved remarkably. Experiments prove the efficiency of the proposed algorithm.

  18. Environmental Assessment: General Plan-Based Environmental Impact Analysis Process, Laughlin Air Force Base

    Science.gov (United States)

    2007-05-01

    BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS LAUGHLIN AIR FORCE BASE, TEXAS AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas...activities that take place in a particular area and generally refers to human modification of land, often for residential or economic purposes. It also...is asphaltic and is generally not economical to drill. There are some small natural gas deposits being tapped in the northwest part of the county

  19. ODVBA: optimally-discriminative voxel-based analysis.

    Science.gov (United States)

    Zhang, Tianhao; Davatzikos, Christos

    2011-08-01

    Gaussian smoothing of images prior to applying voxel-based statistics is an important step in voxel-based analysis and statistical parametric mapping (VBA-SPM) and is used to account for registration errors, to Gaussianize the data and to integrate imaging signals from a region around each voxel. However, it has also become a limitation of VBA-SPM based methods, since it is often chosen empirically and lacks spatial adaptivity to the shape and spatial extent of the region of interest, such as a region of atrophy or functional activity. In this paper, we propose a new framework, named optimally-discriminative voxel-based analysis (ODVBA), for determining the optimal spatially adaptive smoothing of images, followed by applying voxel-based group analysis. In ODVBA, nonnegative discriminative projection is applied regionally to get the direction that best discriminates between two groups, e.g., patients and controls; this direction is equivalent to local filtering by an optimal kernel whose coefficients define the optimally discriminative direction. By considering all the neighborhoods that contain a given voxel, we then compose this information to produce the statistic for each voxel. Finally, permutation tests are used to obtain a statistical parametric map of group differences. ODVBA has been evaluated using simulated data in which the ground truth is known and with data from an Alzheimer's disease (AD) study. The experimental results have shown that the proposed ODVBA can precisely describe the shape and location of structural abnormality.

  20. Environmental analysis of the eastern shale hydroretorting data base

    Energy Technology Data Exchange (ETDEWEB)

    Rex, R.C. Jr.; Lynch, P.A.

    1984-05-01

    The purpose of this study is to perform a preliminary environmental analysis of certain chemical aspects of Eastern shale hydroretorting utilizing the data from the twenty-one (21) bench scale unit runs conducted during the HYTORT Feasibility Study. The report contained herein primarily addresses the potential types and quantities of pollutants emanating directly from the hydroretorting of oil shale (i.e., the retort paper). The following areas are discussed in detail: nitrogen distribution; sulfur distribution; gas trace constituents; sour water constituents; and shale leachates. The results of the analysis have not identified any potential pollutants or quantities which cannot be brought to conformance with currently promulgated environmental standards using existing technology. Additional analysis of the process chemistry portion of the HYTORT data base, coupled with the process and mechanical design information, can provide a methodology for dealing with the identified environmental concerns as they pertain to a commercial facility. Section 5.0 of the report delineates the areas which should be addressed in a continuing analysis of environmental concerns. The suggested program divides naturally into three phases, of which Phase 1 has been completed: Phase 1 - Environmental Analysis of the Eastern Shale Hydroretorting Data Base; Phase 2 - Generic (non-site-specific) Environmental Analysis; and Phase 3 - Site-Specific Environmental Analysis. Phase 2 details the anticipated emissions from all areas of a commercial HYTORT facility operating on a typical Eastern shale using the results of this Phase 1 effort and the HYTORT data base. Phase 3 utilizes this information to assess the effects of plant emissions on chosen sites in the context of applicable laws and regulations. 7 references, 18 figures, 2 tables.

  1. Content-based analysis and indexing of sports video

    Science.gov (United States)

    Luo, Ming; Bai, Xuesheng; Xu, Guang-you

    2001-12-01

    An explosion of on-line image and video data in digital form is already well underway. With the exponential rise in interactive information exploration and dissemination through the World-Wide Web, the major inhibitors of rapid access to on-line video data are the management of capture and storage, and content-based intelligent search and indexing techniques. This paper proposes an approach for content-based analysis and event-based indexing of sports video. It includes a novel method to organize shots - classifying shots as close shots and far shots, an original idea of blur extent-based event detection, and an innovative local mutation-based algorithm for caption detection and retrieval. Results on extensive real TV programs demonstrate the applicability of our approach.

  2. Research on supplier evaluation and selection based on fuzzy hierarchy analysis and grey relational analysis

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Considering the disadvantages of selecting supplier evaluation indices based only on past purchasing relationships, and in view of the transformation of the relationship between manufacturers and suppliers in a dynamic, cooperative, competitive and quick-response environment, research on supplier selection and evaluation was presented based on enterprise capability, cooperation degree and service level from the perspective of cooperative partnership and coordination, and an evaluation index system was established. A more objective and accurate supplier selection and evaluation method based on the fuzzy analytic hierarchy process and grey relational analysis was developed, and an empirical study of an electric equipment manufacturer was then carried out to analyze supplier selection and evaluation.

  3. Model-free Adaptive Cross Coupling Control of Multi-wire Saw

    Institute of Scientific and Technical Information of China (English)

    蒋近; 戴瑜兴; 郜克存; 彭思齐

    2012-01-01

    To enhance the precision of silicon material sawing, and based on an analysis of the wire tension fluctuation in a multi-wire saw, an indirect tension control strategy for multi-axis synchronization is presented that combines model-free adaptive control with cross coupling control. Model-free adaptive control speeds up the dynamic response of the system, increases its robustness, reduces the tracking error, and achieves accurate tracking. Cross coupling control eliminates the influence of mismatched gain and dynamic parameters between the axes, reduces the synchronization error, and realizes multi-axis synchronization. Experimental results show that the proposed tension control method is very effective: it improves the motion precision of multi-axis synchronization and reduces the fluctuation of wire tension.

  4. Finding Suitable Variability Abstractions for Family-Based Analysis

    DEFF Research Database (Denmark)

    Dimovski, Aleksandar; Brabrand, Claus; Wasowski, Andrzej

    2016-01-01

    For program families (Software Product Lines), specially designed variability-aware static (dataflow) analyses allow analyzing all variants (products) of the family, simultaneously, in a single run without generating any of the variants explicitly. They are also known as lifted or family......-based analyses. The variability-aware analyses may be too costly or even infeasible for families with a large number of variants. In order to make them computationally cheaper, we can apply variability abstractions which aim to tame the combinatorial explosion of the number of variants (configurations...... suitable variability abstractions from a large family of abstractions for a variability-aware static analysis. The idea is to use a pre-analysis to estimate the impact of variability-specific parts of the program family on the analysis's precision. Then we use the pre-analysis results to find out when...

  5. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore, complements other approaches that use large corpus (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  6. Design and Implementation of Hardware Based Entropy Analysis

    Directory of Open Access Journals (Sweden)

    S. Saravanan

    2012-07-01

    Full Text Available The aim of this study is the hardware implementation of entropy analysis; designing and verifying the entropy analysis is the main contribution of this paper. Entropy indicates how much the data can be compressed, and entropy analysis plays a major role in scan-based SoC testing. Size and complexity have been the major issues in the current scenario of System-on-a-Chip (SoC) testing, making test data compression a must. Entropy analysis is carried out for both specified and unspecified bits (don't-care bits). Unspecified bits are specified using zero-fill and one-fill algorithms. The X-filling technique is applied for fixed-to-fixed codes. The proposed method is successfully tested on the ISCAS89 benchmark circuits.
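
    As a small software illustration of the quantity being computed in hardware above, the sketch below evaluates the Shannon entropy of a test cube after a simple zero-fill of the don't-care bits; the input format (a string over 0/1/X) is an assumption.

    ```python
    import numpy as np

    def bit_entropy(test_cube):
        """Shannon entropy (bits per bit) of a test cube after 0-filling its X bits."""
        filled = test_cube.replace("X", "0")          # one of the fill strategies above
        p1 = filled.count("1") / len(filled)
        if p1 in (0.0, 1.0):
            return 0.0
        return float(-(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1)))

    print(bit_entropy("1X0X110X"))   # lower entropy -> more compressible test data
    ```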

  7. Physics-Based Simulator for NEO Exploration Analysis & Simulation

    Science.gov (United States)

    Balaram, J.; Cameron, J.; Jain, A.; Kline, H.; Lim, C.; Mazhar, H.; Myint, S.; Nayar, H.; Patton, R.; Pomerantz, M.; Quadrelli, M.; Shakkotai, P.; Tso, K.

    2011-01-01

    As part of the Space Exploration Analysis and Simulation (SEAS) task, the National Aeronautics and Space Administration (NASA) is using physics-based simulations at NASA's Jet Propulsion Laboratory (JPL) to explore potential surface and near-surface mission operations at Near Earth Objects (NEOs). The simulator is under development at JPL and can be used to provide detailed analysis of various surface and near-surface NEO robotic and human exploration concepts. In this paper we describe the SEAS simulator and provide examples of recent mission systems and operations concepts investigated using the simulation. We also present related analysis work and tools developed for both the SEAS task as well as general modeling, analysis and simulation capabilities for asteroid/small-body objects.

  8. Thermodynamics-based Metabolite Sensitivity Analysis in metabolic networks.

    Science.gov (United States)

    Kiparissides, A; Hatzimanikatis, V

    2017-01-01

    The increasing availability of large metabolomics datasets enhances the need for computational methodologies that can organize the data in a way that can lead to the inference of meaningful relationships. Knowledge of the metabolic state of a cell and how it responds to various stimuli and extracellular conditions can offer significant insight in the regulatory functions and how to manipulate them. Constraint based methods, such as Flux Balance Analysis (FBA) and Thermodynamics-based flux analysis (TFA), are commonly used to estimate the flow of metabolites through genome-wide metabolic networks, making it possible to identify the ranges of flux values that are consistent with the studied physiological and thermodynamic conditions. However, unless key intracellular fluxes and metabolite concentrations are known, constraint-based models lead to underdetermined problem formulations. This lack of information propagates as uncertainty in the estimation of fluxes and basic reaction properties such as the determination of reaction directionalities. Therefore, knowledge of which metabolites, if measured, would contribute the most to reducing this uncertainty can significantly improve our ability to define the internal state of the cell. In the present work we combine constraint based modeling, Design of Experiments (DoE) and Global Sensitivity Analysis (GSA) into the Thermodynamics-based Metabolite Sensitivity Analysis (TMSA) method. TMSA ranks metabolites comprising a metabolic network based on their ability to constrain the gamut of possible solutions to a limited, thermodynamically consistent set of internal states. TMSA is modular and can be applied to a single reaction, a metabolic pathway or an entire metabolic network. This is, to our knowledge, the first attempt to use metabolic modeling in order to provide a significance ranking of metabolites to guide experimental measurements.
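
    The constraint-based core that TMSA builds on can be illustrated with a toy flux balance analysis problem: maximise one target flux subject to steady-state mass balance S·v = 0 and flux bounds. The stoichiometric matrix and bounds below are deliberately tiny placeholders, not a real network and not the TMSA method itself.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Reactions: v1 uptake->A, v2 A->B, v3 B->out ("biomass"), v4 A->out
    S = np.array([[ 1, -1,  0, -1],      # mass balance for metabolite A
                  [ 0,  1, -1,  0]])     # mass balance for metabolite B
    bounds = [(0, 10)] * 4               # (lower, upper) flux bound per reaction
    c = np.array([0, 0, -1, 0])          # linprog minimises, so negate the objective (max v3)

    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x)
    ```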

  9. Biopipe: a flexible framework for protocol-based bioinformatics analysis.

    Science.gov (United States)

    Hoon, Shawn; Ratnapu, Kiran Kumar; Chia, Jer-Ming; Kumarasamy, Balamurugan; Juguang, Xiao; Clamp, Michele; Stabenau, Arne; Potter, Simon; Clarke, Laura; Stupka, Elia

    2003-08-01

    We identify several challenges facing bioinformatics analysis today. Firstly, to fulfill the promise of comparative studies, bioinformatics analysis will need to accommodate different sources of data residing in a federation of databases that, in turn, come in different formats and modes of accessibility. Secondly, the tsunami of data to be handled will require robust systems that enable bioinformatics analysis to be carried out in a parallel fashion. Thirdly, the ever-evolving state of bioinformatics presents new algorithms and paradigms in conducting analysis. This means that any bioinformatics framework must be flexible and generic enough to accommodate such changes. In addition, we identify the need for introducing an explicit protocol-based approach to bioinformatics analysis that will lend rigorousness to the analysis. This makes it easier for experimentation and replication of results by external parties. Biopipe is designed in an effort to meet these goals. It aims to allow researchers to focus on protocol design. At the same time, it is designed to work over a compute farm and thus provides high-throughput performance. A common exchange format that encapsulates the entire protocol in terms of the analysis modules, parameters, and data versions has been developed to provide a powerful way in which to distribute and reproduce results. This will enable researchers to discuss and interpret the data better as the once implicit assumptions are now explicitly defined within the Biopipe framework.

  10. A Knowledge-based Environment for Software Process Performance Analysis

    Directory of Open Access Journals (Sweden)

    Natália Chaves Lessa Schots

    2015-08-01

    Full Text Available Background: Process performance analysis is a key step for implementing continuous improvement in software organizations. However, the knowledge to execute such analysis is not trivial and the person responsible for executing it must be provided with appropriate support. Aim: This paper presents a knowledge-based environment, named SPEAKER, proposed for supporting software organizations during the execution of process performance analysis. SPEAKER comprises a body of knowledge and a set of activities and tasks for software process performance analysis, along with supporting tools for executing these activities and tasks. Method: We conducted an informal literature review and a systematic mapping study, which provided basic requirements for the proposed environment. We implemented the SPEAKER environment integrating supporting tools for the execution of activities and tasks of performance analysis and the knowledge necessary to execute them, in order to meet the variability presented by the characteristics of these activities. Results: In this paper, we describe each SPEAKER module and the individual evaluations of these modules, and also present an example of use comprising how the environment can guide the user through a specific performance analysis activity. Conclusion: Although we only conducted individual evaluations of SPEAKER’s modules, the example of use indicates the feasibility of the proposed environment. Therefore, the environment as a whole will be further evaluated to verify if it attains its goal of assisting in the execution of process performance analysis by non-specialist people.

  11. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, the approach has encountered the problem of high computational complexity because the participants of protocols are arbitrary, their message structures are complex and their executions are concurrent. We propose an efficient automatic verifying algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis processes are much reduced by introducing a new algebraic technique called the Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not reported previously was found by using this tool.

  12. Analysis of system trustworthiness based on information flow noninterference theory

    Institute of Scientific and Technical Information of China (English)

    Xiangying Kong; Yanhui Chen; Yi Zhuang

    2015-01-01

    The trustworthiness analysis and evaluation are the bases of the trust chain transfer. In this paper the formal method of trustworthiness analysis of a system based on the noninterference (NI) theory of the information flow is studied. Firstly, existing methods cannot analyze the impact of the system states on the trustworthiness of software during the process of trust chain transfer. To solve this problem, the impact of the system state on trustworthiness of software is investigated, the run-time mutual interference behavior of software entities is described and an interference model of the access control automaton of a system is established. Secondly, based on the intransitive noninterference (INI) theory, a formal analytic method of trustworthiness for trust chain transfer is proposed, providing a theoretical basis for the analysis of dynamic trustworthiness of software during the trust chain transfer process. Thirdly, a prototype system with dynamic trustworthiness on a platform with dual core architecture is constructed and a verification algorithm of the system trustworthiness is provided. Finally, the monitor hypothesis is extended to the dynamic monitor hypothesis, and a theorem of the static judgment rule of system trustworthiness is provided, which is useful to prove dynamic trustworthiness of a system at the beginning of system construction. Compared with previous work in this field, this research proposes not only a formal analytic method for the determination of system trustworthiness, but also a modeling method and an analysis algorithm that are feasible for practical implementation.

  13. Consistency analysis of accelerated degradation mechanism based on gray theory

    Institute of Scientific and Technical Information of China (English)

    Yunxia Chen; Hongxia Chen; Zhou Yang; Rui Kang; Yi Yang

    2014-01-01

    A fundamental premise of an accelerated testing is that the failure mechanism under elevated and normal stress levels should remain the same. Thus, verification of the consistency of failure mechanisms is essential during an accelerated testing. A new consistency analysis method based on the gray theory is proposed for complex products. First of all, existing consistency analysis methods are reviewed with a focus on the comparison of the differences among them. Then, the proposed consistency analysis method is introduced. Two effective gray prediction models, the gray dynamic model and the new information and equal dimensional (NIED) model, are adapted in the proposed method. The process to determine the dimension of the NIED model is also discussed, and a decision rule is expanded. Based on that, the procedure of applying the new consistency analysis method is developed. Finally, a case study of the consistency analysis of a reliability enhancement testing is conducted to demonstrate and validate the proposed method.

  14. Risk-based planning analysis for a single levee

    Science.gov (United States)

    Hui, Rui; Jachens, Elizabeth; Lund, Jay

    2016-04-01

    Traditional risk-based analysis for levee planning focuses primarily on overtopping failure. Although many levees fail before overtopping, few planning studies explicitly include intermediate geotechnical failures in flood risk analysis. This study develops a risk-based model for two simplified levee failure modes: overtopping failure and overall intermediate geotechnical failure from through-seepage, determined by the levee cross section represented by levee height and crown width. Overtopping failure is based only on water level and levee height, while through-seepage failure depends on many geotechnical factors as well, mathematically represented here as a function of levee crown width using levee fragility curves developed from professional judgment or analysis. These levee planning decisions are optimized to minimize the annual expected total cost, which sums expected (residual) annual flood damage and annualized construction costs. Applicability of this optimization approach to planning new levees or upgrading existing levees is demonstrated preliminarily for a levee on a small river protecting agricultural land, and a major levee on a large river protecting a more valuable urban area. Optimized results show higher likelihood of intermediate geotechnical failure than overtopping failure. The effects of uncertainty in levee fragility curves, economic damage potential, construction costs, and hydrology (changing climate) are explored. Optimal levee crown width is more sensitive to these uncertainties than height, while the derived general principles and guidelines for risk-based optimal levee planning remain the same.
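
    The trade-off described above can be illustrated with a toy optimization: choose levee height and crown width to minimise annualised construction cost plus expected annual damage from the two failure modes. Every functional form and constant in the sketch below is an illustrative assumption, not the study's fragility curves or cost data.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def expected_total_cost(x, damage=5e7, unit_cost=2e4, crf=0.05):
        """x = (height H in m, crown width W in m). Returns annual expected total cost."""
        H, W = x
        p_overtop = np.exp(-H / 2.0)            # stand-in annual overtopping probability
        p_seepage = np.exp(-W / 3.0)            # stand-in through-seepage fragility
        p_fail = p_overtop + (1 - p_overtop) * p_seepage
        construction = crf * unit_cost * (H * W + 0.5 * H**2)   # annualised section cost
        return construction + p_fail * damage

    res = minimize(expected_total_cost, x0=[4.0, 5.0], bounds=[(1, 12), (1, 20)])
    print("optimal height, crown width:", res.x)
    ```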

  15. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT and nuclear medicine (NM studies. This fluoroscopical examination takes only about 2 seconds for perfusion study with only low radiation dose to patient, involving no preparation, no radioactive isotopes, and no contrast media.

  16. Real Time Engineering Analysis Based on a Generative Component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses...... without jumping from aesthetics to structural digital design tools and back, but to work with both simultaneously and in real time. The engineering level of knowledge is incorporated at a conceptual thinking level, i.e. qualitative information is used instead of quantitative information. An example...

  17. APL-based flexibility analysis of manufacturing grid

    Institute of Scientific and Technical Information of China (English)

    LIU; Li-lan; SUN; Xue-hua; CAI; Hong-xia; CHAI; Jian-fei

    2009-01-01

    With the characteristics of diversity, randomness, concurrency and decomposability, tasks in the manufacturing field are very complicated, so a manufacturing grid (MG) should have considerable flexibility to deal with them. With the definition of nodes and arcs, the MG structure is converted into a small-world network. Given a construction cost constraint, the problem of the shortest task waiting time is transformed into a constrained optimization problem, and a corresponding flexibility analysis model based on the average path length (APL) is proposed, with the premises of arc length and node distance defined. The results of an application example show that the analysis model is effective.
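
    A minimal illustration of the APL indicator on a small-world graph (not the paper's grid data) can be written with NetworkX; the node count, neighbourhood size and rewiring probability below are placeholders.

    ```python
    import networkx as nx

    # Treat manufacturing-grid nodes as a Watts-Strogatz small-world network
    G = nx.watts_strogatz_graph(n=50, k=4, p=0.1, seed=1)
    apl = nx.average_shortest_path_length(G)
    print(f"average path length of the grid model: {apl:.2f}")
    # Under the same construction-cost constraint, a lower APL would indicate a
    # shorter expected task waiting time, i.e. a more flexible grid.
    ```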

  18. Applying Content Analysis to Web-based Content

    OpenAIRE

    Kim, Inhwa; Kuljis, Jasna

    2010-01-01

    Using Content Analysis on Web-based content, in particular the content available on Web 2.0 sites, is investigated. The relative strengths and limitations of the method are described. To illustrate how content analysis may be used, we provide a brief overview of a case study that investigates cultural impacts on the use of design features with regard to self-disclosure on the blogs of South Korean and United Kingdom’s users. In this study we took a standard approach to conducting the content an...

  19. System modeling based measurement error analysis of digital sun sensors

    Institute of Scientific and Technical Information of China (English)

    WEI Minsong; XING Fei; WANG Geng; YOU Zheng

    2015-01-01

    Stringent attitude determination accuracy is required for the development of advanced space technologies, and thus the accuracy improvement of digital sun sensors is necessary. In this paper, we presented a proposal for measurement error analysis of a digital sun sensor. A system model including three different error sources was built and employed for system error analysis. Numerical simulations were also conducted to study the measurement error introduced by the different sources of error. Based on our model and study, the system errors from different error sources are coupled, and the system calibration should be elaborately designed to realize a digital sun sensor with extra-high accuracy.

  20. Towards Performance Measurement And Metrics Based Analysis of PLA Applications

    CERN Document Server

    Ahmed, Zeeshan

    2010-01-01

    This article presents a measurement analysis based approach to help software practitioners manage the additional levels of complexity and variability in software product line applications. The architecture of the proposed approach, ZAC, is designed and implemented to perform preprocessed source code analysis, calculate traditional and product line metrics, and visualize the results in two- and three-dimensional diagrams. Experiments using real-world data sets were performed and concluded that ZAC can be very helpful for software practitioners in understanding the overall structure and complexity of product line applications. Moreover, the obtained results indicate a strong positive correlation between the calculated traditional and product line measures.

  1. Logistics Enterprise Evaluation Model Based On Fuzzy Clustering Analysis

    Science.gov (United States)

    Fu, Pei-hua; Yin, Hong-bo

    In this paper, we introduce an evaluation model for logistics enterprises based on a fuzzy clustering algorithm. First of all, we present the evaluation index system, which covers basic information, management level, technical strength, transport capacity, informatization level, market competition and customer service. We determined the index weights according to the grades and evaluated the integrated capability of the logistics enterprises using the fuzzy cluster analysis method. We describe the system evaluation module and the cluster analysis module in detail, including how these two modules were implemented. Finally, we give the results produced by the system.
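
    The record does not give the clustering details, so the following Python sketch only illustrates a generic fuzzy c-means step that such an evaluation module could use; the seven-index score matrix is random stand-in data and the cluster count is an assumption.

        import numpy as np

        def fuzzy_c_means(X, c=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
            """Return cluster centres and the fuzzy membership matrix U (n x c)."""
            rng = np.random.default_rng(seed)
            U = rng.random((X.shape[0], c))
            U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
            for _ in range(n_iter):
                Um = U ** m
                centres = (Um.T @ X) / Um.sum(axis=0)[:, None]
                d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
                U_new = 1.0 / (d ** (2.0 / (m - 1.0)))   # inverse-distance memberships
                U_new /= U_new.sum(axis=1, keepdims=True)
                if np.abs(U_new - U).max() < tol:
                    return centres, U_new
                U = U_new
            return centres, U

        scores = np.random.default_rng(1).random((20, 7))   # 20 enterprises, 7 indices
        centres, U = fuzzy_c_means(scores)
        print("cluster assignment per enterprise:", U.argmax(axis=1))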

  2. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post-test calculations of FARO Test L-11, the so-called Base Case Test. The FARO Facility, located at JRC Ispra, is used to simulate the consequences of severe accidents in nuclear power plants under a variety of conditions. The COMETA code has a six-equation two-phase flow field and a three-phase corium field: the jet, the droplets and the fused-debris bed. The analysis shows that the code is able to capture all the major phenomena occurring during the fuel-coolant interaction pre-mixing phase.

  3. A CONTENT ANALYSIS ON PROBLEM-BASED LEARNING APPROACH

    OpenAIRE

    BİBER, Mahir; Esen ERSOY; KÖSE BİBER, Sezer

    2014-01-01

    Problem Based Learning is one of the learning models that embody the general principles of active learning and let students use scientific process skills. This research aimed to investigate in detail the postgraduate theses on the PBL approach carried out in Turkey. The content analysis method was used in the research. The study sample consisted of a total of 64 master's and PhD theses completed between 2012 and 2013 and available over the web. A “Content Analysis Template” prepared ...

  5. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of transit, buses and bicycles could play a significant role in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is imposing a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, but existing delineation methods in four-step models have problems in reflecting the travel characteristics of urban transit. This paper proposes a Transit Traffic Analysis Zone (TTAZ) delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of TTAZs were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example for delineating TTAZs, followed by a spatial analysis of office buildings within a TTAZ and of transit station departure passengers. The analysis results show that TTAZs based on Thiessen polygons can reflect transit travel characteristics and merit further research.
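
    To make the Thiessen-polygon idea concrete, the Python sketch below builds Voronoi (Thiessen) cells around a handful of made-up transit station coordinates with scipy; real TTAZ delineation would additionally clip the unbounded hull cells to the study-area boundary.

        import numpy as np
        from scipy.spatial import Voronoi

        stations = np.array([[0.0, 0.0], [2.0, 0.5], [1.0, 2.0],
                             [3.0, 2.5], [0.5, 3.0], [2.5, 4.0]])   # assumed coordinates
        vor = Voronoi(stations)

        for i, region_idx in enumerate(vor.point_region):
            region = vor.regions[region_idx]
            if -1 in region:                 # cell on the convex hull extends to infinity
                print(f"station {i}: unbounded Thiessen cell (clip to study area)")
            else:
                verts = vor.vertices[region]
                print(f"station {i}: bounded cell with {len(verts)} vertices")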

  6. Sentiment analysis framework organization based on twitter corpus data

    Directory of Open Access Journals (Sweden)

    Adela Beres

    2012-06-01

    Full Text Available Since its inception in 2006, Twitter has gathered millions of users. They post daily tweets about news, events or conversations, and these tweets express their opinions about the topics they are discussing. Twitter is a large database of content that can be semantically exploited to extract opinions and, based on these opinions, to classify users. This paper presents the organization of a sentiment analysis framework based on Twitter corpus data, including crawling tweets and opinion mining of the tweets, making it easy for its users to create portfolios of trustful Twitter accounts.

  7. Coverage analysis for sensor networks based on Clifford algebra

    Institute of Scientific and Technical Information of China (English)

    XIE WeiXin; CAO WenMing; MENG Shan

    2008-01-01

    The coverage performance is the foundation of information acquisition in distributed sensor networks. Previously proposed coverage work was mostly based on the unit disk coverage model or the ball coverage model in 2D or 3D space, respectively. However, most methods cannot give a homogeneous coverage model for targets of hybrid types. This paper presents a coverage analysis approach for sensor networks based on Clifford algebra and establishes a homogeneous coverage model for sensor networks with hybrid types of targets. The effectiveness of the approach is demonstrated with examples.

  8. Image registration based on matrix perturbation analysis using spectral graph

    Institute of Scientific and Technical Information of China (English)

    Chengcai Leng; Zheng Tian; Jing Li; Mingtao Ding

    2009-01-01

    We present a novel perspective on characterizing the spectral correspondence between nodes of a weighted graph, with application to image registration. It is based on matrix perturbation analysis of the spectral graph. The contribution may be divided into three parts. Firstly, the perturbation matrix is obtained by perturbing the matrix of the graph model. Secondly, an orthogonal matrix is obtained based on an optimal parameter, which can better capture correspondence features. Thirdly, the optimal matching matrix is obtained by adjusting the signs of the orthogonal matrix for image registration. Experiments on both synthetic images and real-world images demonstrate the effectiveness and accuracy of the proposed method.
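
    The abstract only outlines the method, so the Python sketch below shows a generic spectral-graph correspondence between two synthetic point sets; it sidesteps the eigenvector sign ambiguity by taking absolute values (a common simplification) rather than the paper's orthogonal sign-adjustment step.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        def spectral_features(points, sigma=0.5, k=3):
            # Weighted graph: Gaussian affinity of pairwise point distances.
            d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
            W = np.exp(-d ** 2 / (2 * sigma ** 2))
            vals, vecs = np.linalg.eigh(W)
            order = np.argsort(vals)[::-1]                 # leading k eigenvectors
            return np.abs(vecs[:, order[:k]])              # abs removes sign ambiguity

        rng = np.random.default_rng(0)
        A = rng.random((8, 2))
        B = A[::-1] + rng.normal(scale=0.01, size=A.shape)   # permuted, noisy copy of A

        FA, FB = spectral_features(A), spectral_features(B)
        cost = np.linalg.norm(FA[:, None, :] - FB[None, :, :], axis=2)
        rows, cols = linear_sum_assignment(cost)              # optimal node correspondence
        print(list(zip(rows, cols)))                          # ideally i maps to 7 - i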

  9. Image Analysis of Fabric Pilling Based on Light Projection

    Institute of Scientific and Technical Information of China (English)

    陈霞; 黄秀宝

    2003-01-01

    The objective assessment of fabric pilling based on light projection and image analysis has been exploited recently. The device for capturing cross-sectional images of pilled fabrics with light projection is elaborated. The detection of the profile line and the integration of the sequential cross-sectional pilled images are discussed. A threshold based on a Gaussian model is recommended for pill segmentation. The results show that the installed system is capable of eliminating interference with pill information caused by the fabric color and pattern.

  10. Plug-in Based Analysis Framework for LHC Post-Mortem Analysis

    CERN Document Server

    Gorbonosov, R; Zerlauth, M; Baggiolini, V

    2014-01-01

    Plug-in based software architectures [1] are extensible, enforce modularity and allow several teams to work in parallel. But they have certain technical and organizational challenges, which we discuss in this paper. We gained our experience when developing the Post-Mortem Analysis (PMA) system, which is a mission critical system for the Large Hadron Collider (LHC). We used a plugin-based architecture with a general-purpose analysis engine, for which physicists and equipment experts code plugins containing the analysis algorithms. We have over 45 analysis plugins developed by a dozen domain experts. This paper focuses on the design challenges we faced in order to mitigate the risks of executing third-party code: assurance that even a badly written plugin doesn't perturb the work of the overall application; plugin execution control which allows plugin misbehaviour to be detected and reacted to; a robust communication mechanism between plugins; diagnostics facilitation in case of plugin failure; testing of the plugins be...

  11. Classification analysis of microarray data based on ontological engineering

    Institute of Scientific and Technical Information of China (English)

    LI Guo-qi; SHENG Huan-ye

    2007-01-01

    Background knowledge is important for data mining, especially in complicated situations. Ontological engineering is the successor of knowledge engineering. Sharable knowledge bases built on ontologies can be used to provide background knowledge to direct the process of data mining. This paper gives a general introduction to the method and presents a practical analysis example using an SVM (support vector machine) as the classifier. Gene Ontology and the accompanying annotations compose a big knowledge base, on which much research has been carried out. A microarray dataset is the output of a DNA chip. With the help of Gene Ontology we present a more elaborate analysis of microarray data than former researchers. The method can also be used in other fields with similar scenarios.

  12. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed onto the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements, that can be readily applied to present consumer grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.
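
    The statistical machinery in the paper is more elaborate, but the basic power-threshold idea can be sketched as below; the nominal C/N0, noise spread and false-alarm rate are assumed numbers, not values from the study.

        import numpy as np
        from scipy.stats import norm

        expected_cn0_db = 45.0     # nominal carrier-to-noise density, dB-Hz (assumed)
        sigma_db = 1.5             # measurement standard deviation (assumed)
        p_false_alarm = 1e-3
        threshold = expected_cn0_db + sigma_db * norm.ppf(1 - p_false_alarm)

        measured = np.array([44.8, 45.3, 51.2, 45.1])   # one anomalously strong channel
        for i, cn0 in enumerate(measured):
            verdict = "possible spoofing" if cn0 > threshold else "consistent with authentic"
            print(f"channel {i}: C/N0 = {cn0:.1f} dB-Hz -> {verdict}")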

  13. INNOVATION ANALYSIS BASED ON SCORES AT THE FIRM LEVEL

    Directory of Open Access Journals (Sweden)

    Cătălin George ALEXE

    2014-04-01

    Full Text Available Innovation analysis based on scores (Innovation Scorecard) is a simple way to get a quick diagnosis of a firm's innovation potential on its way to achieving innovation capability. It aims to identify and remedy deficient aspects of innovation management, and it is used as a measuring tool for innovation initiatives over time within an innovation audit. The paper presents the advantages and disadvantages of using the method and the three approaches developed over time. The model proposed by the consulting firm Arthur D. Little in collaboration with the European Business School, Eckelmann's model and AGGB's local model are summarized and compared. At the end of the paper, several possible solutions are proposed to improve the score-based analysis.

  14. Protein analysis based on molecular beacon probes and biofunctionalized nanoparticles

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    With the completion of the human genome-sequencing project, there has been a resulting change in the focus of studies from genomics to proteomics. By utilizing the inherent advantages of molecular beacon probes and biofunctionalized nanoparticles, a series of novel principles, methods and techniques have been exploited for bioanalytical and biomedical studies. This review mainly discusses the applications of molecular beacon probes and biofunctionalized nanoparticle-based technologies for real-time, in situ, highly sensitive and highly selective protein analysis, including nonspecific or specific protein detection and separation, protein/DNA interaction studies, cell surface protein recognition, and antigen-antibody binding process-based bacteria assays. The introduction of molecular beacon probes and biofunctionalized nanoparticles into the protein analysis area will necessarily advance proteomics research.

  15. Model-Based Dependability Analysis of Physical Systems with Modelica

    Directory of Open Access Journals (Sweden)

    Andrea Tundis

    2017-01-01

    Full Text Available Modelica is an innovative, equation-based, and acausal language that allows modeling complex physical systems, which are made of mechanical, electrical, and electrotechnical components, and evaluates their design through simulation techniques. Unfortunately, the increasing complexity and accuracy of such physical systems require new, more powerful, and flexible tools and techniques for evaluating important system properties and, in particular, the dependability ones such as reliability, safety, and maintainability. In this context, the paper describes some extensions of the Modelica language to support the modeling of system requirements and their relationships. Such extensions enable the requirement verification analysis through native constructs in the Modelica language. Furthermore, they allow exporting a Modelica-based system design as a Bayesian Network in order to analyze its dependability by employing a probabilistic approach. The proposal is exemplified through a case study concerning the dependability analysis of a Tank System.

  16. Fuzzy-Set Based Sentiment Analysis of Big Social Data

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    Computational approaches to social media analytics are largely limited to graph theoretical approaches such as social network analysis (SNA) informed by the social philosophical approach of relational sociology. There are no other unified modelling approaches to social data that integrate...... from Facebook. Third, we briefly present and discuss the Social Data Analytics Tool (SODATO) that realizes the conceptual model in software and provisions social data analysis based on the conceptual and formal models. Fourth, we use SODATO to fetch social data from the facebook wall of a global brand...... and actors on the facebook page. Sixth and last, we discuss the analytical method and conclude with a discussion of the benefits of set theoretical approaches based on the social philosophical approach of associational sociology....

  18. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  19. Transmissibility-Based Operational Modal Analysis: Enhanced Stabilisation Diagrams

    Directory of Open Access Journals (Sweden)

    Gert De Sitter

    2012-01-01

    Full Text Available Recently it has been shown that transmissibilities can also be used to identify the modal parameters. This approach has several advantages: because of the deterministic character of the transmissibility functions, the estimated parameters are more accurate than the results obtained with power spectra based operational modal analysis techniques. Another advantage is that the transmissibility functions do not depend on the colouring of the unknown forces. A disadvantage of transmissibility based operational modal analysis techniques is that non-physical modes show up in the stabilisation diagrams. In this contribution it will first be shown that those non-physical modes show up when traditional stabilisation diagrams are used. In a second step, a new approach for selecting the physical modes out of a set of estimated modes will be discussed, and the new approach will be validated using data generated with an acoustical Finite Element Model. Finally, the approach will be validated using real acoustical data.

  20. [Ecological security assessment of Tangshan City based on emergy analysis].

    Science.gov (United States)

    Cao, Ming-lan; Li, Ya-dong

    2009-09-01

    Based on the 'pressure-state-response' model and using the emergy analysis method, an urban ecological security assessment system and an urban ecological security index (EUESI) were constructed, and the variation of the ecological security level of Tangshan City in 1995-2005 was evaluated. During this period, the ecological security level of the city first increased and then decreased. The EUESI rose from 0.017 in 1995 to 0.022 in 1996, then dropped year by year, and the city became ecologically insecure in 2003. The urban ecological security assessment method based on emergy analysis overcomes the disadvantages of conventional assessment systems, e.g., numerous and repetitive indicators, non-uniform units, and poor comparability, and reflects the urban ecological security state more objectively, providing a scientific basis for urban ecological environment management and decision-making.

  1. Image-Analysis Based on Seed Phenomics in Sesame

    Directory of Open Access Journals (Sweden)

    Prasad R.

    2014-10-01

    Full Text Available The seed coat (testa) structure of twenty-three cultivated (Sesamum indicum L.) and six wild sesame (S. occidentale Regel & Heer., S. mulayanum Nair, S. prostratum Retz., S. radiatum Schumach. & Thonn., S. angustifolium (Oliv.) Engl. and S. schinzianum Asch.) germplasm accessions was analyzed from digital and Scanning Electron Microscopy (SEM) images with dedicated software, using the descriptors for computer-based seed image analysis, to understand the diversity of seed morphometric traits, which can later be extended to screen and evaluate improved genotypes of sesame. Seeds of wild sesame species could conveniently be distinguished from cultivated varieties based on shape and architectural analysis. Results indicated discrete 'cut-off' values to identify a definite shape and contour of seed for a desirable sesame genotype, along with the conventional practice of selecting lighter colored testa.

  2. HIV/AIDS counseling: analysis based on Paulo Freire.

    Science.gov (United States)

    Miranda, Karla Corrêa Lima; Barroso, Maria Grasiela Teixeira

    2007-01-01

    The study aimed to investigate the strategies health professionals use in HIV/AIDS counseling. This is a qualitative study based on Paulo Freire's theory and practice. Bardin's content analysis was used as the analysis technique. For the studied group, counseling is focused on cognition, although new concepts permeating this subject are emerging. The main difficulties in counseling are related to the clients and the institution. The main facilitating factor is related to the team, which according to the group has a good relationship. Counseling represents a moment of distress, especially because it brings up existential questions for the counselor. It can be inferred that counseling is a special moment, but it does not yet constitute an educational moment. To reach this goal, a counseling methodology is proposed, based on Paulo Freire's principles and concepts.

  3. CORBA-Based Analysis of Multi Agent Behavior

    Institute of Scientific and Technical Information of China (English)

    Swapan Bhattacharya; Anirban Banerjee; Shibdas Bandyopadhyay

    2005-01-01

    An agent is a piece of computer software that is capable of taking independent action on behalf of its user or owner. It is an entity with goals, actions and domain knowledge, situated in an environment. Multiagent systems comprise multiple autonomous, interacting agents. Such systems can successfully emulate the entities active in a distributed environment. The analysis of multiagent behavior is studied in this paper based on a specific board game problem similar to the famous game of GO. A framework is developed to define the states of the multiagent entities and to measure the convergence metrics for this problem. An analysis of the changes of states leading to the goal state is also made. We support our study of multiagent behavior with simulations based on a CORBA framework in order to substantiate our findings.

  4. The OASE project: Object-based Analysis and Seamless prediction

    Science.gov (United States)

    Troemel, Silke; Wapler, Kathrin; Bick, Theresa; Diederich, Malte; Deneke, Hartwig; Horvath, Akos; Senf, Fabian; Simmer, Clemens; Simon, Juergen

    2013-04-01

    The research group on Object-based Analysis and SEamless prediction (OASE) is part of the Hans Ertel Centre for Weather Research (HErZ). The group consists of scientists at the Meteorological Institute, University of Bonn, the Leibniz-Institute for Tropospheric Research in Leipzig and the German Weather Service. OASE addresses seamless prediction of convective events from nowcasting to daily predictions by combining radar/satellite compositing and tracking with high-resolution model-based ensemble generation and prediction. While observation-based nowcasting provides good results for lead times between 0-1 hours, numerical weather prediction addresses lead times between 3-21 hours. The discontinuity between 1-3 hours in particular needs to be addressed. Therefore a central goal of the project is a near real-time, high-resolution data base of unprecedented quality. A radar and satellite remote sensing-driven 3D observation-microphysics composite covering Germany, currently under development, contains gridded observations and estimated microphysical quantities. Observations and microphysics are intertwined via forward operators and estimated inverse relations, which also provide uncertainties for model ensemble initialisations. The lifetime evolution of dynamics and microphysics in (severe) convective storms is analysed based on 3D scale-space tracking. An object-based analysis condenses the information contained in the dynamic 3D distributions of observables and related microphysics into descriptors, which will allow identifying the governing processes leading to the formation and evolution of severe weather events. The object-based approach efficiently characterises and quantifies the process structure and life cycles of severe weather events, and facilitates nowcasting and the generation and initialisation of model prediction ensembles. Observation-based nowcasting will exploit the dual-composite based 3D feature detection and tracking to generate a set of predictions (observation-based

  5. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Shih

    2010-01-01

    Full Text Available This paper presents a novel and effective method for facial expression recognition, including happiness, disgust, fear, anger, sadness, surprise, and the neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. An entropy criterion is applied to select the effective Gabor features, a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. The RDA combines strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA). It solves the small sample size and ill-posed problems suffered by QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.

  6. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  7. Student Engagement: A Principle-Based Concept Analysis.

    Science.gov (United States)

    Bernard, Jean S

    2015-08-04

    A principle-based concept analysis of student engagement was used to examine the state of the science across disciplines. Four major perspectives of philosophy of science guided analysis and provided a framework for study of interrelationships and integration of conceptual components which then resulted in formulation of a theoretical definition. Findings revealed student engagement as a dynamic reiterative process marked by positive behavioral, cognitive, and affective elements exhibited in pursuit of deep learning. This process is influenced by a broader sociocultural environment bound by contextual preconditions of self-investment, motivation, and a valuing of learning. Outcomes of student engagement include satisfaction, sense of well-being, and personal development. Findings of this analysis prove relevant to nursing education as faculty transition from traditional teaching paradigms, incorporate learner-centered strategies, and adopt innovative pedagogical methodologies. It lends support for curricula reform, development of more accurate evaluative measures, and creation of meaningful teaching-learning environments within the discipline.

  8. SVM-based glioma grading: Optimization by feature reduction analysis.

    Science.gov (United States)

    Zöllner, Frank G; Emblem, Kyrre E; Schad, Lothar R

    2012-09-01

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. The best classification accuracy was found using PCA at 85% (sensitivity=89%, specificity=84%) when reducing the feature vector from 101 (100-bin rCBV histogram+age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided similar classification accuracy to literature values (∼87%) while reducing the number of features by up to 98%.
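
    A schematic version of this pipeline is easy to express with scikit-learn; the sketch below uses random stand-in data (not the 101-patient CBV histograms) and should not be read as the authors' implementation.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.random((101, 101))          # 100-bin rCBV histogram + age per patient (synthetic)
        y = rng.integers(0, 2, size=101)    # low- vs high-grade labels (synthetic)

        model = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
        acc = cross_val_score(model, X, y, cv=5).mean()
        print(f"cross-validated accuracy on synthetic data: {acc:.2f}")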

  9. Tariff-based analysis of commercial building electricityprices

    Energy Technology Data Exchange (ETDEWEB)

    Coughlin, Katie M.; Bolduc, Chris A.; Rosenquist, Greg J.; VanBuskirk, Robert D.; McMahon, James E.

    2008-03-28

    This paper presents the results of a survey and analysis of electricity tariffs and marginal electricity prices for commercial buildings. The tariff data come from a survey of 90 utilities and 250 tariffs for non-residential customers collected in 2004 as part of the Tariff Analysis Project at LBNL. The goals of this analysis are to provide useful summary data on the marginal electricity prices commercial customers actually see, and insight into the factors that are most important in determining prices under different circumstances. We provide a new, empirically-based definition of several marginal prices: the effective marginal price and energy-only and demand-only prices, and derive a simple formula that expresses the dependence of the effective marginal price on the marginal load factor. The latter is a variable that can be used to characterize the load impacts of a particular end-use or efficiency measure. We calculate all these prices for eleven regions within the continental U.S.

  10. Analysis Of Japans Economy Based On 2014 From Macroeconomics Prospects

    Directory of Open Access Journals (Sweden)

    Dr Mohammad Rafiqul Islam

    2015-02-01

    Full Text Available Japan is the world's third largest economy, but currently the economic situation of Japan is not stable and the economy is not growing as expected. Since 2013 it was the world's second largest economy, but Japan lost this place to China in 2014 due to the slow growth of important economic indicators. Using the basic Keynesian model, we provide a detailed analysis of the short- and long-run impacts of the changes on Japan's real GDP, rate of unemployment and inflation rate. We demonstrate a detailed use of the 45-degree diagram or AD-IA model and other economic analysis of the macroeconomic principles that underlie the model and concepts. Finally we recommend a change in fiscal policy to the government based on the analysis, by considering what might be achieved with a fiscal policy response and the extent to which any impact on the stock of public debt might be a consideration.

  11. Error Analysis of Robotic Assembly System Based on Screw Theory

    Institute of Scientific and Technical Information of China (English)

    韩卫军; 费燕琼; 赵锡芳

    2003-01-01

    Assembly errors have a great influence on assembly quality in robotic assembly systems. Error analysis addresses the propagation and accumulation of various errors and their effect on assembly success. Using screw coordinates, assembly errors are represented as an "error twist", an extremely compact expression. According to the law of screw composition, the relative position and orientation errors of mating parts are computed and the necessary condition for assembly success is derived. A new simple method for measuring assembly errors is also proposed, based on the transformation law of a screw. Because of the compact representation of error, the model presented for error analysis can be applied to various part-mating types and is especially useful for error analysis of complex assemblies.

  12. GOMA: functional enrichment analysis tool based on GO modules

    Institute of Scientific and Technical Information of China (English)

    Qiang Huang; Ling-Yun Wu; Yong Wang; Xiang-Sun Zhang

    2013-01-01

    Analyzing the function of gene sets is a critical step in interpreting the results of high-throughput experiments in systems biology. A variety of enrichment analysis tools have been developed in recent years, but most output a long list of significantly enriched terms that are often redundant, making it difficult to extract the most meaningful functions. In this paper, we present GOMA, a novel enrichment analysis method based on the new concept of enriched functional Gene Ontology (GO) modules. With this method, we systematically revealed functional GO modules, i.e., groups of functionally similar GO terms, via an optimization model and then ranked them by enrichment scores. Our new method simplifies enrichment analysis results by reducing redundancy, thereby preventing inconsistent enrichment results among functionally similar terms and providing more biologically meaningful results.
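
    GOMA's module construction is not reproduced here, but the per-term statistic such enrichment tools build on is a hypergeometric test, sketched below with made-up gene counts.

        from scipy.stats import hypergeom

        population = 20000     # genes in the background (assumed)
        term_genes = 150       # genes annotated with the GO term (assumed)
        study_genes = 300      # genes in the experiment hit list (assumed)
        overlap = 12           # hit-list genes carrying the term (assumed)

        # P(X >= overlap) when drawing study_genes genes without replacement
        p_value = hypergeom.sf(overlap - 1, population, term_genes, study_genes)
        print(f"enrichment p-value: {p_value:.3e}")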

  13. Fractal Analysis Based on Hierarchical Scaling in Complex Systems

    CERN Document Server

    Chen, Yanguang

    2016-01-01

    A fractal is in essence a hierarchy with cascade structure, which can be described with a set of exponential functions. From these exponential functions, a set of power laws indicative of scaling can be derived. Hierarchy structure and spatial network have proved to be associated with one another. This paper is devoted to exploring the theory of fractal analysis of complex systems by means of hierarchical scaling. Two research methods are utilized in this study: the logical analysis method and the empirical analysis method. The main results are as follows. First, a fractal system such as the Cantor set is described from the hierarchical angle of view; based on hierarchical structure, three approaches are proposed to estimate the fractal dimension. Second, the hierarchical scaling can be generalized to describe multifractals, fractal complementary sets, and self-similar curves such as the logarithmic spiral. Third, complex systems such as urban systems are demonstrated to be self-similar hierarchies. The human settlements i...
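
    As a small numeric companion to the hierarchical-scaling argument, the Python sketch below estimates the similarity dimension of the middle-thirds Cantor set from its level-by-level counts and sizes (two pieces per level, each one third the length).

        import numpy as np

        levels = np.arange(1, 9)
        counts = 2.0 ** levels            # number of segments at each hierarchy level
        sizes = (1.0 / 3.0) ** levels     # segment length at each level

        # slope of log(count) versus log(1/size) gives the similarity dimension
        D = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)[0]
        print(f"estimated dimension: {D:.4f} (exact ln2/ln3 = {np.log(2)/np.log(3):.4f})")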

  14. Error Analysis of English Writing Based on Interlanguage Theory

    Institute of Scientific and Technical Information of China (English)

    李玲

    2014-01-01

    The language learning process has always been haunted by learners' errors, which are an unavoidable phenomenon. In the 1950s and 1960s, Contrastive Analysis (CA), based on behaviorism and structuralism, was generally employed in analyzing learners' errors. CA soon lost its popularity. Error Analysis (EA), a branch of applied linguistics, has made great contributions to the study of second language learning and throws some light on the process of second language learning. Careful study of the errors reveals the common problems shared by language learners. Writing is important in the language learning process. In the Chinese context, English writing is always a difficult question for Chinese teachers and students, so errors in students' written works are unavoidable. In this thesis, the author studies errors in English writing with interlanguage theory as the theoretical guidance.

  15. Frailty phenotypes in the elderly based on cluster analysis

    DEFF Research Database (Denmark)

    Dato, Serena; Montesanto, Alberto; Lagani, Vincenzo

    2012-01-01

    genetic background on the frailty status is still questioned. We investigated the applicability of a cluster analysis approach based on specific geriatric parameters, previously set up and validated in a southern Italian population, to two large longitudinal Danish samples. In both cohorts, we identified...... groups of subjects homogeneous for their frailty status and characterized by different survival patterns. A subsequent survival analysis availing of Accelerated Failure Time models allowed us to formulate an operative index able to correlate classification variables with survival probability. From...... these models, we quantified the differential effect of various parameters on survival, and we estimated the heritability of the frailty phenotype by exploiting the twin pairs in our sample. These data suggest the presence of a genetic influence on the frailty variability and indicate that cluster analysis can...

  16. Computer Vision-Based Image Analysis of Bacteria.

    Science.gov (United States)

    Danielsen, Jonas; Nordenfelt, Pontus

    2017-01-01

    Microscopy is an essential tool for studying bacteria, but it is today mostly used in a qualitative or possibly semi-quantitative manner, often involving time-consuming manual analysis. It also makes it difficult to assess the importance of individual bacterial phenotypes, especially when there are only subtle differences in features such as shape, size, or signal intensity, which is typically very difficult for the human eye to discern. With computer vision-based image analysis - where computer algorithms interpret image data - it is possible to achieve an objective and reproducible quantification of images in an automated fashion. Besides being a much more efficient and consistent way to analyze images, this can also reveal important information that was previously hard to extract with traditional methods. Here, we present basic concepts of automated image processing, segmentation and analysis that can be relatively easily implemented for use in bacterial research.
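
    A generic sketch of the kind of pipeline described (threshold, segment, measure) is given below in Python with scikit-image; it uses a bundled sample picture rather than bacterial micrographs, and the area cut-off is an arbitrary assumption.

        from skimage import data, filters, measure

        image = data.coins()                        # stand-in grayscale image
        threshold = filters.threshold_otsu(image)   # automatic global threshold
        binary = image > threshold
        labels = measure.label(binary)              # connected-component segmentation

        props = measure.regionprops(labels)
        areas = [p.area for p in props if p.area > 50]
        print(f"{len(areas)} objects kept; mean area = {sum(areas) / len(areas):.1f} px")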

  18. INDIVIDUAL COMMUNICATION TRANSMITTER IDENTIFICATION BASED ON MULTIFRACTAL ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Ren Chunhui; Wei Ping; Lou Zhiyou; Xiao Xianci

    2005-01-01

    In this letter, communication transmitter transient signals are analyzed based on the time-variant hierarchy exponents of multifractal analysis. An optimized sample set is selected as the template for transmitter identification, so that individual communication transmitters can be identified. The turn-on signals of four transmitters are used in the simulation. The experimental results show that the multifractal character of transmitter transient signals is an effective feature for individual transmitter identification.

  19. Wavelet Variance Analysis of EEG Based on Window Function

    Institute of Scientific and Technical Information of China (English)

    ZHENG Yuan-zhuang; YOU Rong-yi

    2014-01-01

    A new wavelet variance analysis method based on a window function is proposed to investigate the dynamical features of the electroencephalogram (EEG). The experimental results show that the wavelet energy of epileptic EEGs is more discrete than that of normal EEGs, and that the variation of the wavelet variance differs between epileptic and normal EEGs as the time-window width increases. Furthermore, it is found that the wavelet subband entropy (WSE) of epileptic EEGs is lower than that of normal EEGs.
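
    The wavelet subband entropy (WSE) idea can be sketched on a synthetic signal as below; the wavelet family, decomposition level and toy signal are assumptions, not the paper's settings.

        import numpy as np
        import pywt

        rng = np.random.default_rng(0)
        t = np.linspace(0, 4, 1024)
        eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)   # toy "EEG"

        coeffs = pywt.wavedec(eeg, "db4", level=5)          # discrete wavelet decomposition
        energies = np.array([np.sum(c ** 2) for c in coeffs])
        p = energies / energies.sum()                        # relative subband energies
        wse = -np.sum(p * np.log(p))                         # wavelet subband entropy
        print(f"wavelet subband entropy: {wse:.3f}")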

  20. COMMERCIAL VIABILITY ANALYSIS OF LIGNIN BASED CARBON FIBRE

    OpenAIRE

    2014-01-01

    Lignin is a rich renewable source of aromatic compounds. As a potential petroleum feedstock replacement, lignin can reduce environmental impacts such as carbon emission. Due to its complex chemical structure, lignin is currently underutilized. Exploiting lignin as a precursor for carbon fibre adds high economic value to lignin and encourages further development in lignin extraction technology. This report includes a preliminary cost analysis and identifies the key aspects of lignin-based carbon fi...

  1. Study on Segmented Reflector Lamp Design Based on Error Analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper discusses the basic principle and design method for the light distribution of a car lamp, introduces an important development, a highly efficient and flexible car lamp with reflecting light distribution (the segmented reflector, or multi-patch, car lamp), and presents a design method for the segmented reflector based on error analysis. Unlike a classical car lamp with refractive light distribution, the method of reflecting light distribution gives car lamp design more flexibility. In the case of guarantying the li...

  2. Contributions to Physics-Based Aeroservoelastic Uncertainty Analysis

    Science.gov (United States)

    Wu, Sang

    The thesis presents the development of a new fully-integrated, MATLAB based simulation capability for aeroservoelastic (ASE) uncertainty analysis that accounts for uncertainties in all disciplines as well as discipline interactions. This new capability allows probabilistic studies of complex configuration at a scope and with depth not known before. Several statistical tools and methods have been integrated into the capability to guide the tasks such as parameter prioritization, uncertainty reduction, and risk mitigation. (Abstract shortened by ProQuest.).

  3. Performance Analysis of STFT Based Timing Approach to OFDM Systems

    Institute of Scientific and Technical Information of China (English)

    KUANG Yu-jun; TENG Yong; YIN Chang-chuan; HAO Jian-jun; YUE Guang-xin

    2003-01-01

    This paper focuses on the performance analysis of the previously proposed STFT-based 2-D timing approach for OFDM systems and presents simulation results on its performance in AWGN and multipath fading environments and on its robustness against the duration of the Channel Impulse Response (CIR) and frequency offset. Simulation results suggest that a revised version of the Short-Time Fourier Transform (STFT) can be used to greatly reduce computational complexity, especially at higher SNR.
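
    The paper's 2-D timing metric is not reproduced here; the Python sketch below only illustrates the underlying idea of using a short-time Fourier transform to locate where a burst starts in noise, with all parameters and the detection rule chosen for illustration.

        import numpy as np
        from scipy.signal import stft

        rng = np.random.default_rng(0)
        fs = 1_000_000
        signal = 0.05 * rng.standard_normal(4000)                                # noise floor
        signal[1500:3500] += np.cos(2 * np.pi * 50_000 * np.arange(2000) / fs)   # burst

        f, t, Z = stft(signal, fs=fs, nperseg=128)
        energy = np.abs(Z).sum(axis=0)                          # per-segment spectral energy
        start_idx = np.argmax(energy > 3 * energy[:5].mean())   # first segment above threshold
        print(f"estimated burst start: {t[start_idx] * 1e6:.0f} microseconds")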

  4. Comparative analysis of some brushless motors based on catalog data

    Directory of Open Access Journals (Sweden)

    Anton Kalapish

    2005-10-01

    Full Text Available Brushless motors (polyphase AC induction, synchronous and brushless DC motors) have no alternative in modern electric drives. They are highly efficient and cover a very wide range of speeds. The objective of this paper is to present some relations between the basic parameters and quantities of electrical machines. This allows a comparative analysis and a choice of motor to be made for each particular case, based not only on catalogue data or sale price.

  5. Dynamic network-based epistasis analysis: Boolean examples

    Directory of Open Access Journals (Sweden)

    Eugenio eAzpeitia

    2011-12-01

    Full Text Available In this review we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inferred topologies of gene interactions. This has been acknowledged in several previous papers and reviews, but here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson (herein, classical epistasis), it is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus. Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets become available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, access to relevant information and the correct gene interaction topologies is hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our review complements previous accounts, not
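
    To make the Boolean-network reading of classical epistasis concrete, the toy Python model below (with an invented wiring: gene C activated by A unless repressed by B) shows how simulated knockouts reproduce the classical blocking pattern.

        def step(state):
            a, b = state["A"], state["B"]
            return {"A": a, "B": b, "C": a and not b}     # update rule: C on iff A on and B off

        def steady_c(initial, knockouts=()):
            state = dict(initial)
            for gene in knockouts:
                state[gene] = False                       # force mutant gene off
            for _ in range(10):                           # iterate to a fixed point
                state = step(state)
                for gene in knockouts:
                    state[gene] = False
            return state["C"]

        wild = {"A": True, "B": True, "C": False}
        for mutants in [(), ("A",), ("B",), ("A", "B")]:
            print(f"knockout {mutants or 'none'}: C = {steady_c(wild, mutants)}")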

  6. Moon-Based INSAR Geolocation and Baseline Analysis

    Science.gov (United States)

    Liu, Guang; Ren, Yuanzhen; Ye, Hanlin; Guo, Huadong; Ding, Yixing; Ruan, Zhixing; Lv, Mingyang; Dou, Changyong; Chen, Zhaoning

    2016-07-01

    The characteristics of an Earth observation platform to a large extent determine its capability for Earth observation. Currently most platforms under development are satellites; in contrast, carrying out systematic observations with a Moon-based Earth observation platform is still a new concept. The Moon is Earth's only natural satellite and the only one humans have reached; it will give people different perspectives when observing the Earth with sensors placed on the Moon. Moon-based InSAR (SAR interferometry), one of the important Earth observation technologies, has all-day, all-weather observation ability, but its unique characteristics still need to be analyzed. This article discusses the key issues of geometric positioning and baseline parameters for Moon-based InSAR. Based on ephemeris data, the position, libration and attitude of the Earth and Moon are obtained, and the position of the Moon-based SAR sensor can be obtained by coordinate transformation from the fixed selenocentric coordinate system to the terrestrial coordinate system; together with the Distance-Doppler equation, the positioning model is analyzed. After establishing the Moon-based InSAR baseline equation, the different baseline errors are analyzed, and the influence of the Moon-based InSAR baseline on Earth observation applications is obtained.

  7. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature we conclude that GEOBIA is a new and evolving paradigm.

  8. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses, a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.

  9. Weather data analysis based on typical weather sequence analysis. Application: energy building simulation

    CERN Document Server

    David, Mathieu; Garde, Francois; Boyer, Harry

    2014-01-01

    In building studies dealing with energy efficiency and comfort, simulation software needs relevant weather files with optimal time steps. Few tools generate extreme and mean values of simultaneous hourly data including correlations between the climatic parameters. This paper presents the C++ Runeole software, based on typical weather sequence analysis. It runs an analysis process of a stochastic continuous multivariable phenomenon with frequency properties applied to a climatic database. The database analysis combines basic statistics, PCA (Principal Component Analysis) and automatic classifications. Different ways of applying these methods will be presented. All the results are stored in the Runeole internal database, which allows an easy selection of weather sequences. The extreme sequences are used for system and building sizing and the mean sequences are used for the determination of the annual cooling loads as proposed by Audrier-Cros (Audrier-Cros, 1984). This weather analysis was tested with the datab...

  10. A linear mixture analysis-based compression for hyperspectral image analysis

    Energy Technology Data Exchange (ETDEWEB)

    C. I. Chang; I. W. Ginsberg

    2000-06-30

    In this paper, the authors present a fully constrained least squares linear spectral mixture analysis-based compression technique for hyperspectral image analysis, particularly, target detection and classification. Unlike most compression techniques that directly deal with image gray levels, the proposed compression approach generates the abundance fractional images of potential targets present in an image scene and then encodes these fractional images so as to achieve data compression. Since the vital information used for image analysis is generally preserved and retained in the abundance fractional images, the loss of information may have very little impact on image analysis. In some occasions, it even improves analysis performance. Airborne visible infrared imaging spectrometer (AVIRIS) data experiments demonstrate that it can effectively detect and classify targets while achieving very high compression ratios.
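
    The compression step itself is not shown here, but the abundance-estimation core of fully constrained unmixing can be sketched as below; it enforces non-negativity with NNLS and approximates the sum-to-one constraint via the usual weighted row of ones, with random placeholder spectra.

        import numpy as np
        from scipy.optimize import nnls

        rng = np.random.default_rng(0)
        bands, n_endmembers = 50, 3
        E = rng.random((bands, n_endmembers))               # endmember signatures (columns)
        true_abund = np.array([0.6, 0.3, 0.1])
        pixel = E @ true_abund + 0.01 * rng.standard_normal(bands)

        delta = 1e3                                         # weight on the sum-to-one row
        E_aug = np.vstack([E, delta * np.ones((1, n_endmembers))])
        pixel_aug = np.append(pixel, delta)
        abund, _ = nnls(E_aug, pixel_aug)                   # non-negative least squares
        print("estimated abundances:", np.round(abund, 3))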

  11. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    DEFF Research Database (Denmark)

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna

    2013-01-01

    Biodiesel as a promising alternative energy resource has been a hot spot in chemical engineering nowadays, but there is also an argument about the sustainability of biodiesel. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various...... kinds of crop-based biodiesel including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options are studied by emergy analysis; soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. DEA method is used...... to evaluate the sustainability efficiencies of these options, and the biodiesel production systems based on soybean, sunflower, and palm are considered as DEA efficient, whereas rapeseed-based and jatropha-based scenarios are needed to be improved, and the improved methods have also been specified....

  12. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    Directory of Open Access Journals (Sweden)

    Jingzheng Ren

    2013-01-01

    Full Text Available Biodiesel as a promising alternative energy resource has been a hot spot in chemical engineering nowadays, but there is also an argument about the sustainability of biodiesel. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options are studied by emergy analysis; soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. DEA method is used to evaluate the sustainability efficiencies of these options, and the biodiesel production systems based on soybean, sunflower, and palm are considered as DEA efficient, whereas rapeseed-based and jatropha-based scenarios are needed to be improved, and the improved methods have also been specified.

  13. Emergy analysis and sustainability efficiency analysis of different crop-based biodiesel in life cycle perspective.

    Science.gov (United States)

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna; Fedele, Andrea; Scipioni, Antonio

    2013-01-01

    Biodiesel, as a promising alternative energy resource, has become a hot topic in chemical engineering, but its sustainability is also debated. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based production options, are studied by emergy analysis; the soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. The DEA method is used to evaluate the sustainability efficiencies of these options; the biodiesel production systems based on soybean, sunflower and palm are considered DEA-efficient, whereas the rapeseed-based and jatropha-based scenarios need to be improved, and the improvement measures have also been specified.
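
    The sustainability-efficiency step in the three records above relies on data envelopment analysis (DEA). The following sketch solves a generic input-oriented CCR envelopment model with a linear programming routine; the emergy-based input matrix X and output matrix Y are assumed to be given, and this is a textbook DEA formulation rather than the exact model of the paper.

        import numpy as np
        from scipy.optimize import linprog

        def dea_ccr_efficiency(X, Y, o):
            """Input-oriented CCR efficiency of decision-making unit `o`.

            X: (n_inputs, n_dmus), Y: (n_outputs, n_dmus). Returns theta in (0, 1];
            theta == 1 means the unit lies on the efficient frontier.
            """
            m, n = X.shape
            s = Y.shape[0]
            c = np.zeros(n + 1)
            c[0] = 1.0                               # minimise theta
            A_ub = np.zeros((m + s, n + 1))
            b_ub = np.zeros(m + s)
            A_ub[:m, 0] = -X[:, o]                   # sum_j lambda_j x_j <= theta * x_o
            A_ub[:m, 1:] = X
            A_ub[m:, 1:] = -Y                        # sum_j lambda_j y_j >= y_o
            b_ub[m:] = -Y[:, o]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
            return res.x[0]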

  14. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko eKozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on widespread association of signaling molecules has revealed essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior is mostly yet to be analyzed. Recent technological advances regarding mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on the phosphoproteome dynamics accelerate generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which have firstly been applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened up a gate for systems analysis of signaling networks involved in immune response and cancer.

  15. Morphometric analysis of the cranial base in Asians.

    Science.gov (United States)

    Chang, Hong-Po; Liu, Pao-Hsin; Tseng, Yu-Chuan; Yang, Yi-Hsin; Pan, Chin-Yun; Chou, Szu-Ting

    2014-01-01

    This study tested the hypothesis that developmental heterogeneity in cranial base morphology increases the prevalence of Class III malocclusion and mandibular prognathism in Asians. Thin-plate spline (TPS) graphical analysis of lateral cephalometric radiographs of the cranial base and the upper midface configuration were compared between a European-American group (24 females and 31 males) and four Asian ethnic groups (100 Chinese, 100 Japanese, 100 Korean and 100 Taiwanese; 50 females and 50 males per group) of young adults with clinically acceptable occlusion and facial profiles. Procrustes analysis was performed to identify statistically significant differences in each configuration of landmarks (P expansion in the anterior portion of the cranial base and upper midface region. The most posterior cranial base region also showed horizontal compression between the basion and Bolton point, with forward displacement of the articulare. Facial flatness and anterior displacement of the temporomandibular joint, resulting from a relative retrusion of the nasomaxillary complex and a relative forward position of the mandible were also noted. These features that tend to cause a prognathic mandible and/or retruded midface indicate a morphologic predisposition of Asian populations for Class III malocclusion.

  16. Direct DNA Analysis with Paper-Based Ion Concentration Polarization.

    Science.gov (United States)

    Gong, Max M; Nosrati, Reza; San Gabriel, Maria C; Zini, Armand; Sinton, David

    2015-11-01

    DNA analysis is essential for diagnosis and monitoring of many diseases. Conventional DNA testing is generally limited to the laboratory. Increasing access to relevant technologies can improve patient care and outcomes in both developed and developing regions. Here, we demonstrate direct DNA analysis in paper-based devices, uniquely enabled by ion concentration polarization at the interface of patterned nanoporous membranes in paper (paper-based ICP). Hepatitis B virus DNA targets in human serum are simultaneously preconcentrated, separated, and detected in a single 10 min operation. A limit of detection of 150 copies/mL is achieved without prior viral load amplification, sufficient for early diagnosis of hepatitis B. We clinically assess the DNA integrity of sperm cells in raw human semen samples. The percent DNA fragmentation results from the paper-based ICP devices strongly correlate (R(2) = 0.98) with the sperm chromatin structure assay. In all cases, agreement was 100% with respect to the clinical decision. Paper-based ICP can provide inexpensive and accessible advanced molecular diagnostics.

  17. Reliability analysis method for slope stability based on sample weight

    Directory of Open Access Journals (Sweden)

    Zhi-gang YANG

    2009-09-01

    Full Text Available The single safety factor criteria for slope stability evaluation, derived from the rigid limit equilibrium method or finite element method (FEM), may not include some important information, especially for steep slopes with complex geological conditions. This paper presents a new reliability method that uses sample weight analysis. Based on the distribution characteristics of random variables, the minimal sample size of every random variable is extracted according to a small sample t-distribution under a certain expected value, and the weight coefficient of each extracted sample is considered to be its contribution to the random variables. Then, the weight coefficients of the random sample combinations are determined using the Bayes formula, and different sample combinations are taken as the input for slope stability analysis. According to one-to-one mapping between the input sample combination and the output safety coefficient, the reliability index of slope stability can be obtained with the multiplication principle. Slope stability analysis of the left bank of the Baihetan Project is used as an example, and the analysis results show that the present method is reasonable and practicable for the reliability analysis of steep slopes with complex geological conditions.

  18. Voxel-based texture analysis of the brain.

    Science.gov (United States)

    Maani, Rouzbeh; Yang, Yee Hong; Kalra, Sanjay

    2015-01-01

    This paper presents a novel voxel-based method for texture analysis of brain images. Texture analysis is a powerful quantitative approach for analyzing voxel intensities and their interrelationships, but has been thus far limited to analyzing regions of interest. The proposed method provides a 3D statistical map comparing texture features on a voxel-by-voxel basis. The validity of the method was examined on artificially generated effects as well as on real MRI data in Alzheimer's Disease (AD). The artificially generated effects included hyperintense and hypointense signals added to T1-weighted brain MRIs from 30 healthy subjects. The AD dataset included 30 patients with AD and 30 age/sex matched healthy control subjects. The proposed method detected artificial effects with high accuracy and revealed statistically significant differences between the AD and control groups. This paper extends the usage of texture analysis beyond the current region of interest analysis to voxel-by-voxel 3D statistical mapping and provides a hypothesis-free analysis tool to study cerebral pathology in neurological diseases.
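
    A toy illustration of the voxel-by-voxel idea: a simple texture feature (here local variance in a cubic neighbourhood, standing in for the paper's texture features) is computed for every voxel of co-registered volumes, and a 3D statistical map is built with a two-sample t-test at each voxel. The function names and the choice of feature are assumptions for illustration only.

        import numpy as np
        from scipy.ndimage import uniform_filter
        from scipy.stats import ttest_ind

        def local_variance(volume, size=5):
            """Per-voxel texture feature: variance in a size**3 neighbourhood."""
            v = volume.astype(float)
            mean = uniform_filter(v, size)
            mean_sq = uniform_filter(v ** 2, size)
            return mean_sq - mean ** 2

        def voxelwise_group_test(group_a, group_b, size=5):
            """group_a, group_b: lists of co-registered 3D volumes.

            Returns voxel-wise t and p maps for the texture feature.
            """
            feats_a = np.stack([local_variance(v, size) for v in group_a])
            feats_b = np.stack([local_variance(v, size) for v in group_b])
            t_map, p_map = ttest_ind(feats_a, feats_b, axis=0)
            return t_map, p_map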

  19. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    Science.gov (United States)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  20. Forming Teams for Teaching Programming based on Static Code Analysis

    Directory of Open Access Journals (Sweden)

    Davis Arosemena-Trejos

    2012-03-01

    Full Text Available The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these groups are formed without taking certain aspects into account, they may have an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). These semantics are based on metrics extracted from the preferences, styles and good programming practices of each student. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, and the tool evaluates the best formation of teams for a given course. The team formations are based on programming styles, skills, pair programming or teams with a leader.

  1. Climate policy decisions require policy-based lifecycle analysis.

    Science.gov (United States)

    Bento, Antonio M; Klotz, Richard

    2014-05-20

    Lifecycle analysis (LCA) metrics of greenhouse gas emissions are increasingly being used to select technologies supported by climate policy. However, LCAs typically evaluate the emissions associated with a technology or product, not the impacts of policies. Here, we show that policies supporting the same technology can lead to dramatically different emissions impacts per unit of technology added, due to multimarket responses to the policy. Using a policy-based consequential LCA, we find that the lifecycle emissions impacts of four US biofuel policies range from a reduction of 16.1 gCO2e to an increase of 24.0 gCO2e per MJ corn ethanol added by the policy. The differences between these results and representative technology-based LCA measures, which do not account for the policy instrument driving the expansion in the technology, illustrate the need for policy-based LCA measures when informing policy decision making.

  2. Forming Teams for Teaching Programming based on Static Code Analysis

    CERN Document Server

    Arosemena-Trejos, Davis; Clunie, Clifton

    2012-01-01

    The use of teams for teaching programming can be effective in the classroom because it helps students to generate and acquire new knowledge in less time, but if these groups are formed without taking certain aspects into account, they may have an adverse effect on the teaching-learning process. This paper proposes a tool for the formation of teams based on the semantics of source code (SOFORG). These semantics are based on metrics extracted from the preferences, styles and good programming practices of each student. All this is achieved through a static analysis of the code that each student develops. In this way, a record of students with the extracted information is available, and the tool evaluates the best formation of teams for a given course. The team formations are based on programming styles, skills, pair programming or teams with a leader.

  3. Least Dependent Component Analysis Based on Mutual Information

    CERN Document Server

    Stögbauer, H; Astakhov, S A; Grassberger, P; St\\"ogbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter

    2004-01-01

    We propose to use precise estimators of mutual information (MI) to find least dependent components in a linearly mixed signal. On the one hand this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand it has the advantage, compared to other implementations of `independent' component analysis (ICA) some of which are based on crude approximations for MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based ...
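
    One of the uses listed above, estimating residual dependencies between output components, can be sketched with an off-the-shelf k-nearest-neighbour MI estimator. The sketch below uses scikit-learn's estimator as a stand-in for the Kraskov-style estimator described in the abstract; the array layout and the pairwise-matrix summary are assumptions.

        import numpy as np
        from sklearn.feature_selection import mutual_info_regression

        def pairwise_mi(components, n_neighbors=3):
            """Pairwise kNN-based MI between separated components.

            components: (n_samples, n_components). Small off-diagonal values
            indicate nearly independent (least dependent) outputs.
            """
            n = components.shape[1]
            mi = np.zeros((n, n))
            for i in range(n):
                for j in range(n):
                    if i != j:
                        mi[i, j] = mutual_info_regression(
                            components[:, [i]], components[:, j],
                            n_neighbors=n_neighbors)[0]
            return mi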

  4. Statistical mechanics of light elements at high pressure. IV - A model free energy for the metallic phase. [for Jovian type planet interiors

    Science.gov (United States)

    Dewitt, H. E.; Hubbard, W. B.

    1976-01-01

    A large quantity of data on the thermodynamic properties of hydrogen-helium metallic liquids have been obtained in extended computer calculations in which a Monte Carlo code essentially identical to that described by Hubbard (1972) was used. A model free energy for metallic hydrogen with a relatively small mass fraction of helium is discussed, taking into account the definition of variables, a procedure for choosing the free energy, values for the fitting parameters, and the evaluation of the entropy constants. Possibilities concerning a use of the obtained data in studies of the interiors of the outer planets are briefly considered.

  5. Model-Based Evaluation Of System Scalability: Bandwidth Analysis For Smartphone-Based Biosensing Applications

    DEFF Research Database (Denmark)

    Patou, François; Madsen, Jan; Dimaki, Maria

    2016-01-01

    -engineering efforts for scaling a system specification efficaciously. We demonstrate the value of our methodology by investigating a smartphone-based biosensing instrumentation platform. Specifically, we carry out scalability analysis for the system’s bandwidth specification: the maximum analog voltage waveform...

  6. Average Case Analysis of Some Elimination-Based Data-Flow Analysis Algorithms

    OpenAIRE

    2008-01-01

    The average case of some elimination-based data-flow analysis algorithms is analyzed in a mathematical way. Besides allowing a comparison of the timing behavior of the algorithms, this also provides insight into how relevant the underlying statistics are when compared to practical settings.

  7. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel-based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs as well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow.
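
    The rate-of-change measurement described above amounts to fitting straight lines over chosen time windows of the Ca record. The sketch below reproduces that step outside Excel; the window boundaries and variable names are illustrative assumptions.

        import numpy as np
        from scipy.stats import linregress

        def rate_of_change(time, ca, t_start, t_end):
            """Slope of [Ca] against time over one window, by linear regression."""
            mask = (time >= t_start) & (time <= t_end)
            fit = linregress(time[mask], ca[mask])
            return fit.slope, fit.rvalue ** 2

        def rates_over_windows(time, ca, windows):
            """Multiple simultaneous regressions over a list of (start, end) windows."""
            return [rate_of_change(time, ca, t0, t1) for t0, t1 in windows]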

  8. Echo-waveform classification using model and model free techniques: Experimental study results from central western continental shelf of India

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.; Navelkar, G.S.; Desai, R.G.P.; Janakiraman, G.; Mahale, V.; Fernandes, W.A.; Rao, N.

    seafloor of India, but unable to provide a suitable means for seafloor classification. This paper also suggests a hybrid artificial neural network (ANN) architecture i.e. Learning Vector Quantisation (LVQ) for seafloor classification. An analysis...

  9. Techno-Economic Analysis of Biofuels Production Based on Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Swanson, R. M.; Platon, A.; Satrio, J. A.; Brown, R. C.; Hsu, D. D.

    2010-11-01

    This study compares capital and production costs of two biomass-to-liquid production plants based on gasification. The first biorefinery scenario is an oxygen-fed, low-temperature (870 °C), non-slagging, fluidized bed gasifier. The second scenario is an oxygen-fed, high-temperature (1,300 °C), slagging, entrained flow gasifier. Both are followed by catalytic Fischer-Tropsch synthesis and hydroprocessing to naphtha-range (gasoline blend stock) and distillate-range (diesel blend stock) liquid fractions. Process modeling software (Aspen Plus) is utilized to organize the mass and energy streams and cost estimation software is used to generate equipment costs. Economic analysis is performed to estimate the capital investment and operating costs. Results show that the total capital investment required for nth plant scenarios is $610 million and $500 million for high-temperature and low-temperature scenarios, respectively. Product value (PV) for the high-temperature and low-temperature scenarios is estimated to be $4.30 and $4.80 per gallon of gasoline equivalent (GGE), respectively, based on a feedstock cost of $75 per dry short ton. Sensitivity analysis is also performed on process and economic parameters. This analysis shows that total capital investment and feedstock cost are among the most influential parameters affecting the PV.

  10. Analysis of Host-Based and Network-Based Intrusion Detection System

    Directory of Open Access Journals (Sweden)

    Amrit Pal Singh

    2014-07-01

    Full Text Available Intrusion-detection systems (IDS) aim at detecting attacks against computer systems and networks or, in general, against information systems. Their basic aim is to protect a system against malware and unauthorized access to a network or a system. Intrusion detection is of two types: network-based IDS and host-based IDS. This paper covers the scope of both types and their result analysis, along with their comparison. OSSEC (HIDS) is a free, open-source host-based intrusion detection system. It performs log analysis, integrity checking, Windows registry monitoring, rootkit detection, time-based alerting and active response. Snort (NIDS) is a lightweight intrusion detection system that can log packets coming across a network and can alert the user regarding any attack. Both are efficient in their own distinct fields.

  11. Finite element analysis of osteoporosis models based on synchrotron radiation

    Science.gov (United States)

    Xu, W.; Xu, J.; Zhao, J.; Sun, J.

    2016-04-01

    With the growing pressure of social aging, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently, synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study the characteristic changes in different stages of primary osteoporosis, this research focused on the different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the trabecular structure in osteoporosis degraded as the bone volume decreased. For femurs, the bone volume fraction (bone volume / total volume, BV/TV) decreased from 69% to 43%. This led to an increase in trabecular separation (from 45.05 μm to 97.09 μm) and a reduction in trabecular number (from 7.99 mm⁻¹ to 5.97 mm⁻¹). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' resistance to compression, bending and torsion gradually became weaker. The compression stiffness of the femurs decreased from 1770.96 F·μm⁻¹ to 697.41 F·μm⁻¹, and the bending and torsion stiffness decreased from 1390.80 F·μm⁻¹ to 566.11 F·μm⁻¹ and from 2957.28 N·m/° to 691.31 N·m/° respectively, indicating the decrease of bone strength and matching the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength.

  12. Principle-based concept analysis: Caring in nursing education

    Science.gov (United States)

    Salehian, Maryam; Heydari, Abbas; Aghebati, Nahid; Moonaghi, Hossein Karimi; Mazloom, Seyed Reza

    2016-01-01

    Introduction: The aim of this principle-based concept analysis was to analyze caring in nursing education and to explain the current state of the science based on epistemologic, pragmatic, linguistic, and logical philosophical principles. Methods: A principle-based concept analysis method was used to analyze the nursing literature. The dataset included 46 English language studies, published from 2005 to 2014, and they were retrieved through PROQUEST, MEDLINE, CINAHL, ERIC, SCOPUS, and SID scientific databases. The key dimensions of the data were collected using a validated data-extraction sheet. The four principles of assessing pragmatic utility were used to analyze the data. The data were managed by using MAXQDA 10 software. Results: The scientific literature that deals with caring in nursing education relies on implied meaning. Caring in nursing education refers to student-teacher interactions that are formed on the basis of human values and focused on the unique needs of the students (epistemological principle). The result of student-teacher interactions is the development of both the students and the teachers. Numerous applications of the concept of caring in nursing education are available in the literature (pragmatic principle). There is consistency in the meaning of the concept, as a central value of the faculty-student interaction (linguistic principle). Compared with other related concepts, such as “caring pedagogy,” “value-based education,” and “teaching excellence,” caring in nursing education does not have exact and clear conceptual boundaries (logic principle). Conclusion: Caring in nursing education was identified as an approach to teaching and learning, and it is formed based on teacher-student interactions and sustainable human values. A greater understanding of the conceptual basis of caring in nursing education will improve the caring behaviors of teachers, create teaching-learning environments, and help experts in curriculum development

  13. Orbital Energy-Based Reaction Analysis of SN2 Reactions

    Directory of Open Access Journals (Sweden)

    Takao Tsuneda

    2016-07-01

    Full Text Available An orbital energy-based reaction analysis theory is presented as an extension of the orbital-based conceptual density functional theory. In the orbital energy-based theory, the orbitals contributing to reactions are interpreted to be valence orbitals giving the largest orbital energy variation from reactants to products. Reactions are taken to be electron transfer-driven when they provide small variations for the gaps between the contributing occupied and unoccupied orbital energies on the intrinsic reaction coordinates in the initial processes. The orbital energy-based theory is then applied to the calculations of several SN2 reactions. Using a reaction path search method, the Cl− + CH3I → ClCH3 + I− reaction, for which another reaction path called “roundabout path” is proposed, is found to have a precursor process similar to the roundabout path just before this SN2 reaction process. The orbital energy-based theory indicates that this precursor process is obviously driven by structural change, while the successor SN2 reaction proceeds through electron transfer between the contributing orbitals. Comparing the calculated results of the SN2 reactions in gas phase and in aqueous solution shows that the contributing orbitals significantly depend on solvent effects and these orbitals can be correctly determined by this theory.

  14. Relational Analysis based Concurrent Multipath Transfer over Heterogeneous Vehicular Networks

    Directory of Open Access Journals (Sweden)

    Hongke Zhang

    2012-09-01

    Full Text Available In recent years, the growing interest in the Intelligent Transportation Systems (ITS has resulted in variety of peer-reviewed publications. Significant results in this area have enabled many civilian and industry applications. As more and more vehicles are equipped with multiple network interfaces, how to efficient utilize the coexistence of Radio Access Technologies (RAT such as WiFi, UMTS and WiMAX to serve a best Concurrent Multipath Transfer (CMT service is still a challenge in ITS. In this paper, we propose GRA-CMT, a novel Grey Relational Analysis (GRA based Concurrent Multipath Transfer, extension for Stream Control Transport Protocol (SCTP. Depending on the advantages of GRA, a GRA-based Data Distribution algorithm is proposed in GRA-CMT to calculate the Grey Relational Coefficient (GRC value of all candidate paths and offer a more efficient data scheduling algorithm, a further proposed GRA-based CMT Retransmission algorithm devotes to select destination for efficient retransmission. Moreover, the GRA-CMT provides a GRA-based CMT Path Selection scheme to manage candidate paths. Sufficient simulation results obtained by a close realistic simulation topology show how GRA-CMT outperforms existing CMT in heterogeneous SCTP-based vehicular networks.
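
    The core of the GRA-based scheduling described above is the grey relational coefficient of each candidate path against a reference (ideal) path. A minimal sketch, assuming the path metrics have already been normalised to [0, 1] and that the grade is the plain mean of the per-metric coefficients:

        import numpy as np

        def grey_relational_grade(reference, candidates, zeta=0.5):
            """Grey relational grade of each candidate path.

            reference: (n_metrics,) ideal metric vector (e.g. bandwidth, delay, loss),
            candidates: (n_paths, n_metrics); zeta is the distinguishing coefficient.
            """
            delta = np.abs(candidates - reference)          # absolute differences
            d_min, d_max = delta.min(), delta.max()
            grc = (d_min + zeta * d_max) / (delta + zeta * d_max)
            return grc.mean(axis=1)                         # one grade per path

        # Paths with a higher grade would be preferred for data distribution
        # and retransmission decisions.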

  15. Weight measurement using image-based pose analysis

    Institute of Scientific and Technical Information of China (English)

    Hong Zhang; Kui Zhang; Ying Mu; Ning Yao; Robert J. Sclabassi; Mingui Sun

    2008-01-01

    Image-based gait analysis as a means of biometric identification has attracted much research attention. Most of the existing methods focus on human identification, posture analysis and movement tracking. There have been few investigations on measuring the carried load based on the carrier's gait characteristics by automatic image processing. Nevertheless, this measurement is very useful in a number of applications, such as the study of the effect of the carried load on the postural development of children and adolescents. In this paper, we investigate how to automatically estimate the carried weight from a sequence of images. We present a method to extract the human gait silhouette based on the observation that humans tend to minimize energy during motion. We compute several angles of body leaning and determine the relationship of the carried weight, the leaning angles and the centroid location according to a human kinetic study. Our weight determination method has been verified successfully by experiments.

  16. ALGORITHMS FOR TENNIS RACKET ANALYSIS BASED ON MOTION DATA

    Directory of Open Access Journals (Sweden)

    Maria Skublewska-Paszkowska

    2016-09-01

    Full Text Available Modern technologies, such as motion capture systems (both optical and markerless), are more and more frequently used for athlete performance analysis due to their great precision. Optical systems based on retro-reflective markers allow for tracking the motion of multiple objects of various types. These systems compute human kinetic and kinematic parameters based on biomechanical models. Tracking additional objects like a tennis racket is also a very important aspect for analysing the player's technique and precision. The motion data gathered by motion capture systems may be used for analysing various aspects that may not be recognised by the human eye or a video camera. This paper presents algorithms for the analysis of tennis racket motion during two of the most important tennis strokes: forehand and backhand. An optical Vicon system was used for obtaining the motion data, which was the input for the algorithms. They compute the velocity of the tennis racket's head and handle, based on the trajectories of attached markers, as well as the racket's orientation. The algorithms were implemented and tested on data obtained from a professional trainer who participated in the research and performed a series of ten strikes, separately for: (1) forehand without a ball, (2) backhand without a ball, (3) forehand with a ball and (4) backhand with a ball. The computed parameters are gathered in tables and visualised in a graph.
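
    The velocity and orientation quantities mentioned above follow directly from the marker trajectories delivered by the optical system. A minimal sketch, assuming trajectories are given as (n_frames, 3) arrays in metres and the capture rate is known:

        import numpy as np

        def marker_speed(positions, fs):
            """Speed of one marker (e.g. racket head) from its 3D trajectory.

            positions: (n_frames, 3) in metres, fs: capture rate in Hz.
            """
            velocity = np.gradient(positions, 1.0 / fs, axis=0)   # central differences
            return np.linalg.norm(velocity, axis=1)

        def racket_axis(head_positions, handle_positions):
            """Unit vector from handle to head as a simple orientation estimate."""
            axis = head_positions - handle_positions
            return axis / np.linalg.norm(axis, axis=1, keepdims=True)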

  17. Medical diagnostics by laser-based analysis of exhaled breath

    Science.gov (United States)

    Giubileo, Gianfranco

    2002-08-01

    Many trace gases can be found in exhaled breath, some of them allowing a non-invasive diagnosis of related diseases or the monitoring of a disease in the course of its therapy. In the present lecture the principle of medical diagnosis based on breath analysis will be introduced, and the detection of trace gases in exhaled breath by high-resolution molecular spectroscopy in the IR spectral region will be discussed. A number of substrates and the optical systems for their laser detection will be reported. The following laser-based experimental systems have been realised in the Molecular Spectroscopy Laboratory at ENEA in Frascati for the analysis of specific substances in exhaled breath: a tuneable diode laser absorption spectroscopy (TDLAS) apparatus for the measurement of the 13C/12C isotopic ratio in carbon dioxide, a TDLAS apparatus for the detection of CH4, and a CO2 laser based photoacoustic system to detect trace ethylene at atmospheric pressure. The experimental set-up for each of these optical systems will be shown and the related medical applications will be illustrated. The concluding remarks will focus on chemical species that are of major interest to the medical community today and their diagnostic ability.

  18. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems: critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  19. SMV model-based safety analysis of software requirements

    Energy Technology Data Exchange (ETDEWEB)

    Koh, Kwang Yong [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 373-1, Guseong-dong, Yuseong-gu, Daejeon 305-701 (Korea, Republic of)], E-mail: phseong@kaist.ac.kr

    2009-02-15

    Fault tree analysis (FTA) is one of the most frequently applied safety analysis techniques when developing safety-critical industrial systems such as software-based emergency shutdown systems of nuclear power plants and has been used for safety analysis of software requirements in the nuclear industry. However, the conventional method for safety analysis of software requirements has several problems in terms of correctness and efficiency; the fault tree generated from natural language specifications may contain flaws or errors while the manual work of safety verification is very labor-intensive and time-consuming. In this paper, we propose a new approach to resolve problems of the conventional method; we generate a fault tree from a symbolic model verifier (SMV) model, not from natural language specifications, and verify safety properties automatically, not manually, by a model checker SMV. To demonstrate the feasibility of this approach, we applied it to shutdown system 2 (SDS2) of Wolsong nuclear power plant (NPP). In spite of subtle ambiguities present in the approach, the results of this case study demonstrate its overall feasibility and effectiveness.

  20. Psychoacoustic Music Analysis Based on the Discrete Wavelet Packet Transform

    Directory of Open Access Journals (Sweden)

    Xing He

    2008-01-01

    Full Text Available Psychoacoustical computational models are necessary for the perceptual processing of acoustic signals and have contributed significantly in the development of highly efficient audio analysis and coding. In this paper, we present an approach for the psychoacoustic analysis of musical signals based on the discrete wavelet packet transform. The proposed method mimics the multiresolution properties of the human ear closer than other techniques and it includes simultaneous and temporal auditory masking. Experimental results show that this method provides better masking capabilities and it reduces the signal-to-masking ratio substantially more than other approaches, without introducing audible distortion. This model can lead to greater audio compression by permitting further bit rate reduction and more secure watermarking by providing greater signal space for information hiding.
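
    The first step of such an analysis, decomposing the signal into frequency bands with a discrete wavelet packet transform and measuring the energy in each band, can be sketched as follows; the wavelet family, the decomposition depth and the use of band energies as the masking-model input are assumptions, not the paper's exact parameters.

        import numpy as np
        import pywt

        def band_energies(signal, wavelet="db8", level=5):
            """Energy per frequency band of a discrete wavelet packet decomposition."""
            wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
            leaves = wp.get_level(level, order="freq")   # leaf nodes ordered by frequency
            return np.array([np.sum(node.data ** 2) for node in leaves])

        # A masking threshold per band could then be derived from these energies
        # together with the spreading and absolute-threshold rules of a
        # psychoacoustic model.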

  1. Ground extraction from airborne laser data based on wavelet analysis

    Science.gov (United States)

    Xu, Liang; Yang, Yan; Jiang, Bowen; Li, Jia

    2007-11-01

    With the advantages of high resolution and accuracy, airborne laser scanning data are widely used in topographic mapping. In order to generate a DTM, measurements from object features such as buildings, vehicles and vegetation have to be classified and removed. However, the automatic extraction of bare earth from point clouds acquired by airborne laser scanning equipment remains a problem in LIDAR data filtering nowadays. In this paper, a filter algorithm based on wavelet analysis is proposed. Relying on the capability of detecting discontinuities of continuous wavelet transform and the feature of multi-resolution analysis, the object points can be removed, while ground data are preserved. In order to evaluate the performance of this approach, we applied it to the data set used in the ISPRS filter test in 2003. 15 samples have been tested by the proposed approach. Results showed that it filtered most of the objects like vegetation and buildings, and extracted a well defined ground model.

  2. Carbon nanotube based VLSI interconnects analysis and design

    CERN Document Server

    Kaushik, Brajesh Kumar

    2015-01-01

    The brief primarily focuses on the performance analysis of CNT based interconnects in current research scenario. Different CNT structures are modeled on the basis of transmission line theory. Performance comparison for different CNT structures illustrates that CNTs are more promising than Cu or other materials used in global VLSI interconnects. The brief is organized into five chapters which mainly discuss: (1) an overview of current research scenario and basics of interconnects; (2) unique crystal structures and the basics of physical properties of CNTs, and the production, purification and applications of CNTs; (3) a brief technical review, the geometry and equivalent RLC parameters for different single and bundled CNT structures; (4) a comparative analysis of crosstalk and delay for different single and bundled CNT structures; and (5) various unique mixed CNT bundle structures and their equivalent electrical models.

  3. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is not only related to socio-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and in the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid onto those poor areas in order to capture the internal diversity of their socio-economic characteristics. By overlaying poverty-related socio-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  4. Multiwave velocity analysis based on Gaussian beam prestack depth migration

    Institute of Scientific and Technical Information of China (English)

    Han Jian-Guang; Wang Yun; Han Ning; Xing Zhan-Tao; Lu Jun

    2014-01-01

    Prestack depth migration of multicomponent seismic data improves the imaging accuracy of subsurface complex geological structures. An accurate velocity field is critical to accurate imaging. Gaussian beam migration was used to perform multicomponent migration velocity analysis of PP- and PS-waves. First, PP- and PS-wave Gaussian beam prestack depth migration algorithms that operate on common-offset gathers are presented to extract offset-domain common-image gathers of PP- and PS-waves. Second, based on the residual moveout equation, the migration velocity fields of P- and S-waves are updated. Depth matching is used to ensure that the depths of the target layers in the PP- and PS-wave migration profiles are consistent, and high-precision P- and S-wave velocities are obtained. Finally, synthetic and field seismic data suggest that the method can be used effectively in multiwave migration velocity analysis.

  5. Bayesian Model Selection with Network Based Diffusion Analysis.

    Science.gov (United States)

    Whalen, Andrew; Hoppitt, William J E

    2016-01-01

    A number of recent studies have used Network Based Diffusion Analysis (NBDA) to detect the role of social transmission in the spread of a novel behavior through a population. In this paper we present a unified framework for performing NBDA in a Bayesian setting, and demonstrate how the Watanabe Akaike Information Criteria (WAIC) can be used for model selection. We present a specific example of applying this method to Time to Acquisition Diffusion Analysis (TADA). To examine the robustness of this technique, we performed a large scale simulation study and found that NBDA using WAIC could recover the correct model of social transmission under a wide range of cases, including under the presence of random effects, individual level variables, and alternative models of social transmission. This work suggests that NBDA is an effective and widely applicable tool for uncovering whether social transmission underpins the spread of a novel behavior, and may still provide accurate results even when key model assumptions are relaxed.
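
    The model-selection criterion used above, WAIC, is computed from the pointwise log-likelihoods of posterior draws. A minimal sketch, assuming a (n_draws, n_observations) matrix of log-likelihood values from the fitted NBDA models:

        import numpy as np
        from scipy.special import logsumexp

        def waic(log_lik):
            """WAIC from a (n_draws, n_obs) matrix of pointwise log-likelihoods."""
            n_draws = log_lik.shape[0]
            # log pointwise predictive density
            lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(n_draws))
            # effective number of parameters (variance form)
            p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
            return -2.0 * (lppd - p_waic)

        # Lower WAIC values favour a model; differences are compared across the
        # candidate models of social transmission.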

  6. Dynamic analysis of granite rockburst based on the PIV technique

    Institute of Scientific and Technical Information of China (English)

    Wang Hongjian; Liu Da’an; Gong Weili; Li Liyun

    2015-01-01

    This paper describes a deep rockburst simulation system used to reproduce the granite instantaneous rockburst process. Based on the PIV (Particle Image Velocimetry) technique, a quantitative analysis of a rockburst can be performed: images of tracer particles, displacement and strain fields can be obtained, and the debris trajectory described. According to the observation of on-site tests, the dynamic rockburst is actually a gas-solid high-speed flow process, which is caused by the interaction of rock fragments and the surrounding air. With the help of the analysis of high-speed video and PIV images, the granite rockburst failure process is found to be composed of six stages of platey fragment spalling and debris ejection. Meanwhile, the elastic energy for these six stages has been calculated to study the energy variation. The results indicate that the rockburst process can be summarized as an initiating stage, an intensive developing stage and a gradual decay stage. This research will be helpful for further understanding of the rockburst mechanism.

  7. Analysis of structural macroeconomic indicators based on harmony approach

    Directory of Open Access Journals (Sweden)

    Knyshenko, Tetyana

    2012-05-01

    Full Text Available In this article, the application of fractal theory and the proportions of the "golden section" to the analysis of macroeconomic indicators is considered, and criteria for the optimality and efficiency of the structure of an economy are selected. An economy is represented as a treelike fractal, every level of which is characterized by a potential for income (output) and a need for expenditure. Two types of structural subsystems in an economy are distinguished: permanent and temporary; the latter arise when the system's optimum is violated. The author concludes that for certain types of economic activity structural subsystems can be effective, but they cannot be called optimal, as there are deviations from the golden section for the given type of technological mode in output, intermediate consumption and GVP. A conclusion about the type of technological mode is drawn based on an analysis of the contribution of spheres and types of economic activity to the formation of macroeconomic indicators.

  8. Analysis and design of a smart card based authentication protocol

    Institute of Scientific and Technical Information of China (English)

    Kuo-Hui YEH; Kuo-Yu TSAI; Jia-Li HOU

    2013-01-01

    Numerous smart card based authentication protocols have been proposed to provide strong system security and robust individual privacy for communication between parties. Nevertheless, most of them do not provide a formal analysis proof, and their security robustness is doubtful. Chang and Cheng (2011) proposed an efficient remote authentication protocol with smart cards and claimed that their proposed protocol could support secure communication in a multi-server environment. Unfortunately, there are opportunities for security enhancement in current schemes. In this paper, we identify the major weakness, i.e., session key disclosure, of a recently published protocol. We consequently propose a novel authentication scheme for a multi-server environment and give formal analysis proofs for its security guarantees.

  9. Seismic Base Isolation Analysis for PASCAR Liquid Metal Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Kuk Hee; Yoo, Bong; Kim, Yun Jae [Korea Univ., Seoul (Korea, Republic of)

    2008-10-15

    This paper presents a study for developing a seismic isolation system for the PASCAR (Proliferation resistant, Accident-tolerant, Self-supported, Capsular and Assured Reactor) liquid metal reactor design. PASCAR uses lead-bismuth eutectic (LBE) as coolant. Because the density (10,000 kg/m³) of the LBE coolant is much higher than that of sodium coolant or water, this presents a challenge to the designers of the seismic isolation systems that will be used with these heavy liquid metal reactors. Finite element analysis is used to determine the characteristics of the isolator device. Results are presented from a study on the application of three-dimensional seismic isolation devices to the full-scale reactor. The seismic analysis responses of the two-dimensional and three-dimensional isolation systems for the PASCAR are compared with that of the conventional fixed-base system.

  10. Kernel-based fisher discriminant analysis for hyperspectral target detection

    Institute of Scientific and Technical Information of China (English)

    GU Yan-feng; ZHANG Ye; YOU Di

    2007-01-01

    A new method based on kernel Fisher discriminant analysis (KFDA) is proposed for target detection of hyperspectral images. The KFDA combines kernel mapping derived from support vector machine and the classical linear Fisher discriminant analysis (LFDA), and it possesses good ability to process nonlinear data such as hyperspectral images. According to the Fisher rule that the ratio of the between-class and within-class scatters is maximized, the KFDA is used to obtain a set of optimal discriminant basis vectors in high dimensional feature space. All pixels in the hyperspectral images are projected onto the discriminant basis vectors and the target detection is performed according to the projection result. The numerical experiments are performed on hyperspectral data with 126 bands collected by Airborne Visible/Infrared Imaging Spectrometer (AVIRIS). The experimental results show the effectiveness of the proposed detection method and prove that this method has good ability to overcome small sample size and spectral variability in the hyperspectral target detection.
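
    A compact two-class kernel Fisher discriminant, in the spirit of the method described above, can be written directly from the kernelised Fisher criterion. The RBF kernel, the regularisation constant and the binary labelling of target versus background pixels are assumptions for illustration:

        import numpy as np

        def rbf_kernel(A, B, gamma=1.0):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        def kfda_train(X, y, gamma=1.0, reg=1e-3):
            """Two-class kernel Fisher discriminant (target vs. background pixels).

            X: (n_samples, n_bands) spectra, y: binary labels. Returns the dual
            coefficients alpha used to project new pixels.
            """
            y = np.asarray(y)
            K = rbf_kernel(X, X, gamma)
            n = len(y)
            class_means, N_within = [], np.zeros((n, n))
            for c in (0, 1):
                Kc = K[:, y == c]                          # kernel block of class c
                lc = Kc.shape[1]
                class_means.append(Kc.mean(axis=1))
                N_within += Kc @ (np.eye(lc) - np.full((lc, lc), 1.0 / lc)) @ Kc.T
            alpha = np.linalg.solve(N_within + reg * np.eye(n),
                                    class_means[1] - class_means[0])
            return alpha

        def kfda_project(alpha, X_train, X_new, gamma=1.0):
            """Project new pixels onto the discriminant direction for detection."""
            return rbf_kernel(X_new, X_train, gamma) @ alpha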

  11. Earthquake Analysis of Structure by Base Isolation Technique in SAP

    Directory of Open Access Journals (Sweden)

    T. Subramani

    2014-06-01

    Full Text Available This paper presents an overview of the present state of base isolation techniques, with special emphasis on base isolation and a brief account of other techniques developed the world over for mitigating earthquake forces on structures. The dynamic analysis procedure for isolated structures is briefly explained. The provisions of FEMA 450 for base isolated structures are highlighted. The effects of base isolation on structures located on soft soils and near active faults are given in brief. A simple case study on natural base isolation using naturally available soils is presented. Also, future areas of research are indicated. Earthquakes are one of nature's greatest hazards; throughout historic time they have caused significant loss of life and severe damage to property, especially to man-made structures. On the other hand, earthquakes provide architects and engineers with a number of important design criteria foreign to the normal design process. From well-established procedures reviewed by many researchers, seismic isolation may be used to provide an effective solution for a wide range of seismic design problems. The application of base isolation techniques to protect structures against damage from earthquake attacks has been considered one of the most effective approaches and has gained increasing acceptance during the last two decades. This is because base isolation limits the effects of the earthquake attack: a flexible base largely decouples the structure from the ground motion, and the structural response accelerations are usually less than the ground acceleration. In general, the addition of viscous damping in the structure may reduce the displacement and acceleration responses of the structure. This study also seeks to evaluate the effects of additional damping on the seismic response when compared with structures without additional damping for different ground motions.

  12. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP (work in progress), the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.

  13. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA

  14. Pyemu: A Python-Based Framework for Linear-Based Model Uncertainty Analysis

    Science.gov (United States)

    White, J.

    2014-12-01

    pyEMU is an open-source python-based framework for model-independent linear-based parameter and predictive uncertainty analysis. The framework is designed to support the analysis of high-dimensional inverse problems that have thousands of parameters and hundreds of thousands of observations. The code is compatible with the PEST and PEST++ software suite, and implements several forms of linear analysis equations, such as Schur's complement for conditional uncertainty propagation and subspace error variance, including a form of error variance analysis of model structural error. These linear analysis equations are the most common and also the most applicable to large-scale environmental models. Several native python operators (such as multiplication, subtraction, addition, exponentiation) have been overloaded to make equation building more concise as well as to achieve speedup with operations involving diagonal matrices. To help ensure pyEMU is intuitive and easy to use, emphasis was placed on flexibility and concise object instantiation. As a result, several types of arguments can be handled elegantly.
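
    The Schur-complement form of conditional (first-order, second-moment) uncertainty propagation mentioned above reduces to a few matrix operations once a Jacobian, a prior parameter covariance and an observation noise covariance are available. The sketch below shows that calculation directly in numpy rather than through the pyEMU API, so the function names are assumptions:

        import numpy as np

        def schur_conditional_cov(jacobian, prior_cov, obs_cov):
            """Posterior (conditioned) parameter covariance via the Schur complement.

            jacobian: (n_obs, n_par) sensitivities, prior_cov: (n_par, n_par),
            obs_cov: (n_obs, n_obs) observation noise covariance.
            """
            JC = jacobian @ prior_cov
            S = JC @ jacobian.T + obs_cov
            return prior_cov - JC.T @ np.linalg.solve(S, JC)

        def predictive_variance(pred_sensitivity, cov):
            """First-order variance of a prediction with sensitivity vector dy/dp."""
            return float(pred_sensitivity @ cov @ pred_sensitivity)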

  15. Web-Based Instruction and Learning: Analysis and Needs Assessment

    Science.gov (United States)

    Grabowski, Barbara; McCarthy, Marianne; Koszalka, Tiffany

    1998-01-01

    An analysis and needs assessment was conducted to identify kindergarten through grade 14 (K-14) customer needs with regard to using the World Wide Web (WWW) for instruction and to identify obstacles K-14 teachers face in utilizing NASA Learning Technologies products in the classroom. The needs assessment was conducted as part of the Dryden Learning Technologies Project, which is a collaboration between Dryden Flight Research Center (DFRC), Edwards, California and The Pennsylvania State University (PSU), University Park, Pennsylvania. The overall project is a multiyear effort to conduct research in the development of teacher training and tools for Web-based science, mathematics and technology instruction and learning.

  16. Stellar Population Analysis of Galaxies based on Genetic Algorithms

    Institute of Scientific and Technical Information of China (English)

    Abdel-Fattah Attia; H.A.Ismail; I.M.Selim; A.M.Osman; I.A.Isaa; M.A.Marie; A.A.Shaker

    2005-01-01

    We present a new method for determining the age and relative contribution of different stellar populations in galaxies based on the genetic algorithm. We apply this method to the barred spiral galaxy NGC 3384, using CCD images in the U, B, V, R and I bands. This analysis indicates that the galaxy NGC 3384 is mainly inhabited by an old stellar population (age > 10⁹ yr). Some problems were encountered when numerical simulations were used for determining the contribution of different stellar populations to the integrated color of a galaxy. The results show that the proposed genetic algorithm can search efficiently through the very large space of the possible ages.
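
    A toy version of the genetic algorithm described above, fitting the fractional contributions of stellar population templates to observed broad-band fluxes with selection and mutation only (no crossover); the template matrix, band layout and GA settings are assumptions for illustration:

        import numpy as np

        rng = np.random.default_rng(0)

        def fit_fractions_ga(observed, templates, pop_size=100, n_gen=200, mut=0.05):
            """observed: (n_bands,) fluxes; templates: (n_pops, n_bands)."""
            n_pops = templates.shape[0]
            pop = rng.random((pop_size, n_pops))
            pop /= pop.sum(axis=1, keepdims=True)           # fractions sum to one
            for _ in range(n_gen):
                residual = pop @ templates - observed
                fitness = -(residual ** 2).sum(axis=1)      # negative chi-square
                order = np.argsort(fitness)[::-1]
                parents = pop[order[:pop_size // 2]]        # selection of the fittest
                children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
                children = np.clip(children + mut * rng.normal(size=children.shape), 0, None)
                pop = np.vstack([parents, children])
                pop /= pop.sum(axis=1, keepdims=True) + 1e-12
            return pop[0]                                   # best individual found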

  17. A Developed Algorithm of Apriori Based on Association Analysis

    Institute of Scientific and Technical Information of China (English)

    LI Pingxiang; CHEN Jiangping; BIAN Fuling

    2004-01-01

    A method for mining frequent itemsets by evaluating their support probabilities based on association analysis is presented. The probability of every 1-itemset is obtained by scanning the database; the probabilities of every 2-itemset, 3-itemset, and in general every k-itemset are then estimated from the frequent 1-itemsets, yielding all candidate frequent itemsets. The database is then scanned to verify the supports of the candidate frequent itemsets, and the frequent itemsets are finally mined. The method greatly reduces the time spent scanning the database and shortens the computation time of the algorithm.
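
    The sketch below illustrates the idea as described above: estimate the probability of each candidate k-itemset from the 1-itemset probabilities (here simply as a product, i.e. an independence assumption used for pruning, which is our illustrative reading rather than the authors' exact estimator) and then verify the surviving candidates with a database scan. All names are illustrative.

        # Illustrative sketch: estimate k-itemset probabilities from 1-itemset
        # probabilities, prune, then verify candidate supports against the data.
        from itertools import combinations

        def estimate_and_verify(transactions, min_support):
            n = len(transactions)
            # probability of every 1-itemset from a single scan
            p1 = {}
            for t in transactions:
                for item in set(t):
                    p1[item] = p1.get(item, 0.0) + 1.0 / n
            current = {frozenset([i]) for i, p in p1.items() if p >= min_support}
            results, k = {}, 2
            while current:
                items = sorted({i for s in current for i in s})
                candidates = []
                for combo in combinations(items, k):
                    est = 1.0
                    for i in combo:
                        est *= p1[i]          # independence-based estimate
                    if est >= min_support:    # heuristic pruning, illustrative only
                        candidates.append(frozenset(combo))
                # verify candidate supports with one pass over the database
                current = set()
                for c in candidates:
                    support = sum(1 for t in transactions if c <= set(t)) / n
                    if support >= min_support:
                        current.add(c)
                        results[c] = support
                k += 1
            return results

        print(estimate_and_verify([["a", "b"], ["a", "b", "c"], ["a", "c"]], 0.6))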

  18. Image edge detection based on multi-fractal spectrum analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Shao-yuan; WANG Yao-nan

    2006-01-01

    In this paper, an image edge detection method based on multi-fractal spectrum analysis is presented. The coarse-grain Hölder exponent of the image pixels is first computed; then its multi-fractal spectrum is estimated by the kernel estimation method. Finally, edge detection is performed by means of the different multi-fractal spectrum values. Simulation results show that this method is efficient and has better locality than traditional edge detection methods such as the Sobel method.

  19. Virtual estimator for piecewise linear systems based on observability analysis.

    Science.gov (United States)

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés, Luis G; Beltrán, Carlos Daniel García

    2013-02-27

    This article proposes a virtual sensor for piecewise linear systems based on an observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents a detector of the active mode when the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their output to be estimated. A methodology for testing the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the results obtained.

  20. Cloud for Distributed Data Analysis Based on the Actor Model

    Directory of Open Access Journals (Sweden)

    Ivan Kholod

    2016-01-01

    This paper describes the construction of a Cloud for Distributed Data Analysis (CDDA) based on the actor model. The design maps the data mining algorithms onto decomposed functional blocks, which are assigned to actors. Using actors allows the computation to be moved closer to the stored data. The process does not require loading data sets into the cloud and allows users to analyze confidential information locally. The results of experiments show that the proposed approach outperforms established solutions in efficiency.

  1. Differentiation-Based Analysis of Environmental Management and Corporate Performance

    Institute of Scientific and Technical Information of China (English)

    SHAN Dong-ming; MU Xin

    2007-01-01

    A duopoly model based on product differentiation is built to study the performance of both a clean firm and a dirty firm, under the assumptions that consumers have different preferences for the environmental attributes of the product and that product cost increases with the environmental attribute. The analysis shows that, both with no environmental regulation and with a tariff levied on the dirty product, the clean firm always earns more profit; moreover, the stricter the regulation, the more profit the clean firm obtains. This verifies that, from the viewpoint of product differentiation, a firm can improve its corporate competitiveness through environmental management.

  2. Machine Learning for Vision-Based Motion Analysis

    CERN Document Server

    Wang, Liang; Cheng, Li; Pietikainen, Matti

    2011-01-01

    Techniques of vision-based motion analysis aim to detect, track, identify, and generally understand the behavior of objects in image sequences. With the growth of video data in a wide range of applications from visual surveillance to human-machine interfaces, the ability to automatically analyze and understand object motions from video footage is of increasing importance. Among the latest developments in this field is the application of statistical machine learning algorithms for object tracking, activity modeling, and recognition. Developed from expert contributions to the first and second In

  3. GIS based analysis of future district heating potential in Denmark

    DEFF Research Database (Denmark)

    Nielsen, Steffen; Möller, Bernd

    2013-01-01

    The physical placement of buildings is important when determining the potential for DH (district heating). Good locations for DH are mainly determined by having both a large heat demand within a certain area and having access to local heat resources. In recent years, the locations of buildings...... in Denmark have been mapped in a heat atlas which includes all buildings and their heat demands. This article focuses on developing a method for assessing the costs associated with supplying these buildings with DH. The analysis is based on the existing DH areas in Denmark. By finding the heat production...

  4. Scale-Dependent Representations of Relief Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    Automatic generalization of geographic information is the core of multi-scale representation of spatial data, but scale-dependent generalization methods are far from abundant because of their extreme complexity. This paper puts forward a new consistency model for scale-dependent representations of relief based on wavelet analysis, and discusses the thresholds in the model so as to obtain continuous representations of relief with different levels of detail between scales. The model not only meets the needs of automatic generalization but is also completely scale-dependent. Some practical examples are given.

  5. Facility Layout Based on Sequence Analysis: Design of Flowshops

    Institute of Scientific and Technical Information of China (English)

    ZHOU Jin; WU Zhi-ming

    2009-01-01

    A computer-aided method for designing a hybrid layout of tree-shaped planar flowlines is presented. In the new type of flowshop layout, common machines shared by several flowlines can be located together in functional sections. The approach combines traditional cell formation techniques with sequence alignment algorithms. First, a cell formation procedure based on sequence analysis is adopted; then the operation sequences of the parts are aligned to maximize machine adjacency in hyperedge representations; finally, a tree-shaped planar flowline is obtained for each part family. The algorithm is illustrated with a sample of operation sequences obtained from industry.

  6. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The device consists of a toughened glass slab supported by four force sensors. The force transducers are connected to a measurement system based on a 24-bit ADC, which captures the slight body movements of a patient. The data are transferred to a computer in real time, where the analysis is conducted. The article explains the principle of operation as well as the algorithm for evaluating the measurement uncertainty of the COP (Centre of Pressure) surface (x, y).
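
    A back-of-the-envelope sketch of how COP coordinates are typically obtained from four vertical force readings follows; the sensor corner coordinates and readings are hypothetical and the computation is the standard force-weighted average, not the authors' code or their uncertainty algorithm.

        # Illustrative COP computation for a four-sensor platform.
        # Sensor positions and readings are hypothetical stand-ins.
        import numpy as np

        # sensor positions (x, y) in metres at the corners of the plate
        sensor_xy = np.array([[-0.2, -0.2], [0.2, -0.2], [0.2, 0.2], [-0.2, 0.2]])

        def centre_of_pressure(forces):
            """forces: four vertical force readings in newtons."""
            f = np.asarray(forces, dtype=float)
            total = f.sum()
            # force-weighted average of the sensor positions
            cop_x = np.dot(f, sensor_xy[:, 0]) / total
            cop_y = np.dot(f, sensor_xy[:, 1]) / total
            return cop_x, cop_y

        print(centre_of_pressure([210.0, 190.0, 205.0, 195.0]))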

  7. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    Science.gov (United States)

    Morales-Morales, Cornelio; Adam-Medina, Manuel; Cervantes, Ilse; Vela-Valdés and, Luis G.; García Beltrán, Carlos Daniel

    2013-01-01

    This article proposes a virtual sensor for piecewise linear systems based on an observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents a detector of the active mode when the commutation sequences of each linear subsystem are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow their output to be estimated. A methodology for testing the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the results obtained. PMID:23447007

  8. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme Dynamics of Structures sponsored by the Danish Technical Research Council. The planned...... contents and the requirements for the project prior to its start are described together with the results obtained during the 3-year period of the project. The project was mainly carried out as a Ph.D. project by the first author from September 1994 to August 1997 in cooperation with associate professor Rune

  9. Toward a validation process for model based safety analysis

    OpenAIRE

    Adeline, Romain; Cardoso, Janette; Darfeuil, Pierre; Humbert, Sophie; Seguin, Christel

    2010-01-01

    Today, model-based processes are becoming more and more widespread for the analysis of systems. However, there is no formal testing approach to ensure that the formal model is compliant with the real system. In this paper, we choose to study AltaRica models. We present a general process for properly constructing and validating an AltaRica formal model. The focus is on the validation phase, i.e. verifying the compliance between the model and the real system. For this, the proposed process recommends...

  10. Reliability Distribution of Numerical Control Lathe Based on Correlation Analysis

    Institute of Scientific and Technical Information of China (English)

    Xiaoyan Qi; Guixiang Shen; Yingzhi Zhang; Shuguang Sun; Bingkun Chen

    2016-01-01

    Combining reliability distribution with correlation analysis, a new method is proposed for reliability distribution that considers both the structural correlation and the failure correlation of subsystems. First, the subsystems are ordered by means of TOPSIS, which captures the usual considerations of reliability allocation; a copula connecting function is then introduced to set up a distribution model based on structural correlation, failure correlation and target correlation, and the reliability target region of every subsystem is obtained with Matlab. In this method, not only the traditional distribution considerations are taken into account, but correlation influences are also involved, thereby supplementing information and optimizing the distribution.

  11. Operational logs analysis at ALMA observatory based on ELK stack

    Science.gov (United States)

    Gil, Juan P.; Reveco, Johnny; Shen, Tzu-Chiang

    2016-07-01

    During operations, the ALMA observatory generates a huge amount of logs which contain valuable information not only about specific failures but also for long-term performance analysis. We implemented a big data solution based on Elasticsearch, Logstash and Kibana, configured as a decoupled system that has zero impact on existing operations and is able to keep more than six months of operation logs online. In this paper, we describe this infrastructure, the applications built on top of it, and the problems we faced during its implementation.

  12. Sensitivity Analysis of a Bioinspired Refractive Index Based Gas Sensor

    Institute of Scientific and Technical Information of China (English)

    Yang Gao; Qi Xia; Guanglan Liao; Tielin Shi

    2011-01-01

    It was found that a change in the refractive index of the ambient gas can lead to an obvious change in the color of the Morpho butterfly's wing. This phenomenon has been employed as a sensing principle for detecting gas. In the present study, Rigorous Coupled-Wave Analysis (RCWA) was described briefly, and the partial derivative of the optical reflection efficiency with respect to the refractive index of the ambient gas, i.e., the sensitivity of the sensor, was derived based on RCWA. A bioinspired grating model was constructed by mimicking the nanostructure on the ground scale of the Morpho didius butterfly's wing. The analytical sensitivity was verified, and the effects of the grating shape on the reflection spectra and on the sensitivity were discussed. The results show that by tuning the shape parameters of the grating, desired reflection spectra and sensitivity can be obtained, which can be applied to the design of bioinspired refractive index based gas sensors.

  13. BLAT-based comparative analysis for transposable elements: BLATCAT.

    Science.gov (United States)

    Lee, Sangbum; Oh, Sumin; Kang, Keunsoo; Han, Kyudong

    2014-01-01

    The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program can compare specific regions of six representative primate genome sequences (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) on the basis of BLAT and simultaneously carry out RepeatMasker and/or Censor functions, which are widely used Windows-based web-server functions to detect TEs. All results can be stored as an HTML file for manual inspection of a specific locus. BLATCAT will be very convenient and efficient for comparative analyses of TEs in various primate genomes.

  14. BLAT-Based Comparative Analysis for Transposable Elements: BLATCAT

    Directory of Open Access Journals (Sweden)

    Sangbum Lee

    2014-01-01

    The availability of several whole genome sequences makes comparative analyses possible. In primate genomes, the priority of transposable elements (TEs) is significantly increased because they account for ~45% of the primate genomes, they can regulate the gene expression level, and they are associated with genomic fluidity in their host genomes. Here, we developed the BLAST-like alignment tool (BLAT) based comparative analysis for transposable elements (BLATCAT) program. The BLATCAT program can compare specific regions of six representative primate genome sequences (human, chimpanzee, gorilla, orangutan, gibbon, and rhesus macaque) on the basis of BLAT and simultaneously carry out RepeatMasker and/or Censor functions, which are widely used Windows-based web-server functions to detect TEs. All results can be stored as an HTML file for manual inspection of a specific locus. BLATCAT will be very convenient and efficient for comparative analyses of TEs in various primate genomes.

  15. Discrete directional wavelet bases and frames: analysis and applications

    Science.gov (United States)

    Dragotti, Pier Luigi; Velisavljevic, Vladan; Vetterli, Martin; Beferull-Lozano, Baltasar

    2003-11-01

    The application of the wavelet transform in image processing is most frequently based on a separable construction: lines and columns in an image are treated independently, and the basis functions are simply products of the corresponding one-dimensional functions. Such a method keeps the design and computation simple, but it is not capable of properly capturing all the properties of an image. In this paper, a new truly separable discrete multi-directional transform is proposed with a subsampling method based on lattice theory. Alternatively, the subsampling can be omitted, which leads to a multi-directional frame. The transform can be applied in many areas such as denoising, non-linear approximation and compression. The results on non-linear approximation and denoising show interesting gains compared with the standard two-dimensional analysis.

  16. Supermarket Analysis Based On Product Discount and Statistics

    Directory of Open Access Journals (Sweden)

    Komal Kumawat

    2014-03-01

    E-commerce has been growing rapidly, and it is a significant domain for data mining because it provides all the right ingredients for successful mining. E-commerce refers to the buying and selling of products or services over electronic systems such as the Internet. Various e-commerce systems offer discounts on products and allow users to buy products online. The basic idea used here is to predict product sales based on the discount applied to the product. Our analysis concentrates on how customers behave when a discount is offered to them, and we have developed a model that captures this customer behaviour. This paper elaborates on how techniques such as sessions and clickstreams are used to collect user data online based on the discount applied to the product, and how statistics are applied to the data set to examine the variation in the data.

  17. Single base pair mutation analysis by PNA directed PCR clamping

    DEFF Research Database (Denmark)

    Ørum, H.; Nielsen, P.E.; Egholm, M.;

    1993-01-01

    A novel method that allows direct analysis of single base mutation by the polymerase chain reaction (PCR) is described. The method utilizes the finding that PNAs (peptide nucleic acids) recognize and bind to their complementary nucleic acid sequences with higher thermal stability and specificity...... than the corresponding deoxyribooligonucleotides and that they cannot function as primers for DNA polymerases. We show that a PNA/DNA complex can effectively block the formation of a PCR product when the PNA is targeted against one of the PCR primer sites. Furthermore, we demonstrate that this blockage...... allows selective amplification/suppression of target sequences that differ by only one base pair. Finally we show that PNAs can be designed in such a way that blockage can be accomplished when the PNA target sequence is located between the PCR primers....

  18. Sensitivity analysis of GSI based mechanical characterization of rock mass

    CERN Document Server

    Ván, P

    2012-01-01

    Recently, rock mechanical and rock engineering designs and calculations have frequently been based on the Geological Strength Index (GSI) method, because it is the only system that provides a complete set of mechanical properties for design purposes. Both the failure criteria and the deformation moduli of the rock mass can be calculated with GSI-based equations, which also involve the disturbance factor. The aim of this paper is a sensitivity analysis of the GSI- and disturbance-factor-dependent equations that characterize the mechanical properties of rock masses; a survey of the GSI system itself is not our purpose. The results show that the rock mass strength calculated by the Hoek-Brown failure criterion, as well as both the Hoek-Diederichs and modified Hoek-Diederichs deformation moduli, are highly sensitive to changes in both the GSI and the D factor, hence their exact determination is important for rock engineering design.
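
    For reference, the GSI- and D-dependent relations referred to above take the standard form published in the general Hoek-Brown literature (Hoek et al., 2002); they are quoted here from that literature rather than from the paper itself:

        \sigma_1' = \sigma_3' + \sigma_{ci}\left( m_b\,\frac{\sigma_3'}{\sigma_{ci}} + s \right)^{a},
        \qquad
        m_b = m_i \exp\!\left( \frac{GSI - 100}{28 - 14D} \right),
        \qquad
        s = \exp\!\left( \frac{GSI - 100}{9 - 3D} \right),
        \qquad
        a = \frac{1}{2} + \frac{1}{6}\left( e^{-GSI/15} - e^{-20/3} \right)

    where \sigma_{ci} is the uniaxial compressive strength of the intact rock, m_i is the intact-rock material constant, and D is the disturbance factor.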

  19. BASE-9: Bayesian Analysis for Stellar Evolution with nine variables

    Science.gov (United States)

    Robinson, Elliot; von Hippel, Ted; Stein, Nathan; Stenning, David; Wagner-Kaiser, Rachel; Si, Shijing; van Dyk, David

    2016-08-01

    The BASE-9 (Bayesian Analysis for Stellar Evolution with nine variables) software suite recovers star cluster and stellar parameters from photometry and is useful for analyzing single-age, single-metallicity star clusters, binaries, or single stars, and for simulating such systems. BASE-9 uses a Markov chain Monte Carlo (MCMC) technique along with brute force numerical integration to estimate the posterior probability distribution for the age, metallicity, helium abundance, distance modulus, line-of-sight absorption, and parameters of the initial-final mass relation (IFMR) for a cluster, and for the primary mass, secondary mass (if a binary), and cluster probability for every potential cluster member. The MCMC technique is used for the cluster quantities (the first six items listed above) and numerical integration is used for the stellar quantities (the last three items in the above list).

  20. Support vector classifier based on principal component analysis

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The support vector classifier (SVC) has superior advantages for small-sample learning problems with high dimensions, in particular better generalization ability. However, there is some redundancy among the high dimensions of the original samples, and the main features of the samples may be selected first to improve the performance of the SVC. Principal component analysis (PCA) is employed to reduce the feature dimensions of the original samples and pre-select the main features efficiently, and an SVC is constructed in the selected feature space to improve the learning speed and identification rate. Furthermore, a heuristic genetic-algorithm-based automatic model selection is proposed to determine the hyperparameters of the SVC and to evaluate the performance of the learning machines. Experiments performed on the Heart and Adult benchmark data sets demonstrate that the proposed PCA-based SVC not only reduces the test time drastically but also improves the identification rate effectively.
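
    A minimal sketch of the PCA-then-SVC pipeline described above, using scikit-learn, follows. The original work tunes the SVC hyperparameters with a genetic algorithm, which is not reproduced here; the dataset and fixed hyperparameters are stand-ins chosen only for illustration.

        # Sketch of PCA feature pre-selection followed by an SVC, with scikit-learn.
        # The genetic-algorithm model selection of the original is not reproduced.
        from sklearn.datasets import load_breast_cancer
        from sklearn.decomposition import PCA
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = load_breast_cancer(return_X_y=True)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        clf = make_pipeline(StandardScaler(),
                            PCA(n_components=10),        # pre-select main features
                            SVC(C=10.0, gamma="scale"))  # classify in reduced space
        clf.fit(X_tr, y_tr)
        print("test accuracy:", clf.score(X_te, y_te))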

  1. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    received great interest from both industry and research communities. The focus of this thesis is on video-based analysis of human motion and the thesis presents work within three overall topics, namely foreground segmentation, action recognition, and human pose estimation. Foreground segmentation is often...... foreground camouflage, shadows, and moving backgrounds. The method continuously updates the background model to maintain high quality segmentation over long periods of time. Within action recognition the thesis presents work on both recognition of arm gestures and gait types. A key-frame based approach...... range of gait which deals with an inherent ambiguity of gait types. Human pose estimation does not target a specific action but is considered as a good basis for the recognition of any action. The pose estimation work presented in this thesis is mainly concerned with the problems of interacting people...

  2. State Inspection for Transmission Lines Based on Independent Component Analysis

    Institute of Scientific and Technical Information of China (English)

    REN Li-jia; JIANG Xiu-chen; SHENG Ge-hao; YANG Wei-wei

    2009-01-01

    Monitoring transmission towers is of great importance for preventing serious thefts on them and for ensuring the reliability and safety of power grid operation. Independent component analysis (ICA) is a method for finding underlying factors or components from multivariate statistical data based on dimension reduction, and it is applicable to extracting non-stationary signals. In this paper, FastICA based on negentropy is presented to effectively extract and separate the vibration signals caused by human activity. A new method combining the empirical mode decomposition (EMD) technique with an adaptive threshold method is applied to extract the vibration pulses and suppress the interference signals. Practical tests demonstrate that the proposed method is effective in separating and extracting the vibration signals.
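
    The sketch below shows the blind source separation step with a FastICA implementation (scikit-learn's, which uses a negentropy approximation); the signals and mixing matrix are synthetic stand-ins, and the EMD/adaptive-threshold stage of the paper is not reproduced.

        # Sketch of separating mixed vibration-like signals with FastICA.
        # Sources and mixing are synthetic; not the authors' field data.
        import numpy as np
        from sklearn.decomposition import FastICA

        t = np.linspace(0, 1, 2000)
        s1 = np.sin(2 * np.pi * 5 * t)                        # periodic interference
        s2 = np.sign(np.sin(2 * np.pi * 13 * t))              # impact-like pulses
        S = np.c_[s1, s2] + 0.05 * np.random.randn(2000, 2)   # add sensor noise

        A = np.array([[1.0, 0.6], [0.4, 1.0]])                # unknown mixing
        X = S @ A.T                                           # observed mixtures

        ica = FastICA(n_components=2, random_state=0)
        S_est = ica.fit_transform(X)                          # recovered sources
        print(S_est.shape)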

  3. A Novel Quantitative Analysis Model for Information System Survivability Based on Conflict Analysis

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; WANG Huiqiang; ZHAO Guosheng

    2007-01-01

    This paper describes a novel quantitative analysis model for system survivability based on conflict analysis, which provides a direct view of the survivable situation. Based on the three-dimensional state space of the conflict, each player's efficiency matrix on its credible motion set can be obtained; the player whose desire is strongest initiates the move, and the overall state transition matrix of the information system can be derived. In addition, the process of modeling and stability analysis of the conflict can be converted into a Markov analysis process, so the obtained results, with the occurrence probability of each feasible situation, help the players quantitatively judge the probability of reaching the situations they pursue in the conflict. Compared with existing methods, which are limited to post-hoc explanation of a system's survivable situation, the proposed model is suitable for quantitatively analyzing and forecasting the future development of system survivability. The experimental results show that the model can be effectively applied to quantitative survivability analysis, and it has good application prospects in practice.

  4. An Exponential Class of Model-Free Visual Servoing Controllers in the Presence of Uncertain Camera Calibration

    Science.gov (United States)

    2003-12-01

    Chaumette, "Theoretical Improvements in the Stability Analysis of a New Class of Model-Free Visual Servoing Methods," IEEE Transactions on Robotics and Automation. E. Malis, F. Chaumette, and S. Boudet, "2 1/2 D Visual Servoing," IEEE Transactions on Robotics and Automation, Vol. 15, No. 2, pp. 238-250, 1999.

  5. Data Clustering Analysis Based on Wavelet Feature Extraction

    Institute of Scientific and Technical Information of China (English)

    QIANYuntao; TANGYuanyan

    2003-01-01

    A novel wavelet-based data clustering method is presented in this paper, which includes wavelet feature extraction and a cluster growing algorithm. The wavelet transform can provide rich and diversified information for representing the global and local inherent structures of a dataset; therefore, it is a very powerful tool for clustering feature extraction. As an unsupervised classification, the target of clustering analysis depends on the specific clustering criteria. Several criteria that should be considered for a general-purpose clustering algorithm are proposed, and the cluster growing algorithm is constructed to connect the clustering criteria with the wavelet features. Compared with other popular clustering methods, our clustering approach provides multi-resolution clustering results, needs few prior parameters, correctly deals with irregularly shaped clusters, and is insensitive to noise and outliers. As this wavelet-based clustering method is aimed at solving two-dimensional data clustering problems, for high-dimensional datasets the self-organizing map and U-matrix methods are applied to transform them into two-dimensional Euclidean space, so that high-dimensional data clustering analysis becomes feasible. Results on some simulated data and standard test data are reported to illustrate the power of our method.
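
    A small sketch of the general idea, extracting wavelet-subband features and clustering on them, is given below using PyWavelets and k-means; k-means stands in for the paper's own cluster-growing algorithm, and the signals and feature choice (subband energies) are illustrative assumptions.

        # Sketch: wavelet subband energies as clustering features.
        # k-means is a stand-in for the paper's cluster-growing algorithm.
        import numpy as np
        import pywt
        from sklearn.cluster import KMeans

        def wavelet_energy_features(signal, wavelet="db4", level=3):
            coeffs = pywt.wavedec(signal, wavelet, level=level)
            return np.array([np.sum(c ** 2) for c in coeffs])  # energy per subband

        rng = np.random.default_rng(0)
        smooth = [np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
                  for _ in range(20)]
        noisy = [rng.standard_normal(256) for _ in range(20)]
        signals = np.vstack(smooth + noisy)

        features = np.array([wavelet_energy_features(s) for s in signals])
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
        print(labels)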

  6. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  7. Reliability analysis of cluster-based ad-hoc networks

    Energy Technology Data Exchange (ETDEWEB)

    Cook, Jason L. [Quality Engineering and System Assurance, Armament Research Development Engineering Center, Picatinny Arsenal, NJ (United States); Ramirez-Marquez, Jose Emmanuel [School of Systems and Enterprises, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030 (United States)], E-mail: Jose.Ramirez-Marquez@stevens.edu

    2008-10-15

    The mobile ad-hoc wireless network (MAWN) is a new and emerging network scheme that is being employed in a variety of applications. The MAWN varies from traditional networks because it is a self-forming and dynamic network. The MAWN is free of infrastructure and, as such, only the mobile nodes comprise the network. Pairs of nodes communicate either directly or through other nodes. To do so, each node acts, in turn, as a source, destination, and relay of messages. The virtue of a MAWN is the flexibility this provides; however, the challenge for reliability analyses is also brought about by this unique feature. The variability and volatility of the MAWN configuration make typical reliability methods (e.g. reliability block diagram) inappropriate because no single structure or configuration represents all manifestations of a MAWN. For this reason, new methods are being developed to analyze the reliability of this new networking technology. Newly published methods adapt to this feature by treating the configuration probabilistically or by inclusion of embedded mobility models. This paper joins both methods together and expands upon these works by modifying the problem formulation to address the reliability analysis of a cluster-based MAWN. The cluster-based MAWN is deployed in applications with constraints on networking resources such as bandwidth and energy. This paper presents the problem's formulation, a discussion of applicable reliability metrics for the MAWN, and an illustration of a Monte Carlo simulation method through the analysis of several example networks.

  8. Analysis of effect factors-based stochastic network planning model

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Traditional stochastic network planning models look at all the indeterminate factors as a whole and regard activity durations as independent random variables, thereby ignoring the inevitable relationships and dependence among activity durations when more than one activity may be affected by the same indeterminate factors. On the basis of an analysis of the indeterminate factors affecting durations, the effect-factors-based stochastic network planning (EFBSNP) model is proposed, which emphasizes the effects on the project period not only of logistic and organizational relationships but also of the dependence among activity durations caused by indeterminate factors. By virtue of indeterminate factor analysis, the model extracts and quantitatively describes the indeterminate effect factors, and then accounts for their effect on the schedule using the Monte Carlo simulation technique. The method is flexible enough to deal with effect factors and is consistent with practice. Software has been developed in VisualStudio.NET to simplify the model-based calculation. Finally, a case study is included to demonstrate the applicability of the proposed model, and a comparison shows some advantages over existing models.
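
    The core idea, shared indeterminate factors inducing dependence between activity durations, can be illustrated with a tiny Monte Carlo sketch on a serial network; all distributions and numbers below are hypothetical and are not taken from the paper.

        # Sketch of the effect-factors idea: two activities share a common factor,
        # so their durations are dependent. Monte Carlo estimates the project
        # duration distribution. All numbers are hypothetical.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000

        weather = rng.normal(0.0, 1.0, n)                  # shared indeterminate factor
        d1 = 10 + 1.5 * weather + rng.normal(0, 0.5, n)    # activity 1 duration (days)
        d2 = 8 + 1.0 * weather + rng.normal(0, 0.5, n)     # activity 2 duration (days)
        d3 = rng.normal(6, 1.0, n)                         # independent activity

        project = d1 + d2 + d3                             # simple serial network
        print("mean duration:", project.mean())
        print("P(project > 26 days):", (project > 26).mean())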

  9. Analysis of Android Device-Based Solutions for Fall Detection

    Directory of Open Access Journals (Sweden)

    Eduardo Casilari

    2015-07-01

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation on automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals of the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.

  10. An Efficient Soft Set-Based Approach for Conflict Analysis.

    Directory of Open Access Journals (Sweden)

    Edi Sutoyo

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in a conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves a computational time up to 3.9% lower than that of rough set theory.

  11. Reduction EMI of BLDC Motor Drive Based on Software Analysis

    Directory of Open Access Journals (Sweden)

    Navid Mousavi

    2016-01-01

    In a BLDC motor-drive system, the leakage current from the motor to the ground network and the high-frequency components of the DC link current are the most important sources of conducted interference. Because of the high density of electrical and electronic systems in spacecraft and aircraft, the leakage currents of the motors, flowing through the common ground, will interfere with other equipment; moreover, such systems generally share common DC buses, which aggravates the problem. Operation of the electric motor also gives rise to high-frequency components in the DC link current, which can interfere with other subsystems. In this paper, the electromagnetic noise is analyzed and a method is proposed based on the frequency spectra of the DC link current and of the leakage current from the motor to the ground network. The proposed method presents a new filtering-based process to overcome EMI. The Maxwell software is used to support the requirement analysis.

  12. An Efficient Soft Set-Based Approach for Conflict Analysis

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in a conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves a computational time up to 3.9% lower than that of rough set theory. PMID:26928627

  13. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2016-12-22

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
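
    The serial structure described above, linear PCA first, then kernel PCA fitted on the PCA residuals, can be sketched with scikit-learn as below; the data are synthetic, and the monitoring statistics and similarity factor of the paper are not reproduced.

        # Sketch of the serial PCA (SPCA) structure: linear PCA extracts linear PCs,
        # then kernel PCA on the residual subspace captures nonlinear features.
        # Monitoring statistics (T^2, SPE) are omitted.
        import numpy as np
        from sklearn.decomposition import PCA, KernelPCA
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(0)
        X = rng.standard_normal((500, 10))
        X[:, 3] = X[:, 0] ** 2 + 0.1 * rng.standard_normal(500)   # nonlinear relation
        X = StandardScaler().fit_transform(X)

        pca = PCA(n_components=4).fit(X)
        linear_scores = pca.transform(X)                     # linear features
        residual = X - pca.inverse_transform(linear_scores)  # residual subspace

        kpca = KernelPCA(n_components=3, kernel="rbf", gamma=0.1).fit(residual)
        nonlinear_scores = kpca.transform(residual)          # nonlinear features

        print(linear_scores.shape, nonlinear_scores.shape)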

  14. An Efficient Soft Set-Based Approach for Conflict Analysis.

    Science.gov (United States)

    Sutoyo, Edi; Mungad, Mungad; Hamid, Suraya; Herawan, Tutut

    2016-01-01

    Conflict analysis has been used as an important tool in economic, business, governmental and political disputes, games, management negotiations, military operations, etc. Many formal mathematical models have been proposed to handle conflict situations, and one of the most popular is rough set theory. With its ability to handle vagueness in a conflict data set, rough set theory has been used successfully. However, computational time is still an issue when determining the certainty, coverage, and strength of conflict situations. In this paper, we present an alternative approach to handling conflict situations, based on ideas from soft set theory. The novelty of the proposed approach is that, unlike rough set theory, which uses decision rules, it is based on the concept of co-occurrence of parameters in soft set theory. We illustrate the proposed approach by means of a tutorial example of voting analysis in conflict situations. Furthermore, we evaluate the proposed approach on a real-world dataset of political conflict in the Indonesian Parliament. We show that the proposed approach achieves a computational time up to 3.9% lower than that of rough set theory.

  15. Emergy analysis of cassava-based fuel ethanol in China

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Hui; Chen, Li; Yan, Zongcheng; Wang, Honglin [School of Chemistry and Chemical Engineering, South China University of Technology, Guangzhou, Guangdong 510640 (China)

    2011-01-15

    Emergy analysis considers both energy quality and energy used in the past, and compensates for the inability of money to value non-market inputs in an objective manner. Its common unit allows all resources to be compared on a fair basis. As feedstock for fuel ethanol, cassava has some advantages over other feedstocks. The production system of cassava-based fuel ethanol (CFE) was evaluated by emergy analysis. The emergy indices for the system of cassava-based fuel ethanol (CFE) are as follows: transformity is 1.10E+5 sej/J, EYR is 1.07, ELR is 2.55, RER is 0.28, and ESI is 0.42. Compared with the emergy indices of wheat ethanol and corn ethanol, CFE is the most sustainable. CFE is a good alternative to substitute for oil in China. Non-renewable purchased emergy accounts for 71.15% of the whole input emergy. The dependence on non-renewable energy increases environmental degradation, making the system less sustainable relative to systems more dependent on renewable energies. For sustainable development, it is vital to reduce the consumption of non-renewable energy in the production of CFE. (author)

  16. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Directory of Open Access Journals (Sweden)

    Shadi A. Issa

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of "resources-on-demand" and "pay-as-you-go", scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks in which the NGS sequences can be processed independently of one another. We also provide the elastream package, which supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation.

  17. Dermoscopy analysis of RGB-images based on comparative features

    Science.gov (United States)

    Myakinin, Oleg O.; Zakharov, Valery P.; Bratchenko, Ivan A.; Artemyev, Dmitry N.; Neretin, Evgeny Y.; Kozlov, Sergey V.

    2015-09-01

    In this paper, we propose an algorithm for color and texture analysis of dermoscopic images of human skin based on Haar wavelets, Local Binary Patterns (LBP) and histogram analysis. The approach is a modification of the «7-point checklist» clinical method, which is an "absolute" diagnostic method in the sense that it uses only features extracted from the tumor's ROI (Region of Interest), selected manually and/or using a special algorithm. We propose additional features extracted from the same image for a comparative analysis of tumor and healthy skin. We used the Euclidean distance, cosine similarity, and the Tanimoto coefficient as comparison metrics between the color and texture features extracted separately from the tumor's ROI and from the healthy skin's ROI. A classifier for separating melanoma images from other tumors was built with the SVM (Support Vector Machine) algorithm. Classification errors with and without the comparative features between skin and tumor were analyzed, and a significant increase in recognition quality with the comparative features was demonstrated. Moreover, we analyzed two modes (manual and automatic) for selecting the ROI on the tumor and healthy skin areas. We reached 91% sensitivity using the comparative features, in contrast with 77% sensitivity using only the "absolute" method; the specificity was unchanged (94%) in both cases.
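
    For orientation, the three comparison metrics named above can be sketched as follows; the feature vectors are placeholders, and the Haar-wavelet/LBP/histogram feature extraction itself is not reproduced.

        # Sketch of the comparative metrics between tumour-ROI and healthy-skin
        # feature vectors. Vectors below are hypothetical placeholders.
        import numpy as np

        def euclidean(a, b):
            return float(np.linalg.norm(a - b))

        def cosine_similarity(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        def tanimoto(a, b):
            dot = np.dot(a, b)
            return float(dot / (np.dot(a, a) + np.dot(b, b) - dot))

        tumour_features = np.array([0.42, 0.91, 0.13, 0.77])
        skin_features = np.array([0.35, 0.60, 0.20, 0.81])

        print(euclidean(tumour_features, skin_features))
        print(cosine_similarity(tumour_features, skin_features))
        print(tanimoto(tumour_features, skin_features))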

  18. A Web-Based Development Environment for Collaborative Data Analysis

    Science.gov (United States)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  19. Kinematics Analysis Based on Screw Theory of a Humanoid Robot

    Institute of Scientific and Technical Information of China (English)

    MAN Cui-hua; FAN Xun; LI Cheng-rong; ZHAO Zhong-hui

    2007-01-01

    A humanoid robot is a complex dynamic system owing to its idiosyncrasies. This paper aims to provide a mathematical and theoretical foundation for the configuration design and kinematics analysis of a novel humanoid robot, which has a simplified configuration designed for entertainment purposes. The design methods, principle and mechanism are discussed. According to the design goals of this research, there are ten degrees of freedom in the two bionic arms. Modularization, concurrent design and extension theory methods were adopted in the configuration study, and screw theory was introduced into the analysis of the humanoid robot kinematics. Comparisons with other methods show that: 1) only two coordinate frames need to be established in the screw-theory-based kinematics analysis of the humanoid robot; 2) the spatial manipulator Jacobian obtained by using twists and the exponential product formula is succinct and legible; 3) adopting screw theory to solve the humanoid robot arm kinematics avoids singularities; 4) screw theory can resolve the problem of specification insufficiency.
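
    For orientation, the twist/exponential-product kinematics and the spatial manipulator Jacobian referred to in point 2) take the standard screw-theoretic form (quoted from the general screw-theory literature, not from the paper):

        g_{st}(\theta) = e^{\hat{\xi}_1 \theta_1}\, e^{\hat{\xi}_2 \theta_2} \cdots e^{\hat{\xi}_n \theta_n}\, g_{st}(0),
        \qquad
        J^{s}(\theta) = \left[\, \xi_1,\;\; \operatorname{Ad}_{e^{\hat{\xi}_1\theta_1}} \xi_2,\;\; \ldots,\;\; \operatorname{Ad}_{e^{\hat{\xi}_1\theta_1}\cdots\, e^{\hat{\xi}_{n-1}\theta_{n-1}}} \xi_n \,\right]

    where \hat{\xi}_i is the twist of joint i, g_{st}(0) is the reference configuration, and Ad denotes the adjoint transformation.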

  20. Glyph-Based Video Visualization for Semen Analysis

    KAUST Repository

    Duffy, Brian

    2015-08-01

    © 2013 IEEE. The existing efforts in computer assisted semen analysis have been focused on high speed imaging and automated image analysis of sperm motility. This results in a large amount of data, and it is extremely challenging for both clinical scientists and researchers to interpret, compare and correlate the multidimensional and time-varying measurements captured from video data. In this work, we use glyphs to encode a collection of numerical measurements taken at a regular interval and to summarize spatio-temporal motion characteristics using static visual representations. The design of the glyphs addresses the needs for (a) encoding some 20 variables using separable visual channels, (b) supporting scientific observation of the interrelationships between different measurements and comparison between different sperm cells and their flagella, and (c) facilitating the learning of the encoding scheme by making use of appropriate visual abstractions and metaphors. As a case study, we focus this work on video visualization for computer-aided semen analysis, which has a broad impact on both biological sciences and medical healthcare. We demonstrate that glyph-based visualization can serve as a means of external memorization of video data as well as an overview of a large set of spatiotemporal measurements. It enables domain scientists to make scientific observation in a cost-effective manner by reducing the burden of viewing videos repeatedly, while providing them with a new visual representation for conveying semen statistics.

  1. SVM-based glioma grading. Optimization by feature reduction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zoellner, Frank G.; Schad, Lothar R. [University Medical Center Mannheim, Heidelberg Univ., Mannheim (Germany). Computer Assisted Clinical Medicine; Emblem, Kyrre E. [Massachusetts General Hospital, Charlestown, A.A. Martinos Center for Biomedical Imaging, Boston MA (United States). Dept. of Radiology; Harvard Medical School, Boston, MA (United States); Oslo Univ. Hospital (Norway). The Intervention Center

    2012-11-01

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. Best classification accuracy was found using PCA at 85% (sensitivity = 89%, specificity = 84%) when reducing the feature vector from 101 (100-bin rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided similar classification accuracy to literature values (~87%) while reducing the number of features by up to 98%. (orig.)

  2. Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis

    Science.gov (United States)

    Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.

    2015-01-01

    This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in the ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce the development cycle times and costs.
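
    A minimal numpy sketch of the POD step, building a reduced basis from solution snapshots via an SVD and projecting a linear operator onto it, is given below; the snapshot matrix and operator are synthetic placeholders, and the DEIM/TPWL treatment of the nonlinear radiative terms is not reproduced.

        # Sketch of POD: SVD of a snapshot matrix, truncation by energy content,
        # and Galerkin projection of a linear operator onto the reduced basis.
        import numpy as np

        rng = np.random.default_rng(1)
        n_dof, n_snap = 2000, 60
        snapshots = rng.standard_normal((n_dof, n_snap))       # stand-in for T(x, t_k)

        U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
        energy = np.cumsum(s ** 2) / np.sum(s ** 2)
        r = int(np.searchsorted(energy, 0.999)) + 1             # modes for 99.9% energy
        basis = U[:, :r]                                        # POD basis (n_dof x r)

        # Galerkin projection of a linear operator A: A_r = basis^T A basis
        A = -np.eye(n_dof)                                      # placeholder operator
        A_r = basis.T @ A @ basis
        print("reduced dimension:", r, "reduced operator shape:", A_r.shape)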

  3. Structural Optimization of Slender Robot Arm Based on Sensitivity Analysis

    Directory of Open Access Journals (Sweden)

    Zhong Luo

    2012-01-01

    An effective structural optimization method based on sensitivity analysis is proposed to optimize the variable section of a slender robot arm. The structure and operating principle of the polishing robot are introduced first, and its stiffness model is established. Then, a design sensitivity analysis method and a sequential linear programming (SLP) strategy are developed. At the beginning of the optimization, the design sensitivity analysis is applied to select the sensitive design variables, which makes the optimization more efficient and accurate; it is also used to determine the step size, which improves convergence during the optimization process. The design sensitivities are calculated using the finite difference method, and the search for the final optimal structure is performed using the SLP method. Simulation results show that the proposed structure optimization method is effective in enhancing the stiffness of the robot arm, regardless of whether the arm is subjected to a constant force or to variable forces.

  4. GIS-BASED SPATIAL STATISTICAL ANALYSIS OF COLLEGE GRADUATES EMPLOYMENT

    Directory of Open Access Journals (Sweden)

    R. Tang

    2012-07-01

    It is urgently necessary to be aware of the distribution and employment status of college graduates for the proper allocation of human resources and the overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis in studying the distribution and employment status of college graduates, based on data from the 2004–2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method in SPSS was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by the statistical analysis suggest that the specialty that graduates major in has an important impact on the number employed and on the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of the employment status, is a potential tool for human resource development research.

  5. Gis-Based Spatial Statistical Analysis of College Graduates Employment

    Science.gov (United States)

    Tang, R.

    2012-07-01

    It is urgently necessary to understand the distribution and employment status of college graduates for the proper allocation of human resources and the overall arrangement of strategic industries. This study provides empirical evidence on the use of geocoding and spatial analysis to examine the distribution and employment status of college graduates, based on 2004-2008 data from the Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding in ArcGIS, and stepwise multiple linear regression in SPSS was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the number of enterprises in the Wuhan East Lake High and New Technology Development Zone increased dramatically from 2004 to 2008 and tended to spread southeastward. Furthermore, the models built by the statistical analysis suggest that the specialty graduates majored in has an important impact on the number of graduates employed and the number engaged in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  6. A Web-Based Development Environment for Collaborative Data Analysis

    CERN Document Server

    Erdmann, M; Glaser, C; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steggemann, J; Urban, M; Winchen, T

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and a...

  7. Web Based Image Retrieval System Using Color, Texture and Shape Analysis: Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Amol P Bhagat

    2013-09-01

    Full Text Available The internet is one of the best media to disseminate scientific and technological research results [1, 2, 6]. This work deals with the implementation of a web-based extensible architecture that is easily integrable with applications written in different languages and linkable with different data sources. The architecture developed is expandable and modular; its client-server functionality makes it easy to build web applications that can run in any Internet browser without compatibility problems regarding the platform, programs or operating system installed. This paper presents the implementation of Content Based Image Retrieval using different methods of color, texture and shape analysis. The primary objective is to compare the different methods of image analysis.

  8. Dynamic chest image analysis: model-based pulmonary perfusion analysis with pyramid images

    Science.gov (United States)

    Liang, Jianming; Haapanen, Arto; Jaervi, Timo; Kiuru, Aaro J.; Kormano, Martti; Svedstrom, Erkki; Virkki, Raimo

    1998-07-01

    The aim of the study 'Dynamic Chest Image Analysis' is to develop computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected at different phases of the respiratory/cardiac cycles in a short period of time. We have proposed a framework for ventilation study with an explicit ventilation model based on pyramid images. In this paper, we extend the framework to pulmonary perfusion study. A perfusion model and the truncated pyramid are introduced. The perfusion model aims at extracting accurate, geographic perfusion parameters, and the truncated pyramid helps in understanding perfusion at multiple resolutions and speeding up the convergence process in optimization. Three cases are included to illustrate the experimental results.

  9. Analysis on electric energy measuring method based on multi-resolution analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xiao-bing; CUI Jia-rui; LIANG Yuan-hua; WANG Mu-kun

    2006-01-01

    With the widespread use of non-linear and impact loads, many non-stationary stochastic signals such as harmonics, inter-harmonics and impulse signals are introduced into the electric network, and these signals affect the accuracy of electric energy measurement. Traditional methods such as Fourier analysis can be applied efficiently to stationary stochastic signals, but are of little use for non-stationary ones. In light of this, the form of the electric network signals in the wavelet domain is discussed in this paper, and a measurement method for active power based on multi-resolution analysis of the stochastic process is presented. This method has a wider application scope than traditional Fourier analysis and offers good reference and practical value for improving existing electric energy measurement.
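
    For an orthogonal wavelet the time-domain inner product of voltage and current equals the sum of the inner products of their wavelet coefficients, which is what allows active power to be split into per-band contributions as described above. The sketch below illustrates this with PyWavelets on a synthetic 50 Hz waveform with a fifth harmonic; the waveform, wavelet choice and sampling parameters are assumptions, not the paper's measurement scheme.

        import numpy as np
        import pywt

        # Active-power estimation from a multi-resolution (wavelet) decomposition of
        # voltage and current. Waveform and parameters are illustrative assumptions.

        fs, f0, N = 6400, 50.0, 6400
        t = np.arange(N) / fs
        v = 311.0 * np.sin(2 * np.pi * f0 * t)
        i = 10.0 * np.sin(2 * np.pi * f0 * t - np.pi / 6) \
            + 2.0 * np.sin(2 * np.pi * 5 * f0 * t)              # add a 5th harmonic

        level = 5
        cv = pywt.wavedec(v, "db4", mode="periodization", level=level)
        ci = pywt.wavedec(i, "db4", mode="periodization", level=level)

        # Per-band active power; for an orthogonal transform the bands sum to the
        # time-domain value P = mean(v * i).
        band_power = [np.dot(a, b) / N for a, b in zip(cv, ci)]
        print("per-band power [A5, D5..D1]:", np.round(band_power, 2))
        print("wavelet-domain total:", round(sum(band_power), 2))
        print("time-domain reference:", round(np.dot(v, i) / N, 2))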

  10. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  11. A practical approach to object based requirements analysis

    Science.gov (United States)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  12. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  13. Aerodynamic flight evaluation analysis and data base update

    Science.gov (United States)

    Boyle, W. W.; Miller, M. S.; Wilder, G. O.; Reheuser, R. D.; Sharp, R. S.; Bridges, G. I.

    1989-01-01

    Research was conducted to determine the feasibility of replacing the Solid Rocket Boosters on the existing Space Shuttle Launch Vehicle (SSLV) with Liquid Rocket Boosters (LRB). As a part of the LRB selection process, a series of wind tunnel tests were conducted along with aero studies to determine the effects of different LRB configurations on the SSLV. Final results were tabulated into increments and added to the existing SSLV data base. The research conducted in this study was taken from a series of wind tunnel tests conducted at Marshall's 14-inch Trisonic Wind Tunnel. The effects on the axial force (CAF), normal force (CNF), pitching moment (CMF), side force (CY), wing shear force (CSR), wing torque moment (CTR), and wing bending moment (CBR) coefficients were investigated for a number of candidate LRB configurations. The aero effects due to LRB protuberances, ET/LRB separation distance, and aft skirts were also gathered from the tests. Analysis was also conducted to investigate the base pressure and plume effects due to the new booster geometries. The test results found in Phases 1 and 2 of wind tunnel testing are discussed and compared. Preliminary LRB lateral/directional data results and trends are given. The protuberance and gap/skirt effects are discussed. The base pressure/plume effects study is discussed and results are given.

  14. A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis

    Directory of Open Access Journals (Sweden)

    Zhiming Song

    2015-01-01

    Full Text Available As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m-1)-dimensional manifold in the decision space under some mild conditions. However, how to utilize the regularity to design multiobjective optimization algorithms has become the research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, and the centroid of the probability distribution is (m-1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on the nondominated sorting is used to choose the individuals to the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The result shows that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper.

  15. A novel multiobjective evolutionary algorithm based on regression analysis.

    Science.gov (United States)

    Song, Zhiming; Wang, Maocai; Dai, Guangming; Vasile, Massimiliano

    2015-01-01

    As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m - 1)-dimensional manifold in the decision space under some mild conditions. However, how to utilize the regularity to design multiobjective optimization algorithms has become the research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, and the centroid of the probability distribution is (m - 1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on the nondominated sorting is used to choose the individuals to the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The result shows that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper.

  16. Model-free Kinetic Study of Phenol Formaldehyde Resin Cure

    Institute of Scientific and Technical Information of China (English)

    叶晓川; 曾黎明; 张超; 陈雷

    2012-01-01

    The curing behavior of novolac type phenolic resin was studied through differential scanning calorimetry (DSC) using model-free (isoconversional) kinetic methods. In the model-free kinetic study, the Friedman, Flynn-Wall-Ozawa, and Kissinger-Akahira-Sunose methods were used to calculate the curing activation energy. The dependence curve of activation energy on conversion and the piecewise model fitting results both displayed a change in the kinetic reaction mechanism from the autocatalytic to the n-order regime. According to the isothermal prediction results, the model-free kinetic methods show excellent prediction ability. Also, the Flynn-Wall-Ozawa and Kissinger-Akahira-Sunose methods were found to give similar results in analyzing the curing behavior of this novolac resin.
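
    All three isoconversional methods named above reduce to a linear regression at each conversion level; for the Kissinger-Akahira-Sunose method, ln(beta/T^2) is regressed against 1/T across the heating rates and the slope gives -Ea/R. The sketch below shows that single step on synthetic numbers generated for checking; they are not DSC data from this study.

        import numpy as np
        from scipy.stats import linregress

        # Kissinger-Akahira-Sunose (KAS) step at one conversion level:
        # regress ln(beta / T^2) on 1/T over the heating rates; slope = -Ea/R.
        # The data below are synthetic stand-ins, not DSC measurements.

        R = 8.314  # J/(mol K)

        def kas_activation_energy(betas, T_alpha):
            """Ea in J/mol from heating rates betas and the temperatures T_alpha (K)
            at which a fixed conversion level is reached."""
            y = np.log(betas / T_alpha**2)
            slope, intercept, r, p, se = linregress(1.0 / T_alpha, y)
            return -slope * R

        # Synthetic check: temperatures picked freely, heating rates generated so the
        # points satisfy the KAS relation exactly for an assumed Ea of 80 kJ/mol.
        Ea_true, C = 80e3, 10.0
        T_alpha = np.array([430.0, 440.0, 450.0, 460.0])
        betas = T_alpha**2 * np.exp(C - Ea_true / (R * T_alpha))

        print("estimated Ea [kJ/mol]:",
              round(kas_activation_energy(betas, T_alpha) / 1e3, 1))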

  17. Design of Model-Free Adaptive Wide Area Damping Controller

    Institute of Scientific and Technical Information of China (English)

    李建; 赵艺; 陆超; 庞晓艳

    2014-01-01

    The operating modes and structure of power systems are increasingly time-varying, and because the parameters of a conventional wide-area power system stabilizer (CWAPSS) are fixed, its control performance cannot be guaranteed under different operating conditions. This paper therefore applies a model-free adaptive control (MFAC) algorithm to design an adaptive wide-area damping controller. The basic principle of MFAC is discussed first; the relationship between the parameters of a WAPSS and those of MFAC is then analyzed and compared, and a parameter-setting method for the MFAC algorithm is given; finally, the effectiveness of the MFAC algorithm is validated through simulation of a four-machine two-area system. The simulation results show that when the power system structure changes, the model-free adaptive wide-area damping controller can adjust its parameters online, and its control performance is better than that of the CWAPSS.
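
    Compact-form MFAC keeps a single pseudo-partial-derivative estimate that is updated from input/output increments and then used in the control law, which is what lets the controller parameters adjust online as described above. The sketch below runs that loop on a toy first-order plant; the plant, gains and set-point are illustrative assumptions and are unrelated to the four-machine two-area study.

        import numpy as np

        # Minimal compact-form MFAC loop (SISO): pseudo-partial-derivative (PPD)
        # estimation followed by the control-law update. Plant, gains and set-point
        # are assumptions for illustration only.

        eta, mu = 0.5, 1.0        # PPD estimator gains
        rho, lam = 0.6, 1.0       # controller gains
        steps, y_ref = 200, 1.0

        def plant(y, u):
            """Toy unknown plant seen by the controller only through measurements."""
            return 0.8 * y + 0.3 * u

        y = np.zeros(steps + 1)
        u = np.zeros(steps + 1)
        phi = 1.0                 # initial PPD estimate

        for k in range(1, steps):
            du = u[k - 1] - u[k - 2] if k >= 2 else 0.0
            dy = y[k] - y[k - 1]
            # PPD estimation from the latest input/output increments.
            if abs(du) > 1e-6:
                phi = phi + eta * du / (mu + du**2) * (dy - phi * du)
            if abs(phi) < 1e-5:
                phi = 1.0         # reset keeps the estimate well-posed
            # Control-law update toward the set-point.
            u[k] = u[k - 1] + rho * phi / (lam + phi**2) * (y_ref - y[k])
            y[k + 1] = plant(y[k], u[k])

        print("final output:", round(y[steps], 3), "(set-point", y_ref, ")")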

  18. GPU-based Integration with Application in Sensitivity Analysis

    Science.gov (United States)

    Atanassov, Emanouil; Ivanovska, Sofiya; Karaivanova, Aneta; Slavov, Dimitar

    2010-05-01

    The presented work is an important part of the grid application MCSAES (Monte Carlo Sensitivity Analysis for Environmental Studies) whose aim is to develop an efficient Grid implementation of a Monte Carlo based approach for sensitivity studies in the domains of Environmental modelling and Environmental security. The goal is to study the damaging effects that can be caused by high pollution levels (especially effects on human health), when the main modeling tool is the Danish Eulerian Model (DEM). Generally speaking, sensitivity analysis (SA) is the study of how the variation in the output of a mathematical model can be apportioned to, qualitatively or quantitatively, different sources of variation in the input of a model. One of the important classes of methods for Sensitivity Analysis is Monte Carlo based, first proposed by Sobol, and then developed by Saltelli and his group. In MCSAES the general Saltelli procedure has been adapted for SA of the Danish Eulerian model. In our case we consider as factors the constants determining the speeds of the chemical reactions in the DEM and as output a certain aggregated measure of the pollution. Sensitivity simulations lead to huge computational tasks (systems with up to 4 × 10^9 equations at every time-step, and the number of time-steps can be more than a million) which motivates its grid implementation. The MCSAES grid implementation scheme includes two main tasks: (i) Grid implementation of the DEM, (ii) Grid implementation of the Monte Carlo integration. In this work we present our new developments in the integration part of the application. We have developed an algorithm for GPU-based generation of scrambled quasirandom sequences which can be combined with the CPU-based computations related to the SA. Owen first proposed scrambling of the Sobol sequence through permutation in a manner that improves the convergence rates. Scrambling is necessary not only for error analysis but for parallel implementations. Good scrambling is
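
    The integration part described above can be tried out on the CPU with SciPy's quasi-Monte Carlo module, which provides Owen-type scrambled Sobol points; the sketch below compares plain Monte Carlo and scrambled Sobol sampling on a smooth toy integrand. The integrand, dimension and sample sizes are assumptions and have nothing to do with the Danish Eulerian Model, and the GPU generator from the paper is not reproduced.

        import numpy as np
        from scipy.stats import qmc

        # Plain Monte Carlo versus Owen-scrambled Sobol sampling on a toy integrand
        # whose exact integral over the unit cube is (e - 1)**d.

        def integrand(x):
            return np.prod(np.exp(x), axis=1)

        d, m = 4, 12                                     # dimension, 2**m points
        exact = (np.e - 1.0) ** d

        rng = np.random.default_rng(42)
        plain_mc = integrand(rng.random((2**m, d))).mean()

        sobol = qmc.Sobol(d=d, scramble=True, seed=42)   # scrambled Sobol points
        qmc_est = integrand(sobol.random_base2(m=m)).mean()

        print("exact        :", round(exact, 6))
        print("plain MC     :", round(plain_mc, 6), " error", abs(plain_mc - exact))
        print("scrambled QMC:", round(qmc_est, 6), " error", abs(qmc_est - exact))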

  19. Improvements on Particle Tracking Velocimetry: model-free calibration and noiseless measurement of second order statistics of the velocity field

    CERN Document Server

    Machicoane, Nathanael; Bourgoin, Mickael; Aliseda, Alberto; Volk, Romain

    2016-01-01

    This article describes two independent developments aimed at improving the Particle Tracking Method for measurements of flow or particle velocities. First, a stereoscopic multicamera calibration method that does not require any optical model is described and evaluated. We show that this new calibration method gives better results than the most commonly-used technique, based on the Tsai camera/optics model. Additionally, the method uses a simple interpolant to compute the transformation matrix and it is trivial to apply for any experimental fluid dynamics visualization setup. The second contribution proposes a solution to remove noise from Eulerian measurements of velocity statistics obtained from Particle Tracking velocimetry, without the need for filtering and/or windowing. The novel method presented here is based on recomputing particle displacement measurements from two consecutive frames for multiple different time-step values between frames. We show the successful application of this new technique to re...

  20. Subpathway Analysis based on Signaling-Pathway Impact Analysis of Signaling Pathway.

    Directory of Open Access Journals (Sweden)

    Xianbin Li

    Full Text Available Pathway analysis is a common approach to gain insight from biological experiments. Signaling-pathway impact analysis (SPIA) is one such method and combines both the classical enrichment analysis and the actual perturbation on a given pathway. Because this method focuses on a single pathway, its resolution generally is not very high because the differentially expressed genes may be enriched in a local region of the pathway. In the present work, to identify cancer-related pathways, we incorporated a recent subpathway analysis method into the SPIA method to form the "sub-SPIA method." The original subpathway analysis uses the k-clique structure to define a subpathway. However, it is not sufficiently flexible to capture subpathways with complex structure and usually results in many overlapping subpathways. We therefore propose using the minimal-spanning-tree structure to find a subpathway. We apply this approach to colorectal cancer and lung cancer datasets, and our results show that sub-SPIA can identify many significant pathways associated with each specific cancer that other methods miss. Based on the entire pathway network in the Kyoto Encyclopedia of Genes and Genomes, we find that the pathways identified by sub-SPIA not only have the largest average degree, but also are more closely connected than those identified by other methods. This result suggests that the abnormality signal propagating through them might be responsible for the specific cancer or disease.
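
    The structural idea is that a spanning tree over the relevant region of a pathway graph yields a single connected, non-redundant subpathway, in contrast to overlapping k-cliques. The toy sketch below shows one way to realise that with networkx by connecting the differentially expressed genes through shortest paths and reducing the result to a minimum spanning tree; the graph, gene labels and this particular construction are illustrative assumptions, and the sub-SPIA procedure in the paper differs in detail.

        import networkx as nx

        # Toy spanning-tree subpathway: connect the differentially expressed (DE)
        # genes through shortest paths in the pathway graph, then reduce that region
        # to a minimum spanning tree. Graph and gene labels are made up.

        pathway = nx.Graph()
        pathway.add_weighted_edges_from([
            ("A", "B", 1.0), ("B", "C", 1.0), ("C", "D", 2.0),
            ("B", "E", 1.5), ("E", "F", 1.0), ("D", "F", 1.0), ("A", "G", 3.0),
        ])
        de_genes = {"A", "D", "F"}

        region = set(de_genes)
        for u in de_genes:
            for v in de_genes:
                if u < v:
                    region.update(nx.shortest_path(pathway, u, v, weight="weight"))

        subpathway = nx.minimum_spanning_tree(pathway.subgraph(region), weight="weight")
        print("subpathway nodes:", sorted(subpathway.nodes))
        print("subpathway edges:", sorted(subpathway.edges))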

  1. Partner Selection Analysis and System Development Based on Gray Relation Analysis for an Agile Virtual Enterprise

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    This paper analyzes the state of the art of partner selection and enumerates the advantages of partner selection based on gray relation analysis compared with other partner selection algorithms. Furthermore, a partner selection system based on gray relation analysis for an Agile Virtual Enterprise (AVE) is analyzed and designed according to the definition and characteristics of the AVE. Following the J2EE model, the architecture of the partner selection system is put forward and the system is developed using JSP, EJB and SQL Server. The paper lays emphasis on the gray relational mathematical model, the AVE evaluation infrastructure, a core algorithm for partner selection and a multi-layer gray relation selection process.
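
    The core calculation behind gray relation analysis is short: normalise the decision matrix, measure each candidate's deviation from an ideal reference series, convert the deviations to grey relational coefficients and aggregate them with weights into a relational degree used for ranking. The sketch below walks through those steps on made-up numbers; the decision matrix, criteria and weights are assumptions, not the AVE evaluation infrastructure from the paper.

        import numpy as np

        # Gray relational degree for ranking candidate partners. Rows are candidates,
        # columns are benefit-type criteria; all values and weights are illustrative.

        X = np.array([
            [0.83, 0.90, 0.70, 0.95],
            [0.90, 0.75, 0.85, 0.80],
            [0.78, 0.88, 0.92, 0.70],
        ])
        weights = np.array([0.3, 0.3, 0.2, 0.2])
        rho = 0.5                                   # distinguishing coefficient

        # 1) Normalise each criterion to [0, 1] (larger is better).
        norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

        # 2) Deviations from the ideal reference series of all ones.
        delta = np.abs(1.0 - norm)
        d_min, d_max = delta.min(), delta.max()

        # 3) Grey relational coefficients and weighted relational degrees.
        xi = (d_min + rho * d_max) / (delta + rho * d_max)
        degree = xi @ weights
        print("relational degrees:", np.round(degree, 3))
        print("best candidate    : candidate", int(np.argmax(degree)) + 1)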

  2. Rasch model based analysis of the Force Concept Inventory

    Directory of Open Access Journals (Sweden)

    Maja Planinic

    2010-03-01

    Full Text Available The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight in the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from the large-scale research conducted in 2006-07, which investigated Croatian high school students’ conceptual understanding of mechanics on a representative sample of 1676 students (age 17–18 years). The instrument used in research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)%, indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample (N=141, average FCI score of 64.5%) of first year students enrolled in introductory physics course at University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further examination.

  3. Rasch model based analysis of the Force Concept Inventory

    Science.gov (United States)

    Planinic, Maja; Ivanjek, Lana; Susac, Ana

    2010-06-01

    The Force Concept Inventory (FCI) is an important diagnostic instrument which is widely used in the field of physics education research. It is therefore very important to evaluate and monitor its functioning using different tools for statistical analysis. One of such tools is the stochastic Rasch model, which enables construction of linear measures for persons and items from raw test scores and which can provide important insight in the structure and functioning of the test (how item difficulties are distributed within the test, how well the items fit the model, and how well the items work together to define the underlying construct). The data for the Rasch analysis come from the large-scale research conducted in 2006-07, which investigated Croatian high school students’ conceptual understanding of mechanics on a representative sample of 1676 students (age 17-18 years). The instrument used in research was the FCI. The average FCI score for the whole sample was found to be (27.7±0.4)% , indicating that most of the students were still non-Newtonians at the end of high school, despite the fact that physics is a compulsory subject in Croatian schools. The large set of obtained data was analyzed with the Rasch measurement computer software WINSTEPS 3.66. Since the FCI is routinely used as pretest and post-test on two very different types of population (non-Newtonian and predominantly Newtonian), an additional predominantly Newtonian sample ( N=141 , average FCI score of 64.5%) of first year students enrolled in introductory physics course at University of Zagreb was also analyzed. The Rasch model based analysis suggests that the FCI has succeeded in defining a sufficiently unidimensional construct for each population. The analysis of fit of data to the model found no grossly misfitting items which would degrade measurement. Some items with larger misfit and items with significantly different difficulties in the two samples of students do require further examination

  4. Visual traffic jam analysis based on trajectory data.

    Science.gov (United States)

    Wang, Zuchao; Lu, Min; Yuan, Xiaoru; Zhang, Junping; van de Wetering, Huub

    2013-12-01

    In this work, we present an interactive system for visual analysis of urban traffic congestion based on GPS trajectories. For these trajectories we develop strategies to extract and derive traffic jam information. After cleaning the trajectories, they are matched to a road network. Subsequently, traffic speed on each road segment is computed and traffic jam events are automatically detected. Spatially and temporally related events are concatenated in, so-called, traffic jam propagation graphs. These graphs form a high-level description of a traffic jam and its propagation in time and space. Our system provides multiple views for visually exploring and analyzing the traffic condition of a large city as a whole, on the level of propagation graphs, and on road segment level. Case studies with 24 days of taxi GPS trajectories collected in Beijing demonstrate the effectiveness of our system.

  5. Analysis of quantitative pore features based on mathematical morphology

    Institute of Scientific and Technical Information of China (English)

    QI Heng-nian; CHEN Feng-nong; WANG Hang-jun

    2008-01-01

    Wood identification is a basic technique of wood science and industry. Pore features are among the most important identification features for hardwoods. We used a method based on the analysis of quantitative pore features, which differs from traditional qualitative methods. We apply mathematical morphology operations such as dilation and erosion, opening and closing transformations of wood cross-sections, image repair, noise filtering and edge detection to segment the pores from their background. The mean square errors (MSE) of the pores were then computed to describe the pore distribution. Our experiment shows that the pore features are easily classified into the same three basic types used by traditional qualitative methods, but by means of the pore MSE. This quantitative method improves wood identification considerably.
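
    The morphological steps named above (dilation/erosion-style cleanup, segmentation of pores from the background, and a mean-square-error summary of the pore distribution) can be illustrated in a few lines of scikit-image. The synthetic cross-section, threshold and structuring-element size below are assumptions, and the dispersion measure is only a stand-in for the paper's MSE-based classification.

        import numpy as np
        from skimage import measure, morphology

        # Segment dark pores from a synthetic "cross-section", clean them with a
        # morphological opening, and summarise their spatial distribution.

        rng = np.random.default_rng(1)
        image = rng.normal(0.7, 0.05, size=(256, 256))            # bright background
        yy, xx = np.mgrid[0:256, 0:256]
        for cy, cx, r in [(60, 60, 9), (60, 200, 7), (180, 90, 8), (200, 210, 6)]:
            image[(yy - cy) ** 2 + (xx - cx) ** 2 < r ** 2] = 0.2  # dark pores

        # 1) Threshold dark regions, then remove speckle with a binary opening.
        pores = image < 0.45
        pores = morphology.binary_opening(pores, morphology.disk(2))

        # 2) Label the pores and collect simple per-pore statistics.
        props = measure.regionprops(measure.label(pores))
        areas = np.array([p.area for p in props])
        centroids = np.array([p.centroid for p in props])

        # 3) Dispersion of pore positions (a stand-in for the paper's pore MSE).
        mse = np.mean(np.sum((centroids - centroids.mean(axis=0)) ** 2, axis=1))
        print("pore count:", len(props), " mean area:", round(float(areas.mean()), 1))
        print("centroid dispersion (MSE):", round(float(mse), 1))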

  6. GIS Based Spatial Data Analysis for Landslide Susceptibility Mapping

    Institute of Scientific and Technical Information of China (English)

    S.Sarkar; D.P.Kanungo; A.K.Patra; Pushpendra Kumar

    2008-01-01

    A landslide susceptibility map delineates the potential zones for landslide occurrence. The paper presents a statistical approach through spatial data analysis in GIS for landslide susceptibility mapping in parts of the Sikkim Himalaya. Six important causative factors for landslide occurrence were selected and the corresponding thematic data layers were prepared in GIS. Topographic maps, satellite images, field data and published maps constitute the input data for thematic layer preparation. Numerical weights for the different categories of these factors were determined based on a statistical approach, and the weighted thematic layers were integrated in the GIS environment to generate the landslide susceptibility map of the area. The map classifies the area into five landslide susceptibility zones: very high, high, moderate, low and very low. It was validated using the existing landslide distribution in the area.

  7. Windows Volatile Memory Forensics Based on Correlation Analysis

    Directory of Open Access Journals (Sweden)

    Xiaolu Zhang

    2014-03-01

    Full Text Available In this paper, we present an integrated memory forensic solution for multiple Windows memory images. The method computes the degree of correlation among the processes in the volatile memory images and uncovers the hidden clues behind computer events, which are usually difficult to obtain and easily missed by forensic investigators when only a single memory image is analyzed. To test its validity, we performed an experiment on the memory images of two hosts containing criminal incidents. According to the experimental results, the event chains reconstructed by our method are similar to the actual actions in the criminal scene. Investigators can review the digital crime scenario contained in the data set by analyzing the experimental results. This paper aims at finding actions with illegal intent and at making memory analysis less dependent on the operating system and on relevant experts.

  8. Clinical gait data analysis based on Spatio-Temporal features

    CERN Document Server

    Katiyar, Rohit

    2010-01-01

    Analysing human gait has found considerable interest in recent computer vision research. So far, however, contributions to this topic exclusively dealt with the tasks of person identification or activity recognition. In this paper, we consider a different application for gait analysis and examine its use as a means of deducing the physical well-being of people. The proposed method is based on transforming the joint motion trajectories using wavelets to extract spatio-temporal features which are then fed as input to a vector quantiser; a self-organising map for classification of walking patterns of individuals with and without pathology. We show that our proposed algorithm is successful in extracting features that successfully discriminate between individuals with and without locomotion impairment.

  9. Descriptor Based Analysis of Digital 3D Shapes

    DEFF Research Database (Denmark)

    Welnicka, Katarzyna

    Analysis and processing of 3D digital shapes is a significant research area with numerous medical, industrial, and entertainment applications which has gained enormously in importance as optical scanning modalities have started to make acquired 3D geometry commonplace. The area holds many challenges. One such challenge, which is addressed in this thesis, is to develop computational methods for classifying shapes which are in agreement with the human way of understanding and classifying shapes. In this dissertation we first present a shape descriptor based on the process of diffusion ...; in conjunction with the method of Reeb graphs for skeletonization, it is an effective tool for generating scale dependent skeletons of shapes represented as 3D triangle meshes. The second part of the thesis aims at capturing the style phenomenon. The style of an object is easily recognized by humans...

  10. Dependence Analysis Based on Dynamic Slicing for Debugging

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Dynamic program slicing is an effective technique for narrowing the errors to the relevant parts of a program when debugging. Given a slicing criterion, the dynamic slice contains only those statements that actually affect the variables in the slicing criterion. This paper proposes a dynamic slicing method based on static dependence analysis. It uses the program dependence graph and other static information to reduce the information needed to be traced during program execution. Thus, the efficiency is dramatically improved while the precision is not depressed. The slicing criterion is modified to fit for debugging. It consists of file-name and the line number at which the statement is.

  11. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations like simplified query languages, out-of-date information or arbitrary results sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web pages content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of the Web pages content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  12. GIS application on spatial landslide analysis using statistical based models

    Science.gov (United States)

    Pradhan, Biswajeet; Lee, Saro; Buchroithner, Manfred F.

    2009-09-01

    This paper presents the assessment results of three spatially based probabilistic models using Geoinformation Techniques (GIT) for landslide susceptibility analysis on Penang Island in Malaysia. Landslide locations within the study area were identified by interpreting aerial photographs and satellite images, supported by field surveys. Maps of the topography, soil type, lineaments and land cover were constructed from the spatial data sets. Ten landslide-related factors were extracted from the spatial database, and the frequency ratio, fuzzy logic, and bivariate logistic regression coefficients of each factor were computed. Finally, landslide susceptibility maps were drawn for the study area using the frequency ratio, fuzzy logic and bivariate logistic regression models. For verification, the results of the analyses were compared with actual landslide locations in the study area. The verification results show that the bivariate logistic regression model provides slightly higher prediction accuracy than the frequency ratio and fuzzy logic models.

  13. Similarity theory based method for MEMS dynamics analysis

    Institute of Scientific and Technical Information of China (English)

    LI Gui-xian; PENG Yun-feng; ZHANG Xin

    2008-01-01

    A new method for MEMS dynamics analysis is presented, based on similarity theory. With this method, the similarity of two systems can be captured in terms of physical quantities and governing equations across different energy fields, and the unknown dynamic characteristics of one system can then be analyzed according to the similar ones of the other system. The possibility of establishing a pair of similar systems between MEMS and other energy systems is also discussed based on the equivalence between mechanics and electrics, and the feasibility of applying this method is then proven by an example, in which the squeezed damping force in MEMS and the current of its equivalent circuit established by this method are compared.

  14. Nonlinear fault diagnosis method based on kernel principal component analysis

    Institute of Scientific and Technical Information of China (English)

    Yan Weiwu; Zhang Chunkai; Shao Huihe

    2005-01-01

    To ensure that a system runs in working order, fault detection and diagnosis play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, the essential information of the nonlinear system extracted by KPCA is used to construct a KPCA model of the system under normal working conditions. New data are then projected onto the KPCA model; when the new data are incompatible with the model, it can be concluded that the nonlinear system is out of its normal working condition. The proposed method was applied to fault diagnosis of rolling bearings. Simulation results show that it provides an effective means of fault detection and diagnosis for nonlinear systems.
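
    A compact way to realise the monitoring scheme described above is to fit kernel PCA on normal-condition data, score new samples by their input-space reconstruction error, and flag samples whose error exceeds a quantile threshold learned from normal data. The sketch below does this with scikit-learn; the synthetic data, kernel width, ridge parameter and threshold rule are assumptions, not the rolling-bearing study.

        import numpy as np
        from sklearn.decomposition import KernelPCA

        # KPCA-based fault detection: fit on normal data, use the reconstruction
        # error (squared prediction error, SPE) as the monitoring statistic.

        rng = np.random.default_rng(0)

        def normal_batch(n):
            """Nonlinear 'normal operation' data on a noisy parabola."""
            t = rng.uniform(-1, 1, n)
            return np.column_stack([t, t**2 + 0.05 * rng.normal(size=n)])

        train = normal_batch(400)
        kpca = KernelPCA(n_components=2, kernel="rbf", gamma=5.0,
                         alpha=1e-3, fit_inverse_transform=True).fit(train)

        def spe(X):
            """Squared prediction error per sample."""
            X_hat = kpca.inverse_transform(kpca.transform(X))
            return np.sum((X - X_hat) ** 2, axis=1)

        threshold = np.quantile(spe(normal_batch(400)), 0.99)

        test_normal = normal_batch(5)
        test_fault = test_normal + np.array([0.0, 0.8])   # pushed off the manifold
        print("threshold  :", round(float(threshold), 4))
        print("normal SPE :", np.round(spe(test_normal), 4))
        print("faulty SPE :", np.round(spe(test_fault), 4))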

  15. Fuzzy MCDM Based on Fuzzy Relational Degree Analysis

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents a new fuzzy multiple criteria (both qualitative and quantitative) decision-making (MCDM) method based on fuzzy relational degree analysis. The concepts of fuzzy set theory are used to construct a weighted suitability decision matrix to evaluate the weighted suitability of different alternatives versus various criteria. The positive ideal solution and negative ideal solution are then obtained by using a method of ranking fuzzy numbers, and the fuzzy relational degrees of different alternatives versus positive ideal solution and negative ideal solution are calculated by using the proposed arithmetic. Finally, the relative relational degrees of various alternatives versus positive ideal solution are ranked to determine the best alternative. A numerical example is provided to illustrate the proposed method at the end of this paper.

  16. Management of Microbiologically Influenced Corrosion in Risk Based Inspection analysis

    DEFF Research Database (Denmark)

    Skovhus, Torben Lund; Hillier, Elizabeth; Andersen, Erlend S.

    2016-01-01

    ... in the offshore industry as a means to justify the inspection strategy adopted. The RBI analysis is a decision-making technique that enables asset managers to identify the risk related to failure of their most critical systems and components, with an effect on safety, environmental and business related issues. ... Microbiologically Influenced Corrosion (MIC) is a degradation mechanism that has received increased attention from corrosion engineers and asset operators in the past decades. In this paper, the most recent models that have been developed in order to assess the impact of MIC on asset integrity will be presented ... and discussed. From a risk perspective, MIC is not satisfactorily assessed by the current models and the models lack a proper view of the MIC threat. Therefore, a review of known parameters that affect MIC is presented. The mapping and identification of parameters is based on the review of past models...

  17. UNRAVELING ECOTOURISM PRACTICE:PROBLEM ANALYSIS BASED ON STAKEHOLDERS

    Institute of Scientific and Technical Information of China (English)

    LIU Xue-mei; BAO Ji-gang

    2004-01-01

    Despite the considerable literature defining what Ecotourism is or should be, it is experiencing various practices with different features, and the term "Ecotourism" is now applied to almost all tourism activities that are based on nature. Faced with this flood of unqualified Ecotourism, it is necessary to put forward professional requirements. The present writer holds that the key to the realization of rigorous Ecotourism chiefly lies in the relationships among the different interest groups involved in it. The focus of this paper is therefore on analyzing the interest relations between those stakeholders, which include local government, tour operators, local residents and eco-tourists, and thus on helping to find out what is wrong in unqualified Ecotourism and the roots of those problems.

  18. Virtual Estimator for Piecewise Linear Systems Based on Observability Analysis

    Directory of Open Access Journals (Sweden)

    Ilse Cervantes

    2013-02-01

    Full Text Available This article proposes a virtual sensor for piecewise linear systems based on an observability analysis that is a function of a commutation law related to the system's output. This virtual sensor is also known as a state estimator. In addition, it presents a detector of the active mode when the commutation sequences of the linear subsystems are arbitrary and unknown. To this end, the article proposes a set of virtual estimators that discern the commutation paths of the system and allow its output to be estimated. A methodology for testing the observability of discrete-time piecewise linear systems is also proposed. An academic example is presented to show the results obtained.

  19. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the vulnerability conception, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  20. Architecture Analysis of an FPGA-Based Hopfield Neural Network

    Directory of Open Access Journals (Sweden)

    Miguel Angelo de Abreu de Sousa

    2014-01-01

    Full Text Available Interconnections between electronic circuits and neural computation have been a strongly researched topic in the machine learning field in order to approach several practical requirements, including decreasing training and operation times in high performance applications and reducing cost, size, and energy consumption for autonomous or embedded developments. Field programmable gate array (FPGA) hardware shows some inherent features typically associated with neural networks, such as parallel processing, modular executions, and dynamic adaptation, and works on different types of FPGA-based neural networks have been presented in recent years. This paper aims to address different aspects of architectural characteristics analysis on a Hopfield Neural Network implemented in FPGA, such as maximum operating frequency and chip-area occupancy according to the network capacity. Also, the FPGA implementation methodology, which does not employ multipliers in the architecture developed for the Hopfield neural model, is presented in detail.
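
    For reference, the network dynamics whose hardware mapping is analysed above are compact: Hebbian weights store the patterns and each neuron repeatedly takes the sign of its weighted input until the state stops changing. The sketch below is a tiny software model of those two steps; the stored patterns and network size are made up and say nothing about the FPGA frequency or area figures studied in the paper.

        import numpy as np

        # Tiny software reference for Hopfield dynamics: Hebbian storage plus
        # asynchronous sign updates. Patterns and size are illustrative only.

        rng = np.random.default_rng(3)
        patterns = np.array([
            [1, -1, 1, -1, 1, -1, 1, -1],
            [1, 1, 1, 1, -1, -1, -1, -1],
        ])
        n = patterns.shape[1]

        # Hebbian learning rule with zeroed self-connections.
        W = sum(np.outer(p, p) for p in patterns) / n
        np.fill_diagonal(W, 0.0)

        def recall(state, sweeps=10):
            """Asynchronous updates until convergence (or a sweep limit)."""
            s = state.copy()
            for _ in range(sweeps):
                changed = False
                for i in rng.permutation(n):
                    new = 1 if W[i] @ s >= 0 else -1
                    changed = changed or (new != s[i])
                    s[i] = new
                if not changed:
                    break
            return s

        noisy = patterns[0].copy()
        noisy[[0, 1]] *= -1                 # corrupt two bits of the first pattern
        print("recalled == stored:", np.array_equal(recall(noisy), patterns[0]))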

  1. A Frame-Based Analysis of Synaesthetic Metaphors

    Directory of Open Access Journals (Sweden)

    Hakan Beseoglu

    2008-08-01

    Full Text Available The aim of this paper is to use a frame-based account to explain some empirical findings regarding the accessibility of synaesthetic metaphors. Therefore, some results of empirical studies will be discussed with regard to the question of how much it matters whether the concept of the source domain in a synaesthetic metaphor is a scalar or a quality concept. Furthermore, typed frames are introduced, and it is explained how the notion of a minimal upper attribute can be used in the analysis of adjective-noun compounds. Finally, frames are used to analyze synaesthetic metaphors; it turns out that they offer an adequate basis for the explanation of different accessibility rates found in empirical studies.

  2. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems ... in some detail. Finally we address the problem of where to put the dot and the lines: when all information is ‘on the table’, how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose ... was to develop a software tool for maintenance supervision of components in a nuclear power plant...

  3. Analysis of equivalent antenna based on FDTD method

    Institute of Scientific and Technical Information of China (English)

    Yun-xing YANG; Hui-chang ZHAO; Cui DI

    2014-01-01

    An equivalent microstrip antenna used in radio proximity fuses is presented. The design of this antenna is based on a multilayer, multi-permittivity dielectric substrate which is analyzed by the finite difference time domain (FDTD) method. The equivalent iterative formula is modified for the cylindrical coordinate system. A mixed substrate containing two kinds of media (one of them air) takes the place of the original single substrate. Simulation of the equivalent antenna shows that its resonant frequency is similar to that of the original antenna, and the analysis is confirmed by means of the antenna resonant frequency formula. The two antennas have the same radiation pattern and similar gain. This method can be used to reduce the weight of the antenna, which is significant for the design of missile-borne antennas.

  4. Choosing a Commercial Broiler Strain Based on Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Hosseini SA

    2014-05-01

    Full Text Available Given the complexity and amount of information in the wide variety of comparative performance reports in poultry production, making a decision is difficult. This problem is overcome only when all data can be put into a common unit. For this purpose, five decision-making analysis approaches, namely Maximin, Equally likely, Weighted average, Ordered weighted averages, and the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), were used to choose the best of three broiler strains based on their comparative performance and carcass characteristics. Six thousand commercial broilers of strains designated R, A, and C (2000 per strain) were randomly allocated into three treatments of five replicates. All methods showed similar results except the Maximin approach. Comparing the different methods indicated that strain C, which has the highest world market share, performs best, followed by strains R and A.
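
    Of the five approaches listed, TOPSIS is the most involved: alternatives are ranked by their relative closeness to an ideal solution. The sketch below walks through the standard steps (vector normalisation, weighting, ideal and anti-ideal points, closeness coefficient) on a made-up performance matrix; the criteria, their benefit/cost directions, the numbers and the weights are illustrative assumptions, not the trial data.

        import numpy as np

        # TOPSIS on a made-up performance matrix for three strains. Criteria and
        # weights are assumptions; larger closeness means a better strain.

        strains = ["R", "A", "C"]
        # Columns: weight gain (benefit), feed conversion ratio (cost),
        #          mortality % (cost), carcass yield % (benefit)
        X = np.array([
            [2650.0, 1.78, 4.2, 71.5],
            [2590.0, 1.82, 4.8, 70.9],
            [2710.0, 1.75, 3.9, 72.3],
        ])
        weights = np.array([0.35, 0.30, 0.15, 0.20])
        benefit = np.array([True, False, False, True])

        # 1) Vector-normalise and weight the decision matrix.
        V = weights * X / np.linalg.norm(X, axis=0)

        # 2) Ideal and anti-ideal solutions per criterion.
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

        # 3) Closeness coefficient: distance to anti-ideal over total distance.
        d_pos = np.linalg.norm(V - ideal, axis=1)
        d_neg = np.linalg.norm(V - anti, axis=1)
        closeness = d_neg / (d_pos + d_neg)

        for name, c in sorted(zip(strains, closeness), key=lambda x: -x[1]):
            print(f"strain {name}: closeness {c:.3f}")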

  5. Identification and annotation of erotic film based on content analysis

    Science.gov (United States)

    Wang, Donghui; Zhu, Miaoliang; Yuan, Xin; Qian, Hui

    2005-02-01

    The paper brings forward a new method for identifying and annotating erotic films based on content analysis. First, the film is decomposed into video and audio streams. Then, the video stream is segmented into shots and key frames are extracted from each shot. Shots with potentially erotic content are filtered by finding the nude human body in the key frames: a Gaussian model in YCbCr color space for detecting skin regions is presented, and an external polygon covering the skin regions is used to approximate the human body. Finally, the degree of nudity is given by calculating the ratio of skin area to whole body area with weighted parameters. The experimental results show the effectiveness of our method.

  6. First law-based thermodynamic analysis on Kalina cycle

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Based on the first law of thermodynamics, and adopting the Peng-Robinson equation (P-R equation) as the basic equation for the properties of ammonia-water mixtures, a thermodynamic analysis on a single-stage distillation Kalina cycle is presented. A program to calculate the thermodynamic properties of ammonia-water mixtures, and one for calculating the performance of Kalina cycles, were developed, with which the heat-work conversion particulars of Kalina cycles were theoretically calculated. The influences on the cycle performance of key parameters, such as the pressure and temperature at the inlet of the turbine, the back pressure of the turbine, the concentration of the working solution, the concentration of the basic solution and the cycle multiplication ratio, were analyzed.

  7. Evolving model-free scattering matrix via evolutionary algorithm: $^{16}$O-$^{16}$O elastic scattering at 350 MeV

    CERN Document Server

    Korda, V Y; Korda, L P

    2005-01-01

    We present a new procedure which enables one to extract a scattering matrix $S(l)$ as a complex function of angular momentum directly from the scattering data, without any a priori model assumptions. The key ingredient of the procedure is an evolutionary algorithm with diffused mutation which evolves the population of scattering matrices, via their smooth deformations, from primary arbitrary analytical $S(l)$ shapes to final ones giving high-quality fits to the data. Due to the automatic monitoring of the scattering matrix derivatives, the final $S(l)$ shapes are monotonic and do not have any distortions. For the $^{16}$O-$^{16}$O elastic scattering data at 350 MeV, we show that the final results are independent of the primary $S(l)$ shapes. Contrary to other approaches, our procedure provides an excellent fit by $S(l)$ shapes which support the ``rainbow'' interpretation of the data under analysis.

  8. A cost minimisation analysis in teledermatology: model-based approach

    Directory of Open Access Journals (Sweden)

    Eminović Nina

    2010-08-01

    Full Text Available Abstract Background Although store-and-forward teledermatology is becoming increasingly popular, evidence on its effects on efficiency and costs is lacking. The aim of this study, performed in addition to a clustered randomised trial, was to investigate to what extent and under which conditions store-and-forward teledermatology can reduce costs from a societal perspective. Methods A cost minimisation study design (a model-based approach) was applied to compare teledermatology and conventional process costs per dermatology patient care episode. For the societal perspective, total mean costs of investment, general practitioner, dermatologists, out-of-pocket expenses and employer costs were calculated. Uncertainty analysis was performed using Monte Carlo simulation with 31 distributions in the cost model. Scenario analysis was performed using one-way and two-way sensitivity analyses with the following variables: the patient travel distance to physician and dermatologist, the duration of teleconsultation activities, and the proportion of preventable consultations. Results Total mean costs of the teledermatology process were €387 (95% CI, 281 to 502.5), while the total mean costs of the conventional process were €354.0 (95% CI, 228.0 to 484.0). The total mean difference between the processes was €32.5 (95% CI, -29.0 to 74.7). Savings by teledermatology can be achieved if the distance to a dermatologist is larger (>=75 km) or when more consultations (>=37%) can be prevented due to teledermatology. Conclusions Teledermatology, when applied to all dermatology referrals, has a probability of 0.11 of being cost saving to society. In order to achieve cost savings, teledermatology should be applied only in those cases with a reasonable probability that a live consultation can be prevented. Trial Registration This study is performed partially based on the PERFECT D Trial (Current Controlled Trials No. ISRCTN57478950).

  9. POSSIBILITY AND EVIDENCE-BASED RELIABILITY ANALYSIS AND DESIGN OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Hong-Zhong Huang

    2013-01-01

    Full Text Available Engineering design under uncertainty has gained considerable attention in recent years. A great multitude of new design optimization methodologies and reliability analysis approaches are put forth with the aim of accommodating various uncertainties. Uncertainties in practical engineering applications are commonly classified into two categories, i.e., aleatory uncertainty and epistemic uncertainty. Aleatory uncertainty arises because of unpredictable variation in the performance and processes of systems; it is irreducible even when more data or knowledge are added. On the other hand, epistemic uncertainty stems from lack of knowledge of the system due to limited data, measurement limitations, or simplified approximations in modeling system behavior, and it can be reduced by obtaining more data or knowledge. More specifically, aleatory uncertainty is naturally represented by a statistical distribution and its associated parameters can be characterized by sufficient data. If, however, the data is limited and cannot be quantified in a statistical sense, epistemic uncertainty can be considered as an alternative tool in such a situation. Of the several optional treatments for epistemic uncertainty, possibility theory and evidence theory have proved to be the most computationally efficient and stable for reliability analysis and engineering design optimization. This study first attempts to provide a better understanding of uncertainty in engineering design by giving a comprehensive overview of its classifications, theories and design considerations. Then a review is conducted of general topics such as the foundations and applications of possibility theory and evidence theory. This overview includes the most recent results from theoretical research, computational developments and performance improvement of possibility theory and evidence theory with an emphasis on revealing the capability and characteristics of quantifying uncertainty from different perspectives

  10. CUSUM control charts based on likelihood ratio for preliminary analysis

    Institute of Scientific and Technical Information of China (English)

    Yi DAI; Zhao-jun WANG; Chang-liang ZOU

    2007-01-01

    To detect and estimate a shift in the mean, the deviation, or both for preliminary analysis, the control chart based on the likelihood ratio test (LRT) is the most popular statistical process control (SPC) tool. Sullivan and Woodall pointed out that the test statistic lrt(n1, n2) is approximately distributed as χ²(2) when the sample sizes n, n1 and n2 are very large, with n1 = 2, 3, ..., n − 2 and n2 = n − n1, so it is inevitable that n1 or n2 is not large. In this paper the limit distribution of lrt(n1, n2) for fixed n1 or n2 is derived, and exact analytic formulae for evaluating the expectation and the variance of the limit distribution are also obtained. In addition, the properties of the standardized likelihood ratio statistic slr(n1, n) are discussed. Although slr(n1, n) contains the most important information, slr(i, n) (i ≠ n1) also contains a great deal of information, and the cumulative sum (CUSUM) control chart can exploit this additional information. We therefore propose two CUSUM control charts based on the likelihood ratio statistics for the preliminary analysis of individual observations. One focuses on detecting shifts in location in the historical data, while the other is more general, detecting a shift in the location, the scale, or both. Moreover, simulation results show that the two proposed control charts are superior to their competitors not only in detecting sustained shifts but also in detecting other out-of-control situations considered in this paper.
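
    The paper's own charts are built on the likelihood ratio statistics lrt(n1, n2) and slr(n1, n); as a simpler, generic illustration of the CUSUM idea they build on, the sketch below implements a standard two-sided tabular CUSUM for a mean shift in standardized data. The reference value k and decision interval h are conventional placeholder choices, not values from the paper.

```python
import numpy as np

def cusum_mean(x, target, k=0.5, h=5.0):
    """Two-sided tabular CUSUM for a shift in the mean of standardized data.
    k: reference value (half the shift to detect, in sigma units),
    h: decision interval. Returns indices flagged as out of control."""
    s_hi = s_lo = 0.0
    alarms = []
    for i, xi in enumerate(x):
        z = xi - target
        s_hi = max(0.0, s_hi + z - k)   # accumulates evidence of an upward shift
        s_lo = max(0.0, s_lo - z - k)   # accumulates evidence of a downward shift
        if s_hi > h or s_lo > h:
            alarms.append(i)
    return alarms

rng = np.random.default_rng(0)
# Synthetic individual observations with a sustained upward shift at i = 50.
data = np.concatenate([rng.normal(0, 1, 50), rng.normal(1.5, 1, 30)])
print(cusum_mean(data, target=0.0))
```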

  12. Performance Optimization based Spectrum Analysis on OFRA and EDFA Devices

    Directory of Open Access Journals (Sweden)

    Liu Liying

    2013-07-01

    Full Text Available As key devices, the erbium-doped fiber amplifier (EDFA) and the optical Raman fiber amplifier (OFRA) have been widely applied in the fields of optical communication, sensing and measurement. Performance optimization, however, remains one of the hot topics in the study of optical fiber amplifiers, because their output characteristics are highly dependent on the key design parameters. In this paper, to cope with this problem, we adopt a novel spectrum-based analysis to study the output performance of EDFA and OFRA systems. By simulating the operation of the two amplifying systems, their output characteristics are first demonstrated for various parameter settings. According to the numerical results obtained, the key design parameters of the EDFA and OFRA systems are determined, and the performance of the amplifying systems is clearly improved and optimized in terms of output power, signal-to-noise ratio, and gain flatness. Keywords: Fiber Raman Amplifier, Erbium Doped Fiber Amplifier, Performance optimization, Spectrum analysis, Simulation.
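
    As a small illustration of the figures of merit mentioned above, the sketch below computes gain flatness and a rule-of-thumb OSNR estimate from a synthetic amplifier gain spectrum; the spectrum, noise figure and input power are placeholders, not outputs of the EDFA/OFRA models used in the paper.

```python
import numpy as np

# Hypothetical simulated gain spectrum of an amplifier over the C-band;
# all values are placeholders, not results from the cited EDFA/OFRA models.
wavelengths_nm = np.linspace(1530, 1565, 36)
gain_db = 20 + 1.5 * np.sin((wavelengths_nm - 1530) / 35 * np.pi)  # gain ripple
noise_figure_db = 5.0
p_in_dbm = -20.0

gain_flatness_db = gain_db.max() - gain_db.min()   # common flatness metric: max - min gain
p_out_dbm = p_in_dbm + gain_db                     # per-channel output power
# Simplified per-channel OSNR rule of thumb (0.1 nm reference bandwidth, ~1550 nm):
osnr_db = p_in_dbm - noise_figure_db + 58.0

print(f"gain flatness: {gain_flatness_db:.2f} dB")
print(f"peak output:   {p_out_dbm.max():.1f} dBm")
print(f"approx. OSNR:  {osnr_db:.1f} dB")
```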

  13. GIS-BASED SURFACE ANALYSIS OF ARCHAEOLOGICAL FINDS

    Directory of Open Access Journals (Sweden)

    K. Kovács

    2012-09-01

    Full Text Available The international research project HiMAT (History of Mining Activities in the Tyrol and adjacent areas) is dedicated to the study of mining history in the Eastern Alps by various scientific disciplines. The aim of this program is the analysis of the impacts of mining activities on the environment and on human societies. Unfortunately, only a limited number of specific regions (e.g. Mitterberg) offer possibilities to investigate the former mining expansions. Within this multidisciplinary project, the archaeological sites and finds are analyzed by the Surveying and Geoinformation Unit at the University of Innsbruck. This paper shows the data fusion of different surveying and post-processing methods to achieve a photo-realistic digital 3D model of one of the most important finds, the Bronze Age sluice box from the Mitterberg. The applied workflow consists of four steps: 1. point cloud processing; 2. meshing of the point clouds and editing of the models; 3. image orientation, bundle and image adjustment; 4. model texturing. In addition, a short-range laser scanning survey was organized before the conservation process of this wooden find. This detailed documentation of the sluice box opened up more accurate research opportunities; for example, the reconstruction of the broken parts and the surface analysis of this archaeological object were carried out using these high-resolution datasets. In conclusion, various previously unperceived patterns on the wooden boards were visualized by the GIS-based tool mark investigation.
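
    Surface analysis of such high-resolution datasets is often done with standard GIS raster operations; as one hedged example, the Python sketch below computes a hillshade of a synthetic elevation grid, a classic way to make shallow features such as tool marks visible. The grid and illumination parameters are invented for illustration and do not reproduce the project's actual workflow.

```python
import numpy as np

def hillshade(z, cellsize=1.0, azimuth_deg=315.0, altitude_deg=45.0):
    """GIS-style hillshade of an elevation grid z; highlights shallow surface
    features such as tool marks on a high-resolution surface export."""
    az = np.radians(360.0 - azimuth_deg + 90.0)
    alt = np.radians(altitude_deg)
    dzdy, dzdx = np.gradient(z, cellsize)
    slope = np.arctan(np.hypot(dzdx, dzdy))
    aspect = np.arctan2(-dzdy, dzdx)
    shaded = (np.sin(alt) * np.cos(slope)
              + np.cos(alt) * np.sin(slope) * np.cos(az - aspect))
    return np.clip(shaded, 0, 1)

# Synthetic surface with a shallow groove standing in for a tool mark.
y, x = np.mgrid[0:200, 0:200]
z = 0.02 * np.sin(x / 15.0) - 0.5 * np.exp(-((y - 100) ** 2) / 20.0)
print(hillshade(z).shape)
```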

  14. Phylogenetic relationships of Malassezia species based on multilocus sequence analysis.

    Science.gov (United States)

    Castellá, Gemma; Coutinho, Selene Dall' Acqua; Cabañes, F Javier

    2014-01-01

    Members of the genus Malassezia are lipophilic basidiomycetous yeasts, which are part of the normal cutaneous microbiota of humans and other warm-blooded animals. Currently, this genus consists of 14 species that have been characterized by phenetic and molecular methods. Although several molecular methods have been used to identify and/or differentiate Malassezia species, sequencing of the rRNA genes and the chitin synthase-2 gene (CHS2) is the most widely employed. There is little information about the β-tubulin gene in the genus Malassezia, a gene that has been used for the analysis of complex species groups. The aim of the present study was to sequence a fragment of the β-tubulin gene of Malassezia species and analyze their phylogenetic relationships using a multilocus sequence approach based on two rRNA genes (ITS, including 5.8S rRNA, and the D1/D2 region of 26S rRNA) together with two protein-encoding genes (CHS2 and β-tubulin). The phylogenetic study of the partial β-tubulin gene sequences indicated that this molecular marker can be used to assess diversity and identify new species. The multilocus sequence analysis of the four loci provides robust support to delineate species at the terminal nodes and could help to estimate divergence times for the origin and diversification of Malassezia species.
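
    A minimal sketch of the multilocus idea, assuming made-up sequence fragments for three taxa: the four loci are concatenated and uncorrected p-distances are computed between every pair of taxa. Real analyses would use aligned full-length sequences and a proper phylogenetic reconstruction method.

```python
from itertools import combinations

# Hypothetical short fragments of the four loci (ITS, D1/D2, CHS2, beta-tubulin)
# for three taxa; sequences are invented purely to illustrate concatenation.
loci = {
    "M_furfur":        ["ACGTAC", "TTGACA", "GGCATC", "ATCGGA"],
    "M_pachydermatis": ["ACGTTC", "TTGACA", "GGCATT", "ATCGGT"],
    "M_globosa":       ["ACATAC", "TAGACA", "GACATC", "TTCGGA"],
}

def p_distance(a, b):
    """Proportion of differing sites between two aligned sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

# Concatenate the loci per taxon, then compare all pairs.
concat = {taxon: "".join(frags) for taxon, frags in loci.items()}
for t1, t2 in combinations(concat, 2):
    print(f"{t1} vs {t2}: p = {p_distance(concat[t1], concat[t2]):.3f}")
```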

  15. Dynamic mechanical analysis of double base rocket propellants

    Directory of Open Access Journals (Sweden)

    Marcin Cegła

    2016-03-01

    Full Text Available The article presents dynamic mechanical analysis (DMA) for solid rocket propellant testing. Principles of operation and the measured quantities are briefly described. The authors refer to previous research on PTFE material and to literature data providing information about proper experimental conditions and the influence of measurement frequency, load amplitude, and heating rate on the results of DMA tests. The experimental results of solid double-base rocket propellant testing obtained on the Netzsch DMA 242 device are presented. Mechanical properties such as the dynamic storage modulus E′, the dynamic loss modulus E″ and tan(δ) were measured within the temperature range from –120°C to +90°C at a heating rate of 1 K/min. The test sample was subjected to a dual-cantilever multi-frequency test. Special attention was paid to determination of the glass transition temperature of the tested propellant with reference to NATO standardization agreement (STANAG) 4540, as well as to the influence of the measurement frequency on the glass transition. Keywords: Dynamic mechanical analysis, solid rocket propellants, glass transition temperature
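
    A common way to read the glass transition temperature off a DMA sweep is to take the temperature of the tan(δ) peak. The sketch below does exactly that on synthetic E' and E'' curves; the curve shapes and the resulting Tg are placeholders, not measured propellant data.

```python
import numpy as np

# Hypothetical DMA sweep: storage modulus E' and loss modulus E'' vs temperature.
# The curves are synthetic placeholders, not measured double-base propellant data.
T = np.linspace(-120, 90, 211)                                 # deg C, 1 K steps
E_storage = 4000 - 3500 / (1 + np.exp(-(T + 60) / 6))          # MPa, glassy -> rubbery drop
E_loss = 250 * np.exp(-((T + 60) ** 2) / (2 * 10 ** 2)) + 20   # MPa, loss peak near -60 C

tan_delta = E_loss / E_storage
Tg = T[np.argmax(tan_delta)]        # Tg taken as the temperature of the tan(delta) peak
print(f"glass transition (tan delta peak): {Tg:.1f} deg C")
```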

  16. Web-based analysis of the mouse transcriptome using Genevestigator

    Directory of Open Access Journals (Sweden)

    Gruissem Wilhelm

    2006-06-01

    Full Text Available Abstract Background Gene function analysis often requires a complex and laborious sequence of laboratory and computer-based experiments. Choosing an effective experimental design generally results from hypotheses derived from prior knowledge or experimentation. Knowledge obtained from meta-analyzing compendia of expression data with annotation libraries can provide significant clues in understanding gene and network function, resulting in better hypotheses that can be tested in the laboratory. Description Genevestigator is a microarray database and analysis system allowing context-driven queries. Simple but powerful tools allow biologists with little computational background to retrieve information about when, where and how genes are expressed. We manually curated and quality-controlled 3110 mouse Affymetrix arrays from public repositories. Data queries can be run against an annotation library comprising 160 anatomy categories, 12 developmental stage groups, 80 stimuli, and 182 genetic backgrounds or modifications. The quality of results obtained through Genevestigator is illustrated by a number of biological scenarios that are substantiated by other types of experimentation in the literature. Conclusion The Genevestigator-Mouse database effectively provides biologically meaningful results and can be accessed at https://www.genevestigator.ethz.ch.

  17. Aroma characterization based on aromatic series analysis in table grapes.

    Science.gov (United States)

    Wu, Yusen; Duan, Shuyan; Zhao, Liping; Gao, Zhen; Luo, Meng; Song, Shiren; Xu, Wenping; Zhang, Caixi; Ma, Chao; Wang, Shiping

    2016-08-04

    Aroma is an important part of quality in table grapes, but the key aroma compounds and the aroma series of table grapes remain unknown. In this paper, we identified 67 aroma compounds in 20 table grape cultivars; of these, 20 in pulp and 23 in skin were aroma-active compounds. C6 compounds were the basic background volatiles, but the aroma contents of pulp juice and skin depended mainly on the levels of esters and terpenes, respectively. Most notably, the 'Kyoho' grapevine series showed high contents of esters in pulp, while Muscat/floral cultivars showed abundant monoterpenes in skin. In terms of aroma series, table grapes were characterized mainly by the herbaceous, floral, balsamic, sweet and fruity series. Simple and visualizable aroma profiles were established using aroma fingerprints based on the aromatic series. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) showed that the aroma profiles of pulp juice, skin and whole berries could be classified into 5, 3, and 5 groups, respectively. Combined with sensory evaluation, we conclude that the fatty and balsamic series were the preferred aromatic series, and that the contents of their contributors (β-ionone and octanal) may be useful as indicators for the improvement of breeding and cultivation measures for table grapes.
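
    As a hedged illustration of the HCA/PCA step, the sketch below projects a synthetic cultivar-by-aromatic-series matrix onto two principal components and cuts a Ward-linkage dendrogram into five groups; the matrix is random placeholder data, not the measured aroma profiles.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
# Hypothetical aromatic-series matrix: 20 cultivars x 5 series
# (herbaceous, floral, balsamic, sweet, fruity); values are synthetic.
profiles = rng.gamma(shape=2.0, scale=1.0, size=(20, 5))

scores = PCA(n_components=2).fit_transform(profiles)   # PCA projection of the profiles
groups = fcluster(linkage(profiles, method="ward"),     # hierarchical clustering (HCA)
                  t=5, criterion="maxclust")

print(scores[:3])   # first three cultivars in PC space
print(groups)       # cluster label per cultivar
```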

  18. Entropy-based model for miRNA isoform analysis.

    Directory of Open Access Journals (Sweden)

    Shengqin Wang

    Full Text Available MiRNAs have been widely studied due to their important post-transcriptional regulatory roles in gene expression. Many reports have demonstrated evidence of miRNA isoform products (isomiRs) in high-throughput small RNA sequencing data. However, the biological functions of these molecules are still not well investigated. Here, we developed a Shannon entropy-based model to estimate isomiR expression profiles from high-throughput small RNA sequencing data extracted from the miRBase web server. Using the Kolmogorov-Smirnov statistical test (KS test), we demonstrated that 5p and 3p miRNAs present more variants than single-arm miRNAs. We also found that isomiR variation, except for the 3' isomiR variant, is strongly correlated with the minimum free energy (MFE) of the pre-miRNA, suggesting that this intrinsic feature of the pre-miRNA is one of the important factors in miRNA regulation. Functional enrichment analysis showed that miRNAs with high variation, particularly variation at the 5' end, are enriched in a set of critical functions, supporting the view that these molecules are not randomly produced. Our results provide a probabilistic framework for miRNA isoform analysis and give functional insights into pre-miRNA processing.
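
    The core quantity of such a model is the Shannon entropy of an isomiR read-count profile; the sketch below computes it for two hypothetical loci (read counts are invented), showing how a locus dominated by one canonical form scores lower than a locus producing many comparably expressed isoforms.

```python
import numpy as np

def isomir_entropy(read_counts):
    """Shannon entropy (bits) of an isomiR read-count profile; higher values
    indicate a miRNA locus producing a more heterogeneous set of isoforms."""
    counts = np.asarray(read_counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop zero-count isoforms before log
    return float(-(p * np.log2(p)).sum())

# Hypothetical read counts across isomiRs of two loci (illustrative only).
print(isomir_entropy([950, 20, 15, 10, 5]))       # dominated by one canonical form
print(isomir_entropy([300, 250, 200, 150, 100]))  # many comparably expressed isoforms
```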

  19. [Environmental impacts of sewage treatment system based on emergy analysis].

    Science.gov (United States)

    Li, Min; Zhang, Xiao-Hong; Li, Yuan-Wei; Zhang, Hong; Zhao, Min; Deng, Shi-Huai

    2013-02-01

    "Integrated sewage treatment system" (ISTS) consists of sewage treatment plant system and their products (treated water and dewatered sludge) disposal facilities, which gives a holistic view of the whole sewage treatment process. During its construction and operation, ISTS has two main impacts on the environment, i.e., the consumption of resources and the damage of discharged pollutants on the environment, while the latter was usually ignored by the previous researchers when they assessed the impacts of wastewater treatment system. In order to more comprehensively understanding the impacts of sewage treatment on the environment, an analysis was made on the ISTS based on the theories of emergy analysis, and, in combining with ecological footprint theory, the sustainability of the ISTS was also analyzed. The results showed that the emergy of the impacts of water pollutants on the environment was far larger than that of the impacts of air pollutants, and NH3-N was the main responsible cause. The emergy consumption of ISTS mainly came from the emergy of wastewater and of local renewable resources. The "sewage treatment plant system + landfill system" had the highest emergy utilization efficiency, while the "sewage treatment plant system + reclaimed water reuse system + incineration system" had the lowest one. From the aspect of environmental sustainability, the "sewage treatment plant system + reclaimed water reuse system + landfill system" was the best ISTS, while the "sewage treatment plant system + incineration system" was the worst one.

  20. Identifying glioblastoma gene networks based on hypergeometric test analysis.

    Directory of Open Access Journals (Sweden)

    Vasileios Stathias

    Full Text Available Patient-specific therapy is emerging as an important possibility for many cancer patients. However, to identify such therapies it is essential to determine the genomic and transcriptional alterations present in one tumor relative to control samples. This presents a challenge, since the use of a single sample precludes many standard statistical analysis techniques. We reasoned that one means of addressing this issue is to compare transcriptional changes in one tumor with those observed in a large cohort of patients analyzed by The Cancer Genome Atlas (TCGA). To test this directly, we devised a bioinformatics pipeline to identify differentially expressed genes in tumors resected from patients suffering from the most common malignant adult brain tumor, glioblastoma (GBM). We performed RNA sequencing on tumors from individual GBM patients and filtered the results through the TCGA database in order to identify possible gene networks that are overrepresented in GBM samples relative to controls. Importantly, we demonstrate that hypergeometric-based analysis of gene pairs identifies gene networks that validate experimentally. These studies identify a putative workflow for uncovering differentially expressed patient-specific genes and gene networks for GBM and other cancers.
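
    A typical building block of such a pipeline is a hypergeometric over-representation test; the sketch below shows the calculation for one hypothetical pathway, with gene counts chosen only for illustration (they are not values from the study).

```python
from scipy.stats import hypergeom

# Hypergeometric over-representation test: given N background genes, K of which
# belong to a pathway, how likely is it to see at least k pathway genes among
# the n genes differentially expressed in a single tumour? Counts are hypothetical.
N = 20000   # background genes
K = 150     # pathway members
n = 800     # differentially expressed genes in the single GBM sample
k = 18      # observed overlap

p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
print(f"enrichment p-value: {p_value:.3g}")
```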