WorldWideScience

Sample records for based model-free analysis

  1. Self-consistent residual dipolar coupling based model-free analysis for the robust determination of nanosecond to microsecond protein dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Lakomek, Nils-Alexander; Walter, Korvin F. A.; Fares, Christophe [Max-Planck Institute for Biophysical Chemistry, Department for NMR-based Structural Biology (Germany); Lange, Oliver F.; Groot, Bert L. de; Grubmueller, Helmut [Max-Planck Institute for Biophysical Chemistry, Department for Theoretical and Computational Biophysics (Germany); Brueschweiler, Rafael [Florida State University, NHFML (United States); Munk, Axel [University of Goettingen, Institut for Mathematical Stochastics (Germany); Becker, Stefan [Max-Planck Institute for Biophysical Chemistry, Department for NMR-based Structural Biology (Germany); Meiler, Jens [Vanderbilt University, Department of Chemistry, Center of Structural Biology (United States); Griesinger, Christian [Max-Planck Institute for Biophysical Chemistry, Department for NMR-based Structural Biology (Germany)], E-mail: cigr@nmr.mpibpc.mpg.de

    2008-07-15

    Residual dipolar couplings (RDCs) provide information about the dynamic average orientation of inter-nuclear vectors and amplitudes of motion up to milliseconds. They complement relaxation methods, especially on a time-scale window that we have called supra-τc (τc < supra-τc < 50 μs). Here we present a robust approach called Self-Consistent RDC-based Model-free analysis (SCRM) that delivers RDC-based order parameters, independent of the details of the structure used for alignment tensor calculation, as well as the dynamic average orientation of the inter-nuclear vectors in the protein structure in a self-consistent manner. For ubiquitin, the SCRM analysis yields an average RDC-derived order parameter of the NH vectors ⟨S²rdc(NH)⟩ = 0.72 ± 0.02, compared to ⟨S²LS⟩ = 0.778 ± 0.003 for the Lipari-Szabo order parameters, indicating that the inclusion of the supra-τc window increases the averaged amplitude of mobility observed in the sub-τc window by about 34%. For the β-strand spanned by residues Lys48 to Leu50, an alternating pattern of backbone NH RDC order parameters S²rdc(NH) = (0.59, 0.72, 0.59) was extracted. The backbone of Lys48, whose side chain is known to be involved in the poly-ubiquitylation process that leads to protein degradation, is very mobile on the supra-τc time scale (S²rdc(NH) = 0.59 ± 0.03), while it is inconspicuous (S²LS(NH) = 0.82) on the sub-τc as well as on the μs-ms relaxation dispersion time scales. The results of this work differ from previous RDC dynamics studies of ubiquitin in the sense that the results are essentially independent of structural noise, providing a much more robust assessment of dynamic effects that underlie the RDC data.
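
    The quantity reported above is the generalized RDC-based order parameter used in model-free RDC analyses. For context, a commonly used definition (a general form, not the SCRM-specific implementation) expresses it through the ensemble-averaged second-rank spherical harmonics of the inter-nuclear vector orientation (θ, φ):

```latex
% Generic model-free RDC order parameter (standard form; shown for context,
% not reproduced from the SCRM paper). The averages run over the dynamic
% ensemble sampled up to the supra-tau_c window.
S^{2}_{\mathrm{rdc}} \;=\; \frac{4\pi}{5}\,\sum_{M=-2}^{2}
\bigl\langle Y_{2M}(\theta,\varphi)\bigr\rangle\,
\bigl\langle Y_{2M}^{*}(\theta,\varphi)\bigr\rangle
```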

  2. Model-free execution monitoring in behavior-based robotics.

    Science.gov (United States)

    Pettersson, Ola; Karlsson, Lars; Saffiotti, Alessandro

    2007-08-01

    In the near future, autonomous mobile robots are expected to help humans by performing service tasks in many different areas, including personal assistance, transportation, cleaning, mining, and agriculture. In order to manage these tasks in a changing and partially unpredictable environment without the aid of humans, the robot must have the ability to plan its actions and to execute them robustly and safely. The robot must also have the ability to detect when the execution does not proceed as planned and to correctly identify the causes of the failure. An execution monitoring system allows the robot to detect and classify these failures. Most current approaches to execution monitoring in robotics are based on the idea of predicting the outcomes of the robot's actions by using some sort of predictive model and comparing the predicted outcomes with the observed ones. By contrast, this paper explores the use of model-free approaches to execution monitoring, that is, approaches that do not use predictive models. In this paper, we show that pattern recognition techniques can be applied to realize model-free execution monitoring by classifying observed behavioral patterns as normal or faulty execution. We investigate the use of several such techniques and verify their utility in a number of experiments involving the navigation of a mobile robot in indoor environments.
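
    To illustrate the pattern-recognition idea (not the authors' exact implementation), a minimal sketch: windows of logged behavioral/sensor features are labeled normal or faulty and a standard classifier is trained to flag failures online. The feature layout, data, and classifier choice below are assumptions.

```python
# Minimal sketch of model-free execution monitoring as pattern classification.
# Hypothetical data layout: each row is a fixed-length window of behavioral
# features (e.g., velocity commands, odometry residuals, range statistics).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Placeholder training data: 200 windows x 12 features, labels 0=normal, 1=faulty.
X_train = rng.normal(size=(200, 12))
y_train = rng.integers(0, 2, size=200)

monitor = RandomForestClassifier(n_estimators=100, random_state=0)
monitor.fit(X_train, y_train)

def check_execution(feature_window: np.ndarray) -> bool:
    """Return True if the current behavior window looks faulty."""
    return bool(monitor.predict(feature_window.reshape(1, -1))[0] == 1)

# During execution, the robot would call check_execution() on each new window
# and trigger recovery behaviors when a fault is flagged.
print(check_execution(rng.normal(size=12)))
```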

  3. Evaluation-Function-based Model-free Adaptive Fuzzy Control

    Directory of Open Access Journals (Sweden)

    Agus Naba

    2016-12-01

    Designs of adaptive fuzzy controllers (AFC) are commonly based on the Lyapunov approach, which requires a known model of the controlled plant and considers a candidate Lyapunov function as the evaluation function to be minimized. In this study these drawbacks were handled by designing a model-free adaptive fuzzy controller (MFAFC) using an approximate evaluation function defined in terms of the current state, the next state, and the control action. The MFAFC treats the approximate evaluation function as an evaluative measure of control performance, similar to the state-action value function in reinforcement learning. The simulation results of applying the MFAFC to the inverted pendulum benchmark verified the proposed scheme’s efficacy.

  4. Benchmarking model-free and model-based optimal control

    NARCIS (Netherlands)

    Koryakovskiy, I.; Kudruss, M.; Babuska, R.; Caarls, W.; Kirches, Christian; Mombaur, Katja; Schlöder, Johannes P.; Vallery, H.

    2017-01-01

    Model-free reinforcement learning and nonlinear model predictive control are two different approaches for controlling a dynamic system in an optimal way according to a prescribed cost function. Reinforcement learning acquires a control policy through exploratory interaction with the system, while

  5. A novel model-free data analysis technique based on clustering in a mutual information space: application to resting-state fMRI

    Directory of Open Access Journals (Sweden)

    Simon Benjaminsson

    2010-08-01

    Non-parametric data-driven analysis techniques can be used to study datasets with few assumptions about the data and the underlying experiment. Variations of Independent Component Analysis (ICA) have been the methods most used on fMRI data, e.g. in finding resting-state networks thought to reflect the connectivity of the brain. Here we present a novel data analysis technique and demonstrate it on resting-state fMRI data. It is a generic method with few underlying assumptions about the data. The results are built from the statistical relations between all input voxels, resulting in a whole-brain analysis at the voxel level. It has good scalability properties and the parallel implementation is capable of handling large datasets and databases. From the mutual information between the activities of the voxels over time, a distance matrix is created for all voxels in the input space. Multidimensional scaling is used to place the voxels in a lower-dimensional space that reflects the dependency relations encoded in the distance matrix. By performing clustering in this space we can find the strong statistical regularities in the data, which for the resting-state data turn out to be the resting-state networks. The decomposition is performed in the last step of the algorithm and is computationally simple. This allows rapid analysis and visualization of the data at different spatial levels, as well as automatic selection of a suitable number of decomposition components.
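
    A compressed sketch of the pipeline described above (mutual information between voxel time courses, a derived distance matrix, multidimensional scaling, then clustering). The binning scheme, distance definition and cluster count below are illustrative choices, not those of the paper.

```python
# Sketch: mutual-information-based clustering of voxel time series (illustrative only).
import numpy as np
from sklearn.metrics import mutual_info_score
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_voxels, n_timepoints = 50, 120
data = rng.normal(size=(n_voxels, n_timepoints))   # placeholder fMRI time courses

# Discretize each time course so mutual information can be estimated by binning.
binned = np.array([np.digitize(ts, np.histogram_bin_edges(ts, bins=8)) for ts in data])

# Pairwise mutual information, converted to a simple (illustrative) distance.
mi = np.zeros((n_voxels, n_voxels))
for i in range(n_voxels):
    for j in range(n_voxels):
        mi[i, j] = mutual_info_score(binned[i], binned[j])
dist = 1.0 - mi / mi.max()
np.fill_diagonal(dist, 0.0)

# Embed in a low-dimensional space and cluster; clusters stand in for resting-state networks.
embedding = MDS(n_components=3, dissimilarity="precomputed", random_state=0).fit_transform(dist)
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(embedding)
print(labels[:10])
```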

  6. The effect of polymer matrices on the thermal hazard properties of RDX-based PBXs by using model-free and combined kinetic analysis.

    Science.gov (United States)

    Yan, Qi-Long; Zeman, Svatopluk; Sánchez Jiménez, P E; Zhao, Feng-Qi; Pérez-Maqueda, L A; Málek, Jiří

    2014-04-30

    In this paper, the decomposition reaction models and thermal hazard properties of 1,3,5-trinitro-1,3,5-triazinane (RDX) and its PBXs bonded by Formex P1, Semtex 1A, C4, Viton A and Fluorel polymer matrices have been investigated based on isoconversional and combined kinetic analysis methods. The established kinetic triplets are used to predict the constant-decomposition-rate temperature profiles, the critical radius for thermal explosion and the isothermal behavior at a temperature of 82°C. It has been found that the effect of the polymer matrices on the decomposition mechanism of RDX is significant, resulting in very different reaction models. Formex P1, Semtex and C4 could make the decomposition process of RDX follow a phase-boundary-controlled reaction mechanism, whereas Viton A and Fluorel shift its reaction model to a two-dimensional Avrami-Erofeev nucleation and growth model. According to isothermal simulations, the threshold cook-off time until loss of functionality at 82°C is less than 500 days for RDX-C4 and RDX-FM, while it is more than 700 days for the others. Unlike the simulated isothermal curves, when the charge properties and heat of decomposition are considered, RDX-FM and RDX-C4 are better than RDX-SE in storage safety at arbitrary surrounding temperatures.
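
    As a generic illustration of the isoconversional (model-free) idea, one widely used scheme is the Friedman method: at a fixed conversion α, ln(dα/dt) plotted against 1/T across heating rates has slope -Ea/R. The abstract does not specify the exact procedure used here, and the numbers below are synthetic placeholders, not the RDX/PBX data.

```python
# Minimal Friedman isoconversional sketch (synthetic data, not the paper's measurements).
import numpy as np

R = 8.314  # J/(mol K)

# Hypothetical values taken at the SAME conversion alpha from runs at different heating rates:
# temperatures (K) at which alpha is reached, and the corresponding rates dalpha/dt (1/s).
T_at_alpha = np.array([480.0, 492.0, 505.0, 516.0])
rate_at_alpha = np.array([1.2e-3, 2.4e-3, 4.9e-3, 9.5e-3])

# Friedman: ln(dalpha/dt) = ln(A f(alpha)) - Ea/(R T)  ->  slope of ln(rate) vs 1/T is -Ea/R.
slope, intercept = np.polyfit(1.0 / T_at_alpha, np.log(rate_at_alpha), 1)
Ea = -slope * R
print(f"Apparent activation energy at this conversion: {Ea/1000:.1f} kJ/mol")
```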

  7. Pitch control of wind turbines using model free adaptive control based on wind turbine code

    DEFF Research Database (Denmark)

    Zhang, Yunqian; Chen, Zhe; Cheng, Ming

    2011-01-01

    value is only based on I/O data of the wind turbine is identified and then the wind turbine system is replaced by a dynamic linear time-varying model. In order to verify the correctness and robustness of the proposed model free adaptive pitch controller, the wind turbine code FAST which can predict...
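
    The abstract above is truncated, but the approach it refers to is data-driven model-free adaptive control built on I/O data and an equivalent dynamic linear time-varying model. A generic compact-form MFAC update of the kind used in this literature is sketched below; the gains, the toy plant, and the exact form used in the cited pitch controller are assumptions.

```python
# Generic compact-form model-free adaptive control (MFAC) sketch on a toy SISO plant.
# This is the standard pseudo-partial-derivative scheme, not the cited pitch controller.
import numpy as np

eta, mu = 0.5, 1.0      # estimator gains (assumed)
rho, lam = 0.6, 1.0     # controller gains (assumed)

def plant(y, u):
    # Hypothetical nonlinear plant standing in for the wind-turbine I/O behavior.
    return 0.6 * y + 0.8 * np.tanh(u)

y, y_prev, u, u_prev, phi = 0.0, 0.0, 0.0, 0.0, 1.0
y_ref = 1.0
for k in range(100):
    # Update the pseudo partial derivative estimate from the latest I/O increments.
    du, dy = u - u_prev, y - y_prev
    if abs(du) > 1e-6:
        phi += eta * du / (mu + du**2) * (dy - phi * du)
    # Control update: drive the output toward the reference using only I/O data.
    u_prev, y_prev = u, y
    u = u + rho * phi / (lam + phi**2) * (y_ref - y)
    y = plant(y, u)
print(f"output after 100 steps: {y:.3f} (reference {y_ref})")
```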

  8. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task.

    Science.gov (United States)

    Skatova, Anya; Chan, Patricia A; Daw, Nathaniel D

    2013-01-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, vs. another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans' scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes.
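
    For readers unfamiliar with this paradigm, the balance referred to above is usually modeled as a weighted mixture of model-based and model-free action values. The sketch below is a simplified, generic hybrid learner for a two-stage task (a common analysis approach in this literature); the parameter values, transition structure, and reward probabilities are assumed for illustration, not taken from the paper.

```python
# Simplified hybrid model-based / model-free learner for a two-stage decision task.
# Generic illustration of the weighting parameter w, not the exact model fit in the paper.
import numpy as np

rng = np.random.default_rng(2)
alpha, w, beta = 0.3, 0.5, 3.0          # learning rate, MB weight, softmax temperature (assumed)
T = np.array([[0.7, 0.3], [0.3, 0.7]])  # P(second-stage state | first-stage action), assumed common/rare
q_mf = np.zeros(2)                       # model-free values of first-stage actions
q_stage2 = np.zeros(2)                   # learned values of the two second-stage states
p_reward = np.array([0.8, 0.2])          # latent reward probabilities (would drift in the real task)

def softmax(q):
    e = np.exp(beta * (q - q.max()))
    return e / e.sum()

for trial in range(500):
    q_mb = T @ q_stage2                          # model-based values via the transition model
    q_net = w * q_mb + (1 - w) * q_mf            # weighted mixture controls choice
    a = rng.choice(2, p=softmax(q_net))
    s2 = rng.choice(2, p=T[a])                   # second-stage state
    r = float(rng.random() < p_reward[s2])       # reward
    q_stage2[s2] += alpha * (r - q_stage2[s2])   # update second-stage value
    q_mf[a] += alpha * (r - q_mf[a])             # model-free update of the chosen first-stage action

print("MF values:", np.round(q_mf, 2), "MB values:", np.round(T @ q_stage2, 2))
```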

  9. Model-free stochastic processes studied with q-wavelet-based informational tools

    International Nuclear Information System (INIS)

    Perez, D.G.; Zunino, L.; Martin, M.T.; Garavaglia, M.; Plastino, A.; Rosso, O.A.

    2007-01-01

    We undertake a model-free investigation of stochastic processes employing q-wavelet-based quantifiers, which constitute a generalization of their Shannon counterparts. It is shown that (i) interesting physical information becomes accessible in such a way, (ii) for special q values the quantifiers are more sensitive than the Shannon ones, and (iii) there exists an implicit relationship between the Hurst parameter H and q within this wavelet framework

  10. Neural computations underlying arbitration between model-based and model-free learning

    Science.gov (United States)

    Lee, Sang Wan; Shimojo, Shinsuke; O’Doherty, John P.

    2014-01-01

    There is accumulating neural evidence to support the existence of two distinct systems for guiding action-selection in the brain, a deliberative “model-based” and a reflexive “model-free” system. However, little is known about how the brain determines which of these systems controls behavior at one moment in time. We provide evidence for an arbitration mechanism that allocates the degree of control over behavior by model-based and model-free systems as a function of the reliability of their respective predictions. We show that inferior lateral prefrontal and frontopolar cortex encode both reliability signals and the output of a comparison between those signals, implicating these regions in the arbitration process. Moreover, connectivity between these regions and model-free valuation areas is negatively modulated by the degree of model-based control in the arbitrator, suggesting that arbitration may work through modulation of the model-free valuation system when the arbitrator deems that the model-based system should drive behavior. PMID:24507199
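
    A schematic reading of the arbitration idea (purely illustrative, not the authors' fitted computational model): each system's reliability is tracked from its recent prediction errors, and a logistic comparison of the two reliabilities sets the weight given to model-based control. All functional forms and constants below are assumptions.

```python
# Illustrative reliability-based arbitration between model-based (MB) and model-free (MF) control.
import numpy as np

def reliability(recent_abs_errors, decay=0.9):
    """Higher when recent absolute prediction errors are small (exponentially weighted)."""
    errors = np.asarray(recent_abs_errors, dtype=float)
    weights = decay ** np.arange(len(errors))[::-1]
    return 1.0 - np.average(errors, weights=weights)

def mb_weight(rel_mb, rel_mf, slope=5.0):
    """Logistic comparison of the two reliability signals -> degree of MB control."""
    return 1.0 / (1.0 + np.exp(-slope * (rel_mb - rel_mf)))

rel_mb = reliability([0.1, 0.2, 0.1])   # MB system predicting well
rel_mf = reliability([0.6, 0.5, 0.7])   # MF system predicting poorly
print(f"degree of model-based control: {mb_weight(rel_mb, rel_mf):.2f}")
```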

  11. Variability in Dopamine Genes Dissociates Model-Based and Model-Free Reinforcement Learning.

    Science.gov (United States)

    Doll, Bradley B; Bath, Kevin G; Daw, Nathaniel D; Frank, Michael J

    2016-01-27

    Considerable evidence suggests that multiple learning systems can drive behavior. Choice can proceed reflexively from previous actions and their associated outcomes, as captured by "model-free" learning algorithms, or flexibly from prospective consideration of outcomes that might occur, as captured by "model-based" learning algorithms. However, differential contributions of dopamine to these systems are poorly understood. Dopamine is widely thought to support model-free learning by modulating plasticity in striatum. Model-based learning may also be affected by these striatal effects, or by other dopaminergic effects elsewhere, notably on prefrontal working memory function. Indeed, prominent demonstrations linking striatal dopamine to putatively model-free learning did not rule out model-based effects, whereas other studies have reported dopaminergic modulation of verifiably model-based learning, but without distinguishing a prefrontal versus striatal locus. To clarify the relationships between dopamine, neural systems, and learning strategies, we combine a genetic association approach in humans with two well-studied reinforcement learning tasks: one isolating model-based from model-free behavior and the other sensitive to key aspects of striatal plasticity. Prefrontal function was indexed by a polymorphism in the COMT gene, differences of which reflect dopamine levels in the prefrontal cortex. This polymorphism has been associated with differences in prefrontal activity and working memory. Striatal function was indexed by a gene coding for DARPP-32, which is densely expressed in the striatum where it is necessary for synaptic plasticity. We found evidence for our hypothesis that variations in prefrontal dopamine relate to model-based learning, whereas variations in striatal dopamine function relate to model-free learning. Decisions can stem reflexively from their previously associated outcomes or flexibly from deliberative consideration of potential choice outcomes

  12. Extraversion differentiates between model-based and model-free strategies in a reinforcement learning task

    Science.gov (United States)

    Skatova, Anya; Chan, Patricia A.; Daw, Nathaniel D.

    2013-01-01

    Prominent computational models describe a neural mechanism for learning from reward prediction errors, and it has been suggested that variations in this mechanism are reflected in personality factors such as trait extraversion. However, although trait extraversion has been linked to improved reward learning, it is not yet known whether this relationship is selective for the particular computational strategy associated with error-driven learning, known as model-free reinforcement learning, vs. another strategy, model-based learning, which the brain is also known to employ. In the present study we test this relationship by examining whether humans' scores on an extraversion scale predict individual differences in the balance between model-based and model-free learning strategies in a sequentially structured decision task designed to distinguish between them. In previous studies with this task, participants have shown a combination of both types of learning, but with substantial individual variation in the balance between them. In the current study, extraversion predicted worse behavior across both sorts of learning. However, the hypothesis that extraverts would be selectively better at model-free reinforcement learning held up among a subset of the more engaged participants, and overall, higher task engagement was associated with a more selective pattern by which extraversion predicted better model-free learning. The findings indicate a relationship between a broad personality orientation and detailed computational learning mechanisms. Results like those in the present study suggest an intriguing and rich relationship between core neuro-computational mechanisms and broader life orientations and outcomes. PMID:24027514

  13. Model-based and model-free “plug-and-play” building energy efficient control

    International Nuclear Information System (INIS)

    Baldi, Simone; Michailidis, Iakovos; Ravanis, Christos; Kosmatopoulos, Elias B.

    2015-01-01

    Highlights:
    • “Plug-and-play” Building Optimization and Control (BOC) driven by building data.
    • Ability to handle the large-scale and complex nature of the BOC problem.
    • Adaptation to learn the optimal BOC policy when no building model is available.
    • Comparisons with rule-based and advanced BOC strategies.
    • Simulation and real-life experiments in a ten-office building.

    Abstract: Considerable research efforts in Building Optimization and Control (BOC) have been directed toward the development of “plug-and-play” BOC systems that can achieve energy efficiency without compromising thermal comfort and without the need of qualified personnel engaged in a tedious and time-consuming manual fine-tuning phase. In this paper, we report on how a recently introduced Parametrized Cognitive Adaptive Optimization – abbreviated as PCAO – can be used toward the design of both model-based and model-free “plug-and-play” BOC systems, with minimum human effort required to accomplish the design. In the model-based case, PCAO assesses the performance of its control strategy via a simulation model of the building dynamics; in the model-free case, PCAO optimizes its control strategy without relying on any model of the building dynamics. Extensive simulation and real-life experiments performed on a 10-office building demonstrate the effectiveness of the PCAO–BOC system in providing significant energy efficiency and improved thermal comfort. The mechanisms embedded within PCAO render it capable of automatically and quickly learning an efficient BOC strategy either in the presence of complex nonlinear simulation models of the building dynamics (model-based) or when no model for the building dynamics is available (model-free). Comparative studies with alternative state-of-the-art BOC systems show the effectiveness of the PCAO–BOC solution.

  14. Model-Free Coordinated Control for MHTGR-Based Nuclear Steam Supply Systems

    Directory of Open Access Journals (Sweden)

    Zhe Dong

    2016-01-01

    The modular high temperature gas-cooled reactor (MHTGR) is a typical small modular reactor (SMR) that offers a simpler, standardized and safer modular design by being factory built, requiring smaller initial capital investment, and having a shorter construction period. Thanks to their small size, MHTGRs could be beneficial in providing electric power to remote areas that are deficient in transmission or distribution and in generating local power for large population centers. Based on the multi-modular operation scheme, the inherent safety feature of the MHTGR can be applied to large nuclear plants of any desired power rating. The MHTGR-based nuclear steam supply system (NSSS) is constituted by an MHTGR, a side-by-side arranged helical-coil once-through steam generator (OTSG) and some connecting pipes. Due to the side-by-side arrangement, there is a tight coupling effect between the MHTGR and the OTSG. Moreover, there always exists parameter perturbation of the NSSSs. Thus, it is meaningful to study the model-free coordinated control of MHTGR-based NSSSs for safe, stable, robust and efficient operation. In this paper, a new model-free coordinated control strategy that regulates the nuclear power, MHTGR outlet helium temperature and OTSG outlet overheated steam temperature by properly adjusting the control rod position, helium flowrate and feed-water flowrate is established for the MHTGR-based NSSSs. Sufficient conditions for global asymptotic closed-loop stability are given. Finally, numerical simulation results in the cases of large-range power decrease and increase illustrate the satisfactory performance of this newly developed model-free coordinated NSSS control law.

  15. Model-free prediction and regression a transformation-based approach to inference

    CERN Document Server

    Politis, Dimitris N

    2015-01-01

    The Model-Free Prediction Principle expounded upon in this monograph is based on the simple notion of transforming a complex dataset to one that is easier to work with, e.g., i.i.d. or Gaussian. As such, it restores the emphasis on observable quantities, i.e., current and future data, as opposed to unobservable model parameters and estimates thereof, and yields optimal predictors in diverse settings such as regression and time series. Furthermore, the Model-Free Bootstrap takes us beyond point prediction in order to construct frequentist prediction intervals without resort to unrealistic assumptions such as normality. Prediction has been traditionally approached via a model-based paradigm, i.e., (a) fit a model to the data at hand, and (b) use the fitted model to extrapolate/predict future data. Due to both mathematical and computational constraints, 20th century statistical practice focused mostly on parametric models. Fortunately, with the advent of widely accessible powerful computing in the late 1970s, co...

  16. Integrating cortico-limbic-basal ganglia architectures for learning model-based and model-free navigation strategies.

    Science.gov (United States)

    Khamassi, Mehdi; Humphries, Mark D

    2012-01-01

    Behavior in spatial navigation is often organized into map-based (place-driven) vs. map-free (cue-driven) strategies; behavior in operant conditioning research is often organized into goal-directed vs. habitual strategies. Here we attempt to unify the two. We review one powerful theory for distinct forms of learning during instrumental conditioning, namely model-based (maintaining a representation of the world) and model-free (reacting to immediate stimuli) learning algorithms. We extend these lines of argument to propose an alternative taxonomy for spatial navigation, showing how various previously identified strategies can be distinguished as "model-based" or "model-free" depending on the usage of information and not on the type of information (e.g., cue vs. place). We argue that identifying "model-free" learning with dorsolateral striatum and "model-based" learning with dorsomedial striatum could reconcile numerous conflicting results in the spatial navigation literature. From this perspective, we further propose that the ventral striatum plays key roles in the model-building process. We propose that the core of the ventral striatum is positioned to learn the probability of action selection for every transition between states of the world. We further review suggestions that the ventral striatal core and shell are positioned to act as "critics" contributing to the computation of a reward prediction error for model-free and model-based systems, respectively.

  17. Model-Free Primitive-Based Iterative Learning Control Approach to Trajectory Tracking of MIMO Systems With Experimental Validation.

    Science.gov (United States)

    Radac, Mircea-Bogdan; Precup, Radu-Emil; Petriu, Emil M

    2015-11-01

    This paper proposes a novel model-free trajectory tracking approach for multiple-input multiple-output (MIMO) systems based on the combination of iterative learning control (ILC) and primitives. The optimal trajectory tracking solution is obtained in terms of previously learned solutions to simple tasks called primitives. The library of primitives that are stored in memory consists of pairs of reference input/controlled output signals. The reference input primitives are optimized in a model-free ILC framework without using knowledge of the controlled process. The guaranteed convergence of the learning scheme is built upon a model-free virtual reference feedback tuning design of the feedback decoupling controller. Each new complex trajectory to be tracked is decomposed into the output primitives regarded as basis functions. The optimal reference input for the control system to track the desired trajectory is then recomposed from the reference input primitives. This is advantageous because the optimal reference input is computed directly, without the need to learn from repeated executions of the tracking task. In addition, the optimization problem specific to trajectory tracking of square MIMO systems is decomposed into a set of optimization problems assigned to each separate single-input single-output control channel, which ensures a convenient model-free decoupling. The new model-free primitive-based ILC approach is capable of planning, reasoning, and learning. A case study dealing with model-free control tuning for a nonlinear aerodynamic system is included to validate the new approach. Experimental results are given.
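
    The decomposition/recomposition step described above can be pictured as a linear combination problem: the desired output trajectory is expressed in the library of output primitives, and the same coefficients recompose the reference input. The least-squares sketch below is a generic illustration of that idea with synthetic primitives, not the paper's optimized ILC solution.

```python
# Sketch of primitive-based trajectory recomposition (generic, synthetic primitives).
# Library: pairs (reference input r_i, controlled output y_i) learned beforehand by ILC.
import numpy as np

t = np.linspace(0.0, 1.0, 200)
# Hypothetical output primitives and their associated learned reference inputs.
Y = np.stack([np.sin(np.pi * t), np.sin(2 * np.pi * t), t]).T                       # 200 x 3 output basis
R = np.stack([1.1 * np.sin(np.pi * t), 0.9 * np.sin(2 * np.pi * t), 1.2 * t]).T     # matching inputs

# New desired trajectory to track.
y_desired = 0.5 * np.sin(np.pi * t) + 0.3 * t

# Decompose the desired output in the primitive basis (least squares), then
# recompose the reference input from the SAME coefficients.
coeffs, *_ = np.linalg.lstsq(Y, y_desired, rcond=None)
r_new = R @ coeffs
print("primitive coefficients:", np.round(coeffs, 3))
```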

  18. Landmark-based model-free 3D face shape reconstruction from video sequences

    NARCIS (Netherlands)

    van Dam, C.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan; Broemme, A.; Busch, C.

    2013-01-01

    In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence potentially useful video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm

  19. Model-free functional MRI analysis for detecting low-frequency functional connectivity in the human brain

    Science.gov (United States)

    Wismueller, Axel; Lange, Oliver; Auer, Dorothee; Leinsinger, Gerda

    2010-03-01

    Slowly varying, temporally correlated activity fluctuations between functionally related brain areas have been identified by functional magnetic resonance imaging (fMRI) research in recent years. These low-frequency oscillations of less than 0.08 Hz appear to play a major role in various dynamic functional brain networks, such as the so-called 'default mode' network. They have also been observed as a property of symmetric cortices, and they are known to be present in the motor cortex among others. These low-frequency data are difficult to detect and quantify in fMRI. Traditionally, user-defined regions of interest (ROIs) or 'seed clusters' have been the primary analysis method. In this paper, we propose unsupervised clustering algorithms based on various distance measures to detect functional connectivity in resting-state fMRI. The achieved results are evaluated quantitatively for different distance measures. The Euclidean metric implemented by standard unsupervised clustering approaches is compared with a non-metric topographic mapping of proximities based on the mutual prediction error between pixel-specific signal dynamics time series. It is shown that functional connectivity in the motor cortex of the human brain can be detected with such model-free analysis methods for resting-state fMRI.

  20. Oscillator-based assistance of cyclical movements: model-based and model-free approaches

    OpenAIRE

    Ronsse, Renaud; Lenzi, Tommaso; Vitiello, Nicola; Koopman, Bram; van Asseldonk, Edwin; De Rossi, Stefano Marco Maria; van den Kieboom, Jesse; van der Kooij, Herman; Carrozza, Maria Chiara; Ijspeert, Auke Jan

    2011-01-01

    In this article, we propose a new method for providing assistance during cyclical movements. This method is trajectory-free, in the sense that it provides user assistance irrespective of the performed movement, and requires no other sensing than the assisting robot's own encoders. The approach is based on adaptive oscillators, i.e., mathematical tools that are capable of learning the high level features (frequency, envelope, etc.) of a periodic input signal. Here we present two experiments th...

  1. Oscillator-based assistance of cyclical movements: model-based and model-free approaches.

    Science.gov (United States)

    Ronsse, Renaud; Lenzi, Tommaso; Vitiello, Nicola; Koopman, Bram; van Asseldonk, Edwin; De Rossi, Stefano Marco Maria; van den Kieboom, Jesse; van der Kooij, Herman; Carrozza, Maria Chiara; Ijspeert, Auke Jan

    2011-10-01

    In this article, we propose a new method for providing assistance during cyclical movements. This method is trajectory-free, in the sense that it provides user assistance irrespective of the performed movement, and requires no other sensing than the assisting robot's own encoders. The approach is based on adaptive oscillators, i.e., mathematical tools that are capable of learning the high level features (frequency, envelope, etc.) of a periodic input signal. Here we present two experiments that we recently conducted to validate our approach: a simple sinusoidal movement of the elbow, that we designed as a proof-of-concept, and a walking experiment. In both cases, we collected evidence illustrating that our approach indeed assisted healthy subjects during movement execution. Owing to the intrinsic periodicity of daily life movements involving the lower-limbs, we postulate that our approach holds promise for the design of innovative rehabilitation and assistance protocols for the lower-limb, requiring little to no user-specific calibration.
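
    The adaptive oscillators mentioned above are often implemented as adaptive-frequency Hopf oscillators. The sketch below shows the commonly cited frequency-adaptation form as a generic illustration of the tool, not the exact assistive controller of the paper; the gains, initial guess, and input signal are assumptions.

```python
# Adaptive-frequency Hopf oscillator sketch: the oscillator's frequency should drift
# toward the frequency of a periodic input signal (convergence speed depends on eps).
import numpy as np

dt, gamma, mu, eps = 1e-3, 8.0, 1.0, 5.0
omega_true = 2.0 * 2.0 * np.pi          # input at 2 Hz
x, y, omega = 1.0, 0.0, 1.0 * 2.0 * np.pi   # start with a 1 Hz guess

for k in range(int(120.0 / dt)):        # 120 s of simulated time
    F = np.sin(omega_true * k * dt)     # periodic teaching signal
    r = np.hypot(x, y)
    dx = gamma * (mu - r**2) * x - omega * y + eps * F
    dy = gamma * (mu - r**2) * y + omega * x
    domega = -eps * F * y / max(r, 1e-9)
    x, y, omega = x + dt * dx, y + dt * dy, omega + dt * domega

print(f"frequency after adaptation: {omega/(2*np.pi):.2f} Hz (input at 2.00 Hz)")
```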

  2. Kinetics of the Thermal Degradation of Granulated Scrap Tyres: a Model-free Analysis

    Directory of Open Access Journals (Sweden)

    Félix A. LÓPEZ

    2013-12-01

    Pyrolysis is a technology with a promising future in the recycling of scrap tyres. This paper determines the thermal decomposition behaviour and kinetics of granulated scrap tyres (GST) by examining the thermogravimetric/derivative thermogravimetric (TGA/DTG) data obtained during their pyrolysis in an inert atmosphere at different heating rates. The model-free methods of Friedman, Flynn-Wall-Ozawa and Coats-Redfern were used to determine the reaction kinetics from the DTG data. The apparent activation energy and pre-exponential factor for the degradation of GST were calculated. A comparison with the results obtained by other authors was made. DOI: http://dx.doi.org/10.5755/j01.ms.19.4.2947
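
    For reference, the Flynn-Wall-Ozawa relation named above, in its standard form with Doyle's approximation (shown for context rather than reproduced from the paper), relates the heating rate β at a fixed conversion α to the apparent activation energy:

```latex
% Flynn-Wall-Ozawa (isoconversional) relation with Doyle's approximation:
% at fixed conversion alpha, ln(beta) vs 1/T across heating rates beta gives E_a.
\ln \beta \;=\; \ln\!\left(\frac{A\,E_a}{R\,g(\alpha)}\right) \;-\; 5.331 \;-\; 1.052\,\frac{E_a}{R\,T}
```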

  3. Model-free method for isothermal and non-isothermal decomposition kinetics analysis of PET sample

    International Nuclear Information System (INIS)

    Saha, B.; Maiti, A.K.; Ghoshal, A.K.

    2006-01-01

    Pyrolysis, one possible alternative to recover valuable products from waste plastics, has recently been the subject of renewed interest. In the present study, an isoconversional method, the Vyazovkin model-free approach, is applied to study the non-isothermal decomposition kinetics of waste PET samples using various temperature integral approximations, such as the Coats and Redfern, Gorbachev, and Agrawal and Sivasubramanian approximations, and direct integration (recursive adaptive Simpson quadrature scheme) to analyze the decomposition kinetics. The results show that the activation energy (Eα) is a weak but increasing function of conversion (α) in the case of non-isothermal decomposition and a strong, decreasing function of conversion in the case of isothermal decomposition. This indicates the possible existence of nucleation, nuclei growth and gas diffusion mechanisms during non-isothermal pyrolysis, and of nucleation and gas diffusion mechanisms during isothermal pyrolysis. The optimum Eα dependencies on α obtained for the non-isothermal data showed a similar nature for all types of temperature integral approximations

  4. Model-free control

    Science.gov (United States)

    Fliess, Michel; Join, Cédric

    2013-12-01

    'Model-free control' and the corresponding 'intelligent' PID controllers (iPIDs), which have already had many successful concrete applications, are presented here for the first time in a unified manner, where the new advances are taken into account. The basics of model-free control now employ some old functional analysis and some elementary differential algebra. The estimation techniques become quite straightforward via a recent online parameter identification approach. The importance of iPIs, and especially of iPs, is deduced from the presence of friction. The strange industrial ubiquity of classic PIDs, and the great difficulty of tuning them in complex situations, are deduced, via an elementary sampling argument, from their connections with iPIDs. Several numerical simulations are presented, which include some infinite-dimensional systems. They demonstrate not only the power of our intelligent controllers but also the great simplicity of tuning them.
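
    To make the 'intelligent' controller idea concrete, a minimal sketch of an iP loop built on the ultra-local model dy/dt = F + αu, where F is re-estimated at each step from measured data. The toy plant, the numerical derivative, and the gain values are assumptions for illustration, not the authors' tuning.

```python
# Minimal "intelligent proportional" (iP) control sketch based on the ultra-local model
#   dy/dt = F + alpha * u,
# with F estimated online from measurements. Toy plant and gains are illustrative.
import numpy as np

dt, alpha_gain, Kp = 0.01, 2.0, 5.0
y, y_prev, u = 0.0, 0.0, 0.0
y_star = 1.0                      # constant setpoint, so dy*/dt = 0

def plant_step(y, u):
    # Hypothetical "unknown" plant the controller never models explicitly.
    dydt = -0.5 * y + 1.5 * u + 0.3 * np.sin(y)
    return y + dt * dydt

for k in range(1000):
    dy_meas = (y - y_prev) / dt                 # crude numerical derivative of the output
    F_hat = dy_meas - alpha_gain * u            # estimate of the lumped unknown dynamics
    e = y_star - y
    u = (0.0 - F_hat + Kp * e) / alpha_gain     # iP law: (dy*/dt - F_hat + Kp*e) / alpha
    y_prev = y
    y = plant_step(y, u)

print(f"output after 10 s: {y:.3f} (setpoint {y_star})")
```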

  5. The "proactive" model of learning: Integrative framework for model-free and model-based reinforcement learning utilizing the associative learning-based proactive brain concept.

    Science.gov (United States)

    Zsuga, Judit; Biro, Klara; Papp, Csaba; Tajti, Gabor; Gesztelyi, Rudolf

    2016-02-01

    Reinforcement learning (RL) is a powerful concept underlying forms of associative learning governed by the use of a scalar reward signal, with learning taking place if expectations are violated. RL may be assessed using model-based and model-free approaches. Model-based reinforcement learning involves the amygdala, the hippocampus, and the orbitofrontal cortex (OFC). The model-free system involves the pedunculopontine-tegmental nucleus (PPTgN), the ventral tegmental area (VTA) and the ventral striatum (VS). Based on the functional connectivity of the VS, model-free and model-based RL systems center on the VS, which computes value by integrating model-free signals (received as reward prediction errors) and model-based reward-related input. Using the concept of a reinforcement learning agent, we propose that the VS serves as the value function component of the RL agent. Regarding the model utilized for model-based computations, we turned to the proactive brain concept, which offers a ubiquitous function for the default network based on its great functional overlap with contextual associative areas. Hence, by means of the default network the brain continuously organizes its environment into context frames, enabling the formulation of analogy-based associations that are turned into predictions of what to expect. The OFC integrates reward-related information into context frames upon computing reward expectation by compiling stimulus-reward and context-reward information offered by the amygdala and hippocampus, respectively. Furthermore, we suggest that the integration of model-based expectations regarding reward into the value signal is further supported by efferents of the OFC that reach structures canonical for model-free learning (e.g., the PPTgN, VTA, and VS).

  6. Vision-Based Autonomous Underwater Vehicle Navigation in Poor Visibility Conditions Using a Model-Free Robust Control

    Directory of Open Access Journals (Sweden)

    Ricardo Pérez-Alcocer

    2016-01-01

    This paper presents a vision-based navigation system for an autonomous underwater vehicle in semistructured environments with poor visibility. In terrestrial and aerial applications, the use of visual systems mounted on robotic platforms as control sensor feedback is commonplace. However, robotic vision-based tasks for underwater applications are still not widely considered, as the images captured in this type of environment tend to be blurred and/or color depleted. To tackle this problem, we have adapted the lαβ color space to identify features of interest in underwater images even in extreme visibility conditions. To guarantee the stability of the vehicle at all times, a model-free robust control is used. We have validated the performance of our visual navigation system in real environments, showing the feasibility of our approach.

  7. Language acquisition is model-based rather than model-free.

    Science.gov (United States)

    Wang, Felix Hao; Mintz, Toben H

    2016-01-01

    Christiansen & Chater (C&C) propose that learning language is learning to process language. However, we believe that the general-purpose prediction mechanism they propose is insufficient to account for many phenomena in language acquisition. We argue from theoretical considerations and empirical evidence that many acquisition tasks are model-based, and that different acquisition tasks require different, specialized models.

  8. Model-Free Visualization of Suspicious Lesions in Breast MRI Based on Supervised and Unsupervised Learning.

    Science.gov (United States)

    Twellmann, Thorsten; Meyer-Baese, Anke; Lange, Oliver; Foo, Simon; Nattkemper, Tim W

    2008-03-01

    Dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) has become an important tool in breast cancer diagnosis, but evaluation of multitemporal 3D image data holds new challenges for human observers. To aid the image analysis process, we apply supervised and unsupervised pattern recognition techniques for computing enhanced visualizations of suspicious lesions in breast MRI data. These techniques represent an important component of future sophisticated computer-aided diagnosis (CAD) systems and support the visual exploration of spatial and temporal features of DCE-MRI data stemming from patients with confirmed lesion diagnosis. By taking into account the heterogeneity of cancerous tissue, these techniques reveal signals with malignant, benign and normal kinetics. They also provide a regional subclassification of pathological breast tissue, which is the basis for pseudo-color presentations of the image data. Intelligent medical systems are expected to have substantial implications in healthcare politics by contributing to the diagnosis of indeterminate breast lesions by non-invasive imaging.

  9. Transcranial direct current stimulation of right dorsolateral prefrontal cortex does not affect model-based or model-free reinforcement learning in humans.

    Science.gov (United States)

    Smittenaar, Peter; Prichard, George; FitzGerald, Thomas H B; Diedrichsen, Joern; Dolan, Raymond J

    2014-01-01

    There is broad consensus that the prefrontal cortex supports goal-directed, model-based decision-making. Consistent with this, we have recently shown that model-based control can be impaired through transcranial magnetic stimulation of right dorsolateral prefrontal cortex in humans. We hypothesized that an enhancement of model-based control might be achieved by anodal transcranial direct current stimulation of the same region. We tested 22 healthy adult human participants in a within-subject, double-blind design in which participants were given Active or Sham stimulation over two sessions. We show Active stimulation had no effect on model-based control or on model-free ('habitual') control compared to Sham stimulation. These null effects are substantiated by a power analysis, which suggests that our study had at least 60% power to detect a true effect, and by a Bayesian model comparison, which favors a model of the data that assumes stimulation had no effect over models that assume stimulation had an effect on behavioral control. Although we cannot entirely exclude more trivial explanations for our null effect, for example related to (faults in) our experimental setup, these data suggest that anodal transcranial direct current stimulation over right dorsolateral prefrontal cortex does not improve model-based control, despite existing evidence that transcranial magnetic stimulation can disrupt such control in the same brain region.

  10. Textural features of dynamic contrast-enhanced MRI derived model-free and model-based parameter maps in glioma grading.

    Science.gov (United States)

    Xie, Tian; Chen, Xiao; Fang, Jingqin; Kang, Houyi; Xue, Wei; Tong, Haipeng; Cao, Peng; Wang, Sumei; Yang, Yizeng; Zhang, Weiguo

    2017-08-28

    Presurgical glioma grading by dynamic contrast-enhanced MRI (DCE-MRI) has unresolved issues. The aim of this study was to investigate the ability of textural features derived from pharmacokinetic model-based or model-free parameter maps of DCE-MRI to discriminate between different grades of gliomas, and their correlation with pathological indices. This was a retrospective study of forty-two adults with brain gliomas, imaged at 3.0T with conventional anatomic sequences and DCE-MRI sequences (variable flip angle T1-weighted imaging and three-dimensional gradient echo volumetric imaging). Regions of interest were drawn on the cross-sectional images with the maximal tumor lesion. Five commonly used textural features, including Energy, Entropy, Inertia, Correlation, and Inverse Difference Moment (IDM), were generated. All textural features of the model-free parameters (initial area under curve [IAUC], maximal signal intensity [Max SI], maximal up-slope [Max Slope]) could effectively differentiate between grade II (n = 15), grade III (n = 13), and grade IV (n = 14) gliomas. IDM of four DCE-MRI parameters, including Max SI, Max Slope (model-free parameters), vp (Extended Tofts), and vp (Patlak), could differentiate grade III from grade IV gliomas. IDM of Patlak-based Ktrans and vp could differentiate grade II (n = 15) from grade III (n = 13) gliomas. IDM of Extended Tofts- and Patlak-based vp showed the highest area under the curve in discriminating between grade III and IV gliomas. However, the intraclass correlation coefficients (ICC) of these features revealed relatively low inter-observer agreement. No significant correlation was found between microvascular density and textural features, whereas a moderate correlation was found between the cellular proliferation index and those features. Textural features of DCE-MRI parameter maps displayed a good ability in glioma grading. Level of Evidence: 3. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2017.
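
    The five textural features named above are classic statistics commonly computed from a gray-level co-occurrence matrix (GLCM). A self-contained sketch of their computation from a small image patch (pure NumPy, single offset, illustrative only, not the study's exact pipeline) is shown below.

```python
# Gray-level co-occurrence matrix (GLCM) and the five textural features named above.
# Single horizontal offset, pure NumPy; illustrative, not the study's exact pipeline.
import numpy as np

def glcm_features(patch: np.ndarray, levels: int = 8):
    # Quantize the patch to a small number of gray levels.
    q = np.floor(levels * (patch - patch.min()) / (np.ptp(patch) + 1e-12)).astype(int)
    q = np.clip(q, 0, levels - 1)
    # Co-occurrence counts for pixel pairs one step to the right.
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return {
        "Energy": (p ** 2).sum(),
        "Entropy": -(p[p > 0] * np.log2(p[p > 0])).sum(),
        "Inertia": ((i - j) ** 2 * p).sum(),                      # a.k.a. contrast
        "Correlation": ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j + 1e-12),
        "IDM": (p / (1.0 + (i - j) ** 2)).sum(),                  # inverse difference moment
    }

print(glcm_features(np.random.default_rng(3).random((32, 32))))
```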

  11. Systematic errors in detecting biased agonism: Analysis of current methods and development of a new model-free approach.

    Science.gov (United States)

    Onaran, H Ongun; Ambrosio, Caterina; Uğur, Özlem; Madaras Koncz, Erzsebet; Grò, Maria Cristina; Vezzi, Vanessa; Rajagopal, Sudarshan; Costa, Tommaso

    2017-03-14

    Discovering biased agonists requires a method that can reliably distinguish bias in signalling due to unbalanced activation of diverse transduction proteins from the differential amplification inherent to the system being studied, which invariably results from the non-linear nature of biological signalling networks and their measurement. We have systematically compared the performance of seven methods of bias diagnostics, all of which are based on the analysis of concentration-response curves of ligands according to classical receptor theory. We computed bias factors for a number of β-adrenergic agonists by comparing BRET assays of receptor-transducer interactions with Gs, Gi and arrestin. Using the same ligands, we also compared responses at signalling steps originating from the same receptor-transducer interaction, among which no biased efficacy is theoretically possible. In either case, we found a high level of false positive results and a general lack of correlation among methods. Altogether this analysis shows that all tested methods, including some of the most widely used in the literature, fail to distinguish true ligand bias from "system bias" with confidence. We also propose two novel semi-quantitative methods of bias diagnostics that appear to be more robust and reliable than currently available strategies.

  12. Can model-free reinforcement learning explain deontological moral judgments?

    Science.gov (United States)

    Ayars, Alisabeth

    2016-05-01

    Dual-systems frameworks propose that moral judgments are derived from both an immediate emotional response and controlled/rational cognition. Recently Cushman (2013) proposed a new dual-system theory based on model-free and model-based reinforcement learning. Model-free learning attaches values to actions based on their history of reward and punishment, and explains some deontological, non-utilitarian judgments. Model-based learning involves the construction of a causal model of the world and allows for far-sighted planning; this form of learning fits well with utilitarian considerations that seek to maximize certain kinds of outcomes. I present three concerns regarding the use of model-free reinforcement learning to explain deontological moral judgment. First, many actions that humans find aversive from model-free learning are not judged to be morally wrong. Moral judgment must require something in addition to model-free learning. Second, there is a dearth of evidence for central predictions of the reinforcement account, e.g., that people with different reinforcement histories will, all else equal, make different moral judgments. Finally, accounting for the effect of intention within the framework requires certain assumptions which lack support. These challenges are reasonable foci for future empirical/theoretical work on the model-free/model-based framework.

  13. Model-free stabilization by extremum seeking

    CERN Document Server

    Scheinker, Alexander

    2017-01-01

    With this brief, the authors present algorithms for model-free stabilization of unstable dynamic systems. An extremum-seeking algorithm assigns the role of a cost function to the dynamic system’s control Lyapunov function (clf), aiming at its minimization. The minimization of the clf drives the clf to zero and achieves asymptotic stabilization. This approach does not rely on, or require knowledge of, the system model. Instead, it employs periodic perturbation signals along with the clf. The same effect is achieved as by using clf-based feedback laws that profit from modeling knowledge, but in a time-average sense. Rather than use integrals of the system’s vector field, we employ Lie-bracket-based (i.e., derivative-based) averaging. The brief contains numerous examples and applications, including examples with unknown control directions and experiments with charged particle accelerators. It is intended for theoretical control engineers and mathematicians, and practitioners working in various industrial areas ...
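
    A toy version of the scheme described above (clf as the cost, bounded periodic perturbation, Lie-bracket averaging), applied to a scalar unstable system. The parameter choices and discretization are assumptions; under averaging, the perturbation acts like the feedback -(k·α/2)∇V.

```python
# Toy extremum-seeking stabilization of the unstable scalar system  dx/dt = x + u,
# using the control Lyapunov function V(x) = x^2 as the "cost" to minimize.
# u = sqrt(alpha*omega) * cos(omega*t + k*V(x)); the averaged dynamics behave like
# dx/dt ≈ x - k*alpha*x, which is stable for k*alpha > 1. Values are illustrative.
import numpy as np

dt, t_end = 1e-4, 5.0
omega = 200.0 * 2.0 * np.pi   # fast dither frequency
alpha, k = 2.0, 2.0           # k*alpha = 4 > 1

x, t = 1.5, 0.0
while t < t_end:
    V = x ** 2
    u = np.sqrt(alpha * omega) * np.cos(omega * t + k * V)
    x += dt * (x + u)         # open-loop-unstable plant with ES feedback
    t += dt

print(f"|x| after {t_end} s: {abs(x):.3f} (started at 1.50)")
```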

  14. Is there any correlation between model-based perfusion parameters and model-free parameters of time-signal intensity curve on dynamic contrast enhanced MRI in breast cancer patients?

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Boram; Kang, Doo Kyoung; Kim, Tae Hee [Ajou University School of Medicine, Department of Radiology, Suwon, Gyeonggi-do (Korea, Republic of); Yoon, Dukyong [Ajou University School of Medicine, Department of Biomedical Informatics, Suwon (Korea, Republic of); Jung, Yong Sik; Kim, Ku Sang [Ajou University School of Medicine, Department of Surgery, Suwon (Korea, Republic of); Yim, Hyunee [Ajou University School of Medicine, Department of Pathology, Suwon (Korea, Republic of)

    2014-05-15

    To find out whether there is any correlation between dynamic contrast-enhanced (DCE) model-based parameters and model-free parameters, and to evaluate correlations between perfusion parameters and histologic prognostic factors. Model-based parameters (Ktrans, Kep and Ve) of 102 invasive ductal carcinomas were obtained using DCE-MRI and post-processing software. Correlations between model-based and model-free parameters and between perfusion parameters and histologic prognostic factors were analysed. Mean Kep was significantly higher in cancers showing initial rapid enhancement (P = 0.002) and a delayed washout pattern (P = 0.001). Ve was significantly lower in cancers showing a delayed washout pattern (P = 0.015). Kep correlated significantly with time to peak enhancement (TTP) (ρ = -0.33, P < 0.001) and washout slope (ρ = 0.39, P = 0.002). Ve was significantly correlated with TTP (ρ = 0.33, P = 0.002). Mean Kep was higher in tumours with high nuclear grade (P = 0.017). Mean Ve was lower in tumours with high histologic grade (P = 0.005) and in tumours with negative oestrogen receptor status (P = 0.047). TTP was shorter in tumours with negative oestrogen receptor status (P = 0.037). We could acquire general information about the tumour vascular physiology, interstitial space volume and pathologic prognostic factors by analyzing the time-signal intensity curve, without the complicated acquisition process required for the model-based parameters. (orig.)

  15. Model-free thermodynamics of fluid vesicles.

    Science.gov (United States)

    Diamant, Haim

    2011-12-01

    Motivated by a long-standing debate concerning the nature and interrelations of surface-tension variables in fluid membranes, we reformulate the thermodynamics of a membrane vesicle as a generic two-dimensional finite system enclosing a three-dimensional volume. The formulation is shown to require two tension variables, conjugate to the intensive constraints of area per molecule and volume-to-area ratio. We obtain the relation between these two variables in various scenarios, as well as their correspondence to other definitions of tension variables for membranes. Several controversies related to membrane tension are thereby resolved on a model-free thermodynamic level. The thermodynamic formulation may be useful also for treating large-scale properties of vesicles that are insensitive to the membrane's detailed statistical mechanics and interactions.

  16. Hyper-chaos encryption using convolutional masking and model free unmasking

    International Nuclear Information System (INIS)

    Qi Guo-Yuan; Matondo Sandra Bazebo

    2014-01-01

    In this paper, during the masking process the encrypted message is convolved and embedded into a Qi hyper-chaotic system characterized by a high degree of disorder. The masking scheme was tested using both Qi hyper-chaos and Lorenz chaos, and indicated that Qi hyper-chaos-based masking can resist filtering and power spectrum analysis attacks, while the Lorenz-based scheme fails for high-amplitude data. To unmask the message at the receiving end, two methods are proposed. In the first method, a model-free synchronizer, i.e. a multivariable higher-order differential feedback controller between the transmitter and receiver, is employed to de-convolve the message embedded in the received signal. In the second method, no synchronization is required since the message is de-convolved using the information of the estimated derivative.

  17. Model-free adaptive sliding mode controller design for generalized ...

    Indian Academy of Sciences (India)

    L M WANG

    2017-08-16

    Aug 16, 2017 ... A novel model-free adaptive sliding mode strategy is proposed for a generalized projective synchronization (GPS) ... the neural network theory, a model-free adaptive sliding mode controller is designed to guarantee asymptotic stability of the generalized ...

  18. A Model-Free Diagnostic for Single-Peakedness of Item Responses Using Ordered Conditional Means

    Science.gov (United States)

    Polak, Marike; De Rooij, Mark; Heiser, Willem J.

    2012-01-01

    In this article we propose a model-free diagnostic for single-peakedness (unimodality) of item responses. Presuming a unidimensional unfolding scale and a given item ordering, we approximate item response functions of all items based on ordered conditional means (OCM). The proposed OCM methodology is based on Thurstone & Chave's (1929) "criterion…

  19. Model-free 3D face shape reconstruction from video sequences

    NARCIS (Netherlands)

    van Dam, C.; Veldhuis, Raymond N.J.; Spreeuwers, Lieuwe Jan

    In forensic comparison of facial video data, often only the best quality frontal face frames are selected, and hence much video data is ignored. To improve 2D facial comparison for law enforcement and forensic investigation, we introduce a model-free 3D shape reconstruction algorithm based on 2D

  20. Formation of model-free motor memories during motor adaptation depends on perturbation schedule.

    Science.gov (United States)

    Orban de Xivry, Jean-Jacques; Lefèvre, Philippe

    2015-04-01

    Motor adaptation to an external perturbation relies on several mechanisms such as model-based, model-free, strategic, or repetition-dependent learning. Depending on the experimental conditions, each of these mechanisms has more or less weight in the final adaptation state. Here we focused on the conditions that lead to the formation of a model-free motor memory (Huang VS, Haith AM, Mazzoni P, Krakauer JW. Neuron 70: 787-801, 2011), i.e., a memory that does not depend on an internal model or on the size or direction of the errors experienced during the learning. The formation of such model-free motor memory was hypothesized to depend on the schedule of the perturbation (Orban de Xivry JJ, Ahmadi-Pajouh MA, Harran MD, Salimpour Y, Shadmehr R. J Neurophysiol 109: 124-136, 2013). Here we built on this observation by directly testing the nature of the motor memory after abrupt or gradual introduction of a visuomotor rotation, in an experimental paradigm where the presence of model-free motor memory can be identified (Huang VS, Haith AM, Mazzoni P, Krakauer JW. Neuron 70: 787-801, 2011). We found that relearning was faster after abrupt than gradual perturbation, which suggests that model-free learning is reduced during gradual adaptation to a visuomotor rotation. In addition, the presence of savings after abrupt introduction of the perturbation but gradual extinction of the motor memory suggests that unexpected errors are necessary to induce a model-free motor memory. Overall, these data support the hypothesis that different perturbation schedules do not lead to a more or less stabilized motor memory but to distinct motor memories with different attributes and neural representations.

  1. Self-consistent residual dipolar coupling based model-free analysis for the robust determination of nanosecond to microsecond protein dynamics

    International Nuclear Information System (INIS)

    Lakomek, Nils-Alexander; Walter, Korvin F. A.; Fares, Christophe; Lange, Oliver F.; Groot, Bert L. de; Grubmueller, Helmut; Brueschweiler, Rafael; Munk, Axel; Becker, Stefan; Meiler, Jens; Griesinger, Christian

    2008-01-01

    Residual dipolar couplings (RDCs) provide information about the dynamic average orientation of inter-nuclear vectors and amplitudes of motion up to milliseconds. They complement relaxation methods, especially on a time-scale window that we have called supra-τc (τc < supra-τc < 50 μs). Here we present a robust approach called Self-Consistent RDC-based Model-free analysis (SCRM) that delivers RDC-based order parameters, independent of the details of the structure used for alignment tensor calculation, as well as the dynamic average orientation of the inter-nuclear vectors in the protein structure in a self-consistent manner. For ubiquitin, the SCRM analysis yields an average RDC-derived order parameter of the NH vectors ⟨S²rdc(NH)⟩ = 0.72 ± 0.02, compared to ⟨S²LS⟩ = 0.778 ± 0.003 for the Lipari-Szabo order parameters, indicating that the inclusion of the supra-τc window increases the averaged amplitude of mobility observed in the sub-τc window by about 34%. For the β-strand spanned by residues Lys48 to Leu50, an alternating pattern of backbone NH RDC order parameters S²rdc(NH) = (0.59, 0.72, 0.59) was extracted. The backbone of Lys48, whose side chain is known to be involved in the poly-ubiquitylation process that leads to protein degradation, is very mobile on the supra-τc time scale (S²rdc(NH) = 0.59 ± 0.03), while it is inconspicuous (S²LS(NH) = 0.82) on the sub-τc as well as on the μs-ms relaxation dispersion time scales. The results of this work differ from previous RDC dynamics studies of ubiquitin in the sense that the results are essentially independent of structural noise, providing a much more robust assessment of dynamic effects that underlie the RDC data.

  2. Model-free adaptive sliding mode controller design for generalized ...

    Indian Academy of Sciences (India)

    L M WANG

    2017-08-16

    A novel model-free adaptive sliding mode strategy is proposed for a generalized projective synchronization (GPS) between two entirely unknown fractional-order chaotic systems subject to the external disturbances. To solve the difficulties from the little knowledge about the master–slave system ...

  3. Stress enhances model-free reinforcement learning only after negative outcome.

    Science.gov (United States)

    Park, Heyeon; Lee, Daeyeol; Chey, Jeanyung

    2017-01-01

    Previous studies found that stress shifts behavioral control by promoting habits while decreasing goal-directed behaviors during reward-based decision-making. It is, however, unclear how stress disrupts the relative contribution of the two systems controlling reward-seeking behavior, i.e. model-free (or habit) and model-based (or goal-directed). Here, we investigated whether stress biases the contribution of model-free and model-based reinforcement learning processes differently depending on the valence of outcome, and whether stress alters the learning rate, i.e., how quickly information from the new environment is incorporated into choices. Participants were randomly assigned to either a stress or a control condition, and performed a two-stage Markov decision-making task in which the reward probabilities underwent periodic reversals without notice. We found that stress increased the contribution of model-free reinforcement learning only after negative outcome. Furthermore, stress decreased the learning rate. The results suggest that stress diminishes one's ability to make adaptive choices in multiple aspects of reinforcement learning. This finding has implications for understanding how stress facilitates maladaptive habits, such as addictive behavior, and other dysfunctional behaviors associated with stress in clinical and educational contexts.
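
    The record summarizes the model-free/model-based distinction and the role of the learning rate without giving equations. A minimal sketch of the kind of model-free temporal-difference update typically fitted to such two-stage tasks is shown below; the toy task, the reward probabilities and the parameter values (learning rate alpha, inverse temperature beta) are illustrative assumptions, not the authors' model.

    ```python
    import numpy as np

    def softmax(q, beta):
        """Convert action values into choice probabilities (inverse temperature beta)."""
        z = beta * (q - q.max())
        p = np.exp(z)
        return p / p.sum()

    def td_update(q, action, reward, alpha):
        """One model-free temporal-difference (Q-learning-style) update."""
        q = q.copy()
        q[action] += alpha * (reward - q[action])  # prediction-error-driven learning
        return q

    # Toy two-armed choice, illustrating how the learning rate shapes value updates.
    rng = np.random.default_rng(0)
    q = np.zeros(2)
    alpha, beta = 0.3, 5.0                 # hypothetical learning rate and choice stochasticity
    reward_prob = np.array([0.7, 0.3])     # made-up reward probabilities
    for t in range(200):
        a = rng.choice(2, p=softmax(q, beta))
        r = float(rng.random() < reward_prob[a])
        q = td_update(q, a, r, alpha)
    print("learned action values:", q)
    ```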

  4. On the model-free control of an experimental greenhouse

    OpenAIRE

    Lafont, Frédéric; Pessel, Nathalie; Balmat, Jean-François; Fliess, Michel

    2013-01-01

    International audience; In spite of a large technical literature, an efficient climate control of a greenhouse remains a very difficult task. Indeed, this process is a complex nonlinear system with strong meteorological disturbances. The newly introduced ''model-free control'' setting is employed here. It is easy to implement, and has already shown excellent performances in many other concrete domains. Successful experimental tests are presented and discussed. They are compared to a Boolean a...

  5. Model-free adaptive control of advanced power plants

    Science.gov (United States)

    Cheng, George Shu-Xing; Mulkey, Steven L.; Wang, Qiang

    2015-08-18

    A novel 3-Input-3-Output (3×3) Model-Free Adaptive (MFA) controller with a set of artificial neural networks as part of the controller is introduced. A 3×3 MFA control system using the inventive 3×3 MFA controller is described to control key process variables including Power, Steam Throttle Pressure, and Steam Temperature of boiler-turbine-generator (BTG) units in conventional and advanced power plants. Those advanced power plants may comprise Once-Through Supercritical (OTSC) Boilers, Circulating Fluidized-Bed (CFB) Boilers, and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  6. Model-Free Trajectory Optimisation for Unmanned Aircraft Serving as Data Ferries for Widespread Sensors

    Directory of Open Access Journals (Sweden)

    Ben Pearre

    2012-10-01

    Full Text Available Given multiple widespread stationary data sources such as ground-based sensors, an unmanned aircraft can fly over the sensors and gather the data via a wireless link. Performance criteria for such a network may incorporate costs such as trajectory length for the aircraft or the energy required by the sensors for radio transmission. Planning is hampered by the complex vehicle and communication dynamics and by uncertainty in the locations of sensors, so we develop a technique based on model-free learning. We present a stochastic optimisation method that allows the data-ferrying aircraft to optimise data collection trajectories through an unknown environment in situ, obviating the need for system identification. We compare two trajectory representations, one that learns near-optimal trajectories at low data requirements but that fails at high requirements, and one that gives up some performance in exchange for a data collection guarantee. With either encoding the ferry is able to learn significantly improved trajectories compared with alternative heuristics. To demonstrate the versatility of the model-free learning approach, we also learn a policy to minimise the radio transmission energy required by the sensor nodes, allowing prolonged network lifetime.
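
    The abstract describes in-situ, model-free optimisation of a parameterised trajectory against a measured cost. The toy sketch below illustrates that general idea with a simple random-perturbation search over waypoint parameters; the cost function, sensor layout and acceptance rule are invented for illustration and are not the paper's trajectory encodings or learning algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    sensors = np.array([[0.0, 0.0], [5.0, 2.0], [9.0, -1.0]])   # hypothetical sensor positions

    def measured_cost(waypoints):
        """Stand-in for an in-situ measurement: trajectory length plus a penalty
        for passing far from each sensor (i.e., a poor radio link)."""
        pts = np.vstack([[0.0, 0.0], waypoints, [10.0, 0.0]])
        length = np.sum(np.linalg.norm(np.diff(pts, axis=0), axis=1))
        link_penalty = sum(np.min(np.linalg.norm(pts - s, axis=1)) for s in sensors)
        return length + 2.0 * link_penalty

    # Model-free stochastic search: perturb the parameterised trajectory, keep improvements.
    waypoints = rng.normal(scale=1.0, size=(4, 2))
    best = measured_cost(waypoints)
    for _ in range(500):
        trial = waypoints + rng.normal(scale=0.2, size=waypoints.shape)
        c = measured_cost(trial)
        if c < best:                       # greedy acceptance; no dynamics model is used
            waypoints, best = trial, c
    print("final cost:", round(best, 2))
    ```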

  7. A generalized mean-squared displacement from inelastic fixed window scans of incoherent neutron scattering as a model-free indicator of anomalous diffusion confinement

    International Nuclear Information System (INIS)

    Roosen-Runge, F.; Seydel, T.

    2015-01-01

    Elastic fixed window scans of incoherent neutron scattering are an established and frequently employed method to study dynamical changes, usually over a broad temperature range or during a process such as a conformational change in the sample. In particular, the apparent mean-squared displacement can be extracted via a model-free analysis based on a solid physical interpretation as an effective amplitude of molecular motions. Here, we provide a new account of elastic and inelastic fixed window scans, defining a generalized mean-squared displacement for all fixed energy transfers. We show that this generalized mean-squared displacement in principle contains all information on the real mean-squared displacement accessible in the instrumental time window. The derived formula provides a clear understanding of the effects of instrumental resolution on the apparent mean-squared displacement. Finally, we show that the generalized mean-squared displacement can be used as a model-free indicator of confinement effects within the instrumental time window. (authors)
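
    For orientation, a commonly used Gaussian-approximation relation for the apparent mean-squared displacement from an elastic scan is shown below (standard in the field, not quoted from this paper); ⟨u²⟩ denotes the apparent MSD and q the momentum transfer. The generalized quantity defined in the paper extends this idea to nonzero fixed energy transfers.

    ```latex
    S_{\mathrm{el}}(q,\,\omega \approx 0) \;\propto\; \exp\!\left(-\tfrac{1}{3}\,\langle u^{2}\rangle\, q^{2}\right)
    \qquad\Longrightarrow\qquad
    \langle u^{2}\rangle \;=\; -3\,\frac{\partial \ln S_{\mathrm{el}}(q)}{\partial\, q^{2}} .
    ```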

  8. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    -and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most...

  9. Model-free approach to the estimation of radiation hazards. I. Theory

    International Nuclear Information System (INIS)

    Zaider, M.; Brenner, D.J.

    1986-01-01

    The experience of the Japanese atomic bomb survivors constitutes to date the major data base for evaluating the effects of low doses of ionizing radiation on human populations. Although numerous analyses have been performed and published concerning this experience, it is clear that no consensus has emerged as to the conclusions that may be drawn to assist in setting realistic radiation protection guidelines. In part this is an inherent consequence of the rather limited amount of data available. In this paper the authors address an equally important problem; namely, the use of arbitrary parametric risk models which have little theoretical foundation, yet almost totally determine the final conclusions drawn. They propose the use of a model-free approach to the estimation of radiation hazards.

  10. Base compaction specification feasibility analysis.

    Science.gov (United States)

    2012-12-01

    The objective of this research is to establish the technical engineering and cost : analysis concepts that will enable WisDOT management to objectively evaluate the : feasibility of switching construction specification philosophies for aggregate base...

  11. Model-free reinforcement learning operates over information stored in working-memory to drive human choices

    OpenAIRE

    Feher da Silva, Carolina; Yao, Yuan-Wei; Hare, Todd A

    2017-01-01

    Model-free learning creates stimulus-response associations, but are there limits to the types of stimuli it can operate over? Most experiments on reward-learning have used discrete sensory stimuli, but there is no algorithmic reason to restrict model-free learning to external stimuli, and theories suggest that model-free processes may operate over highly abstract concepts and goals. Our study aimed to determine whether model-free learning can operate over environmental states defined by infor...

  12. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis re-uses commonly seen terms in the Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  13. Thermal characterization and model free kinetics of aged epoxies and foams using TGA and DSC methods.

    Energy Technology Data Exchange (ETDEWEB)

    Cordaro, Joseph Gabriel; Kruizenga, Alan Michael; Nissen, April

    2013-10-01

    Two classes of materials, poly(methylene diphenyl diisocyanate) or PMDI foam, and cross-linked epoxy resins, were characterized using thermal gravimetric analysis (TGA) and differential scanning calorimetry (DSC), to help understand the effects of aging and "bake-out". The materials were evaluated for mass loss and the onset of decomposition. In some experiments, volatile materials released during heating were analyzed via mass spectroscopy. In all, over twenty materials were evaluated to compare the mass loss and onset temperature for decomposition. Model free kinetic (MFK) measurements, acquired using variable heating rate TGA experiments, were used to calculate the apparent activation energy of thermal decomposition. From these compiled data the effects of aging, bake-out, and sample history on the thermal stability of materials were compared. No significant differences between aged and unaged materials were detected. Bake-out did slightly affect the onset temperature of decomposition but only at the highest bake-out temperatures. Finally, some recommendations for future handling are made.
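
    The report states that model-free kinetics were computed from variable-heating-rate TGA runs but does not name the isoconversional method. As an illustration only, the sketch below applies an Ozawa-Flynn-Wall-style estimate, in which ln(beta) plotted against 1/T at a fixed conversion has slope -1.052*Ea/R; the heating rates and temperatures are made-up numbers, not data from the report.

    ```python
    import numpy as np

    R = 8.314  # gas constant, J/(mol K)

    # Hypothetical temperatures (K) reached at one fixed conversion level for three heating rates (K/min).
    betas = np.array([5.0, 10.0, 20.0])
    T_alpha = np.array([620.0, 633.0, 647.0])

    # Ozawa-Flynn-Wall: ln(beta) ~ const - 1.052 * Ea / (R * T_alpha)
    slope, intercept = np.polyfit(1.0 / T_alpha, np.log(betas), 1)
    Ea = -slope * R / 1.052          # apparent activation energy in J/mol at this conversion
    print(f"apparent activation energy ~ {Ea / 1000:.0f} kJ/mol")
    ```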

  14. A Fluid Structure Algorithm with Lagrange Multipliers to Model Free Swimming

    Science.gov (United States)

    Sahin, Mehmet; Dilek, Ezgi

    2017-11-01

    A new monolithic approach is proposed to solve the fluid-structure interaction (FSI) problem with Lagrange multipliers in order to model free swimming/flying. In the present approach, the fluid domain is modeled by the incompressible Navier-Stokes equations and discretized using an Arbitrary Lagrangian-Eulerian (ALE) formulation based on the stable side-centered unstructured finite volume method. The solid domain is modeled by the constitutive laws for the nonlinear Saint Venant-Kirchhoff material and the classical Galerkin finite element method is used to discretize the governing equations in a Lagrangian frame. In order to impose the body motion/deformation, the distance between the constraint pair nodes is imposed using the Lagrange multipliers, which is independent from the frame of reference. The resulting algebraic linear equations are solved in a fully coupled manner using a dual approach (null space method). The present numerical algorithm is initially validated for the classical FSI benchmark problems and then applied to the free swimming of three linked ellipses. The authors are grateful for the use of the computing resources provided by the National Center for High Performance Computing (UYBHM) under Grant Number 10752009 and the computing facilities at TUBITAK-ULAKBIM, High Performance and Grid Computing Center.

  15. Relation chain based clustering analysis

    Science.gov (United States)

    Zhang, Cheng-ning; Zhao, Ming-yang; Luo, Hai-bo

    2011-08-01

    Clustering analysis is currently one of the well-developed branches of data mining technology, which aims to find the hidden structures in a multidimensional space called the feature or pattern space. A datum in the space usually takes a vector form, and the elements of the vector represent several specifically selected features. These features are usually chosen for their relevance to the problem at hand. Generally, clustering analysis falls into two divisions: one is based on agglomerative clustering methods, and the other is based on divisive clustering methods. The former refers to a bottom-up process which regards each datum as a singleton cluster, while the latter refers to a top-down process which regards the entire data set as one cluster. From the collected literature, it is noted that divisive clustering currently dominates both application and research. Although some well-known divisive clustering methods have been designed and developed, clustering problems are still far from being solved. The k-means algorithm is the original divisive clustering method; it requires some important index values to be assigned initially, such as the number of clusters and the initial cluster prototype positions, which may not be reasonable in certain situations. Beyond this initialization problem, the k-means algorithm may also fall into local optima, clusters in a rigid way, and is not suitable for non-Gaussian distributions. One can see that seeking a good or natural clustering result, in fact, originates from one's understanding of the concept of clustering. Thus, confusion or misunderstanding of the definition of clustering often yields unsatisfactory clustering results. One should consider the definition deeply and seriously. This paper demonstrates the nature of clustering, gives a way of understanding clustering, discusses the methodology of designing a clustering algorithm, and proposes a new clustering method based on relation chains among 2D patterns. In
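
    Since the paragraph discusses the k-means algorithm and its sensitivity to initialization and local optima, a minimal from-scratch sketch is included for reference; the data and parameters are toy values, and the code is not connected to the relation-chain method proposed in the paper.

    ```python
    import numpy as np

    def kmeans(X, k, n_iter=100, seed=0):
        """Minimal k-means: random initial prototypes, then alternate assignment/update steps."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]      # initial prototypes
        for _ in range(n_iter):
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = d.argmin(axis=1)                               # assignment step
            new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                    else centers[j] for j in range(k)])
            if np.allclose(new_centers, centers):                   # converged (possibly a local optimum)
                break
            centers = new_centers
        return labels, centers

    # Two toy 2D Gaussian blobs; different seeds can give different local optima.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal([0, 0], 0.5, (50, 2)), rng.normal([3, 3], 0.5, (50, 2))])
    labels, centers = kmeans(X, k=2)
    print("cluster centers:\n", centers.round(2))
    ```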

  16. Oscillator-based assistance of cyclical movements: model-based and model-free approaches

    NARCIS (Netherlands)

    Ronsse, Renaud; Lenzi, Tommaso; Vitiello, Nicola; Koopman, Bram; van Asseldonk, Edwin H.F.; de Rossi, Stefano Marco Maria; van den Kieboom, Jesse; van der Kooij, Herman; Carozza, Maria Chiara; IJspeert, Auke Jan

    2011-01-01

    In this article, we propose a new method for providing assistance during cyclical movements. This method is trajectory-free, in the sense that it provides user assistance irrespective of the performed movement, and requires no other sensing than the assisting robot’s own encoders. The approach is

  17. Model-free adaptive control optimization using a chaotic particle swarm approach

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br; Rodrigues Coelho, Antonio Augusto [Department of Automation and Systems, Federal University of Santa Catarina, Box 476, 88040-900 Florianopolis, Santa Catarina (Brazil)], E-mail: aarc@das.ufsc.br

    2009-08-30

    It is well known that conventional control theories are widely suited for applications where the processes can be reasonably described in advance. However, when the plant's dynamics are hard to characterize precisely or are subject to environmental uncertainties, one may encounter difficulties in applying the conventional controller design methodologies. Despite the difficulty in achieving high control performance, the fine tuning of controller parameters is a tedious task that always requires experts with knowledge in both control theory and process information. Nowadays, more and more studies have focused on the development of adaptive control algorithms that can be directly applied to complex processes whose dynamics are poorly modeled and/or have severe nonlinearities. In this context, this paper presents the design of a Model-Free Learning Adaptive Control (MFLAC) based on pseudo-gradient concepts, with its optimization procedure carried out by a Particle Swarm Optimization (PSO) approach using a constriction coefficient and Henon chaotic sequences (CPSOH). PSO is a stochastic global optimization technique inspired by the social behavior of bird flocking. PSO models the exploration of a problem space by a population of particles. Each particle in PSO has a randomized velocity associated with it, which moves through the space of the problem. Since chaotic mapping enjoys certainty, ergodicity and the stochastic property, the proposed CPSOH employs a chaotic map, which adds flexibility to the particle movements in each iteration. The chaotic sequences also allow exploration at early stages and exploitation at later stages of the CPSOH search procedure. The motivation for applying the CPSOH approach is to overcome a limitation of the conventional MFLAC design, which cannot guarantee satisfactory control performance when the plant gain varies over the operational range and the controller is tuned by user trial-and-error. Numerical results of the MFLAC with
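
    The abstract names a PSO variant with a constriction coefficient and Henon chaotic sequences (CPSOH) but gives no formulas. The sketch below shows one plausible reading of that combination: the Clerc-Kennedy constriction factor with the usual uniform random coefficients replaced by rescaled Henon-map iterates, applied to a stand-in cost function rather than the MFLAC tuning problem described in the paper.

    ```python
    import numpy as np

    def henon_sequence(n, x0=0.1, y0=0.1, a=1.4, b=0.3):
        """Henon-map iterates, rescaled to (0, 1), used in place of uniform random draws."""
        xs = np.empty(n)
        x, y = x0, y0
        for i in range(n):
            x, y = 1.0 - a * x * x + y, b * x
            xs[i] = x
        return (xs - xs.min()) / (xs.max() - xs.min() + 1e-12)

    def sphere(p):                      # stand-in cost; the paper tunes an MFLAC controller instead
        return float(np.sum(p ** 2))

    dim, n_particles, iters = 4, 20, 200
    c1 = c2 = 2.05
    phi = c1 + c2
    chi = 2.0 / abs(2.0 - phi - np.sqrt(phi * phi - 4.0 * phi))   # constriction coefficient ~0.729

    rng = np.random.default_rng(0)
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([sphere(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()
    chaos = henon_sequence(2 * iters * n_particles * dim).reshape(iters, 2, n_particles, dim)

    for t in range(iters):
        r1, r2 = chaos[t, 0], chaos[t, 1]        # chaotic coefficients instead of uniform noise
        vel = chi * (vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos))
        pos = pos + vel
        costs = np.array([sphere(p) for p in pos])
        improved = costs < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
        gbest = pbest[pbest_cost.argmin()].copy()

    print("best cost:", round(pbest_cost.min(), 6))
    ```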

  18. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to hold subjective views of the individuals who participate in a brainstorming session and that SWOT factors are not prioritized by their significance thus it may result in an improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  19. Data-Driven Model-Free Adaptive Control of Particle Quality in Drug Development Phase of Spray Fluidized-Bed Granulation Process

    Directory of Open Access Journals (Sweden)

    Zhengsong Wang

    2017-01-01

    Full Text Available A novel data-driven model-free adaptive control (DDMFAC) approach is first proposed by combining the advantages of model-free adaptive control (MFAC) and data-driven optimal iterative learning control (DDOILC), and then its stability and convergence analysis is given to prove the stability of the algorithm and the asymptotic convergence of the tracking error. In addition, the parameters of the presented approach are adaptively adjusted with fuzzy logic to determine the respective proportions of MFAC and DDOILC according to their different control performances in different control stages. Lastly, the proposed fuzzy DDMFAC (FDDMFAC) approach is applied to the control of particle quality in the drug development phase of the spray fluidized-bed granulation process (SFBGP), and its control effect is compared with MFAC and DDOILC and their fuzzy forms, in which the parameters of MFAC and DDOILC are adaptively adjusted with fuzzy logic. The effectiveness of the presented FDDMFAC approach is verified by a series of simulations.

  20. Simple Model-Free Controller for the Stabilization of Planetary Inverted Pendulum

    Directory of Open Access Journals (Sweden)

    Huanhuan Mai

    2012-01-01

    Full Text Available A simple model-free controller is presented for solving nonlinear dynamic control problems. As an example of such a problem, a planetary gear-type inverted pendulum (PIP) is discussed. To control this inherently unstable system, which requires real-time control responses, the design of a smart and simple controller is necessary. The proposed model-free controller includes a swing-up controller part and a stabilization controller part; neither controller has any information about the PIP. Since the input/output scaling parameters of the fuzzy controller are highly sensitive, we use a genetic algorithm (GA) to obtain the optimal control parameters. The experimental results show the effectiveness and robustness of the present controller.

  1. Model-free control and fault accommodation for an experimental greenhouse

    OpenAIRE

    Lafont, Frédéric; Balmat, Jean-François; Pessel, Nathalie; Fliess, Michel

    2014-01-01

    International audience; Greenhouse climate control is important in modern agriculture. It is also rather difficult to design: as a matter of fact, writing down a "good" mathematical model that takes into account strong meteorological disturbances might be an impossible task. The control is synthesized here via a new "model-free" setting, which yields an "intelligent" proportional feedback controller whose tuning is straightforward, and even simpler than the intelligent proportio...

  2. A model-free control strategy for an experimental greenhouse with an application to fault accommodation

    OpenAIRE

    Lafont, Frédéric; Balmat, Jean-François; Pessel, Nathalie; Fliess, Michel

    2014-01-01

    International audience; Writing down mathematical models of agricultural greenhouses and regulating them via advanced controllers are challenging tasks since strong perturbations, like meteorological variations, have to be taken into account. This is why we are developing here a new model-free control approach and the corresponding intelligent controllers, where the need of a good model disappears. This setting, which has been introduced quite recently and is easy to implement, is already suc...

  3. Complex Wavelet Based Modulation Analysis

    DEFF Research Database (Denmark)

    Luneau, Jean-Marc; Lebrun, Jérôme; Jensen, Søren Holdt

    2008-01-01

    Low-frequency modulation of sound carries important information for speech and music. The modulation spectrum is commonly obtained by spectral analysis of only the temporal envelopes of the sub-bands obtained from a time-frequency analysis. Processing in this domain usually creates undesirable distortions...... polynomial trends. Moreover, an analytic Hilbert-like transform is possible with complex wavelets implemented as an orthogonal filter bank. By working in an alternative transform domain coined as “Modulation Subbands”, this transform shows very promising denoising capabilities and suggests new approaches for joint...

  4. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    Since the acoustic characteristics of ejective sounds differ from the corresponding voiced and voiceless pulmonic sound conjugates, mainly in the source of excitation, epoch-based analysis is useful, in addition to the spectral or spectrographic analysis. The most frequently used features for analysis of ejective sounds are: ...

  5. Model-free tests of equality in binary data under an incomplete block design.

    Science.gov (United States)

    Lui, Kung-Jong; Zhu, Lixia

    2018-02-16

    Using Prescott's model-free approach, we develop an asymptotic procedure and an exact procedure for testing equality between treatments with binary responses under an incomplete block crossover design. We employ Monte Carlo simulation and note that these test procedures not only perform well in small-sample cases but also outperform the corresponding test procedures, published elsewhere, that account only for patients with discordant responses. We use data taken from part of a crossover trial comparing two different doses of an analgesic with placebo for the relief of primary dysmenorrhea to illustrate the use of the test procedures discussed here.

  6. Model-free polarized neutron diffraction study of an acentric crystal: Metamagnetic UCoAl

    International Nuclear Information System (INIS)

    Papoular, R.J.; Delapalme, A.

    1994-01-01

    For the first time, a model-free procedure is developed to analyze polarized neutron diffraction data pertaining to acentric crystals. It consists of a two-step process, featuring first an effective flipping ratio and second a linear inverse problem. The latter is solved either by a new generalized inverse Fourier transform or by using maximum entropy. Using metamagnetic UCoAl as a test case, we find the following results: (i) the U and Co(2) moments increase with an applied magnetic field whereas the Co(1) moment remains almost constant, (ii) the U and Co(2) magnetic densities are weakly anisotropic

  7. Model-free adaptive control of supercritical circulating fluidized-bed boilers

    Science.gov (United States)

    Cheng, George Shu-Xing; Mulkey, Steven L

    2014-12-16

    A novel 3-Input-3-Output (3×3) Fuel-Air Ratio Model-Free Adaptive (MFA) controller is introduced, which can effectively control key process variables including Bed Temperature, Excess O2, and Furnace Negative Pressure of combustion processes of advanced boilers. A novel 7-Input-7-Output (7×7) MFA control system is also described for controlling a combined 3-Input-3-Output (3×3) process of Boiler-Turbine-Generator (BTG) units and a 5×5 CFB combustion process of advanced boilers. Those boilers include Circulating Fluidized-Bed (CFB) Boilers and Once-Through Supercritical Circulating Fluidized-Bed (OTSC CFB) Boilers.

  8. Target Tracking Based Scene Analysis

    Science.gov (United States)

    1984-08-01

    [No abstract available; the record contains only OCR fragments of citations to the proceedings of the NATO Advanced Study Institute on Image Sequence Processing and Dynamic Scene Analysis, Braunlage/Harz, FRG, June 21 - July 2, 1982 (Springer, Berlin, 1983).]

  9. Evidence based practice readiness: A concept analysis.

    Science.gov (United States)

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in health literature, but without a clear understanding of what readiness means. Concept analysis is needed to define the meaning of evidence based practice readiness. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as antecedents and consequences. A Boolean search of PubMed and Cumulative Index for Nursing and Allied Health Literature was conducted and limited to those published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness (nursing, training, equipping and leadership support) are necessary to achieve evidence based practice readiness. Nurse managers are in the position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.

  10. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  11. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS) an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  12. Preference-Based Recommendations for OLAP Analysis

    Science.gov (United States)

    Jerbi, Houssem; Ravat, Franck; Teste, Olivier; Zurfluh, Gilles

    This paper presents a framework for integrating OLAP and recommendations. We focus on the anticipatory recommendation process that assists the user during his OLAP analysis by suggesting the forthcoming analysis step. We present a context-aware preference model that matches decision-makers' intuition, and we discuss a preference-based approach for generating personalized recommendations.

  13. Team-Based Care: A Concept Analysis.

    Science.gov (United States)

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of a common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) patient-centered approach, and (c) integrated care process. This is accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified with three aspects: (a) patient, (b) healthcare professional, and (c) healthcare organization. This concept analysis helps to better understand the characteristics of team-based care in clinical practice and promotes the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  14. The PIT-trap-A "model-free" bootstrap procedure for inference about regression models with discrete, multivariate responses.

    Science.gov (United States)

    Warton, David I; Thibaut, Loïc; Wang, Yi Alice

    2017-01-01

    Bootstrap methods are widely used in statistics, and bootstrapping of residuals can be especially useful in the regression context. However, difficulties are encountered extending residual resampling to regression settings where residuals are not identically distributed (thus not amenable to bootstrapping)-common examples including logistic or Poisson regression and generalizations to handle clustered or multivariate data, such as generalised estimating equations. We propose a bootstrap method based on probability integral transform (PIT-) residuals, which we call the PIT-trap, which assumes data come from some marginal distribution F of known parametric form. This method can be understood as a type of "model-free bootstrap", adapted to the problem of discrete and highly multivariate data. PIT-residuals have the key property that they are (asymptotically) pivotal. The PIT-trap thus inherits the key property, not afforded by any other residual resampling approach, that the marginal distribution of data can be preserved under PIT-trapping. This in turn enables the derivation of some standard bootstrap properties, including second-order correctness of pivotal PIT-trap test statistics. In multivariate data, bootstrapping rows of PIT-residuals affords the property that it preserves correlation in data without the need for it to be modelled, a key point of difference as compared to a parametric bootstrap. The proposed method is illustrated on an example involving multivariate abundance data in ecology, and demonstrated via simulation to have improved properties as compared to competing resampling methods.
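
    The abstract explains PIT residuals and row-wise resampling in words. A much simplified sketch of that idea for a two-variable Poisson regression is given below (randomized PIT residuals, a row bootstrap, and inversion under the fitted marginal model); it is not the authors' implementation and omits the inferential machinery built on top of the resampling.

    ```python
    import numpy as np
    from scipy import stats
    import statsmodels.api as sm

    rng = np.random.default_rng(0)

    # Toy multivariate count data: n sites, two response variables, one covariate.
    n = 100
    x = rng.normal(size=n)
    X = sm.add_constant(x)
    Y = np.column_stack([rng.poisson(np.exp(0.5 + 0.8 * x)),
                         rng.poisson(np.exp(0.2 - 0.5 * x))])

    def pit_residuals(y, mu, rng):
        """Randomized PIT residuals for discrete data: uniform on [F(y-1), F(y)] under the model."""
        lo = stats.poisson.cdf(y - 1, mu)
        hi = stats.poisson.cdf(y, mu)
        return lo + rng.uniform(size=y.shape) * (hi - lo)

    fits = [sm.GLM(Y[:, j], X, family=sm.families.Poisson()).fit() for j in range(Y.shape[1])]
    mu = np.column_stack([f.fittedvalues for f in fits])
    U = pit_residuals(Y, mu, rng)              # rows keep the between-variable correlation

    # One PIT-trap resample: bootstrap whole rows of U, then invert the PIT at each site's fitted means.
    idx = rng.integers(0, n, size=n)
    Y_star = stats.poisson.ppf(U[idx], mu)     # pseudo-data with the marginal model preserved
    print(Y_star[:5])
    ```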

  15. Robust Mediation Analysis Based on Median Regression

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
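
    A hedged sketch of the central idea is shown below: the a- and b-paths of a single-mediator model are estimated by median (0.5-quantile) regression so that the indirect effect a*b is robust to heavy-tailed errors. The data, variable names, and the simple product-of-coefficients summary are illustrative and do not reproduce the authors' estimator or its inference.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 300
    x = rng.normal(size=n)                                   # predictor / treatment
    m = 0.6 * x + rng.standard_t(df=3, size=n)               # mediator with heavy-tailed errors
    y = 0.5 * m + 0.2 * x + rng.standard_t(df=3, size=n)     # outcome
    df = pd.DataFrame({"x": x, "m": m, "y": y})

    # Median (0.5-quantile) regressions for the a-path (x -> m) and b-path (m -> y, given x).
    a_fit = smf.quantreg("m ~ x", df).fit(q=0.5)
    b_fit = smf.quantreg("y ~ m + x", df).fit(q=0.5)
    a, b = a_fit.params["x"], b_fit.params["m"]
    print(f"robust indirect effect a*b ~ {a * b:.3f}")        # compare with an OLS-based a*b
    ```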

  16. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.

    2008-01-01

    ' measurements of processes related to living cells, i.e., systems without lysing the cells. The focus is on chip based amperometric and impedimetric cell analysis systems where measurements utilizing solely carbon fiber microelectrodes (CFME) and other nonchip electrode formats, such as CFME for exocytosis...

  17. Node-based analysis of species distributions

    DEFF Research Database (Denmark)

    Borregaard, Michael Krabbe; Rahbek, Carsten; Fjeldså, Jon

    2014-01-01

    with case studies on two groups with well-described biogeographical histories: a local-scale community data set of hummingbirds in the North Andes, and a large-scale data set of the distribution of all species of New World flycatchers. The node-based analysis of these two groups generates a set...

  18. Practice-related changes in neural activation patterns investigated via wavelet-based clustering analysis

    Science.gov (United States)

    Lee, Jinae; Park, Cheolwoo; Dyckman, Kara A.; Lazar, Nicole A.; Austin, Benjamin P.; Li, Qingyang; McDowell, Jennifer E.

    2012-01-01

    Objectives To evaluate brain activation using functional magnetic resonance imaging (fMRI) and specifically, activation changes across time associated with practice-related cognitive control during eye movement tasks. Experimental design Participants were engaged in antisaccade performance (generating a glance away from a cue) while fMR images were acquired during two separate time points: 1) at pre-test before any exposure to the task, and 2) at post-test, after one week of daily practice on antisaccades, prosaccades (glancing towards a target) or fixation (maintaining gaze on a target). Principal observations The three practice groups were compared across the two time points, and analyses were conducted via the application of a model-free clustering technique based on wavelet analysis. This series of procedures was developed to avoid analysis problems inherent in fMRI data and was composed of several steps: detrending, data aggregation, wavelet transform and thresholding, no trend test, principal component analysis and K-means clustering. The main clustering algorithm was built in the wavelet domain to account for temporal correlation. We applied a no trend test based on wavelets to significantly reduce the high dimension of the data. We clustered the thresholded wavelet coefficients of the remaining voxels using the principal component analysis K-means clustering. Conclusion Over the series of analyses, we found that the antisaccade practice group was the only group to show decreased activation from pre- to post-test in saccadic circuitry, particularly evident in supplementary eye field, frontal eye fields, superior parietal lobe, and cuneus. PMID:22505290
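
    The abstract lists the steps of the analysis chain (detrending, wavelet transform and thresholding, dimension reduction, k-means clustering). The sketch below walks through a heavily simplified version of those steps on synthetic voxel time series using PyWavelets and scikit-learn; it omits the no-trend test and all fMRI-specific preprocessing, and the wavelet choice, threshold rule and cluster count are assumptions rather than the authors' settings.

    ```python
    import numpy as np
    import pywt
    from scipy.signal import detrend
    from sklearn.decomposition import PCA
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    # Synthetic "voxel" time series: 200 voxels x 128 time points, two underlying response types.
    t = np.arange(128)
    task = np.sin(2 * np.pi * t / 32)
    voxels = np.vstack([task + 0.5 * rng.normal(size=128) for _ in range(100)] +
                       [0.5 * rng.normal(size=(100, 128))])

    voxels = detrend(voxels, axis=1)                          # step 1: detrending

    def wavelet_features(ts, wavelet="db4", level=3):
        """Steps 2-3: discrete wavelet transform and soft thresholding of the coefficients."""
        coeffs = pywt.wavedec(ts, wavelet, level=level)
        thr = np.median(np.abs(coeffs[-1])) / 0.6745 * np.sqrt(2 * np.log(len(ts)))
        coeffs = [pywt.threshold(c, thr, mode="soft") for c in coeffs]
        return np.concatenate(coeffs)

    W = np.vstack([wavelet_features(v) for v in voxels])

    # Steps 4-5: dimension reduction with PCA, then k-means clustering in the reduced space.
    Z = PCA(n_components=5).fit_transform(W)
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(Z)
    print("cluster sizes:", np.bincount(labels))
    ```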

  19. Examples of model-free implant restorations using Cerec inLab 4.0 software.

    Science.gov (United States)

    Reich, S; Schley, J; Kern, T; Fiedler, K; Wolfart, S

    2012-01-01

    This case report demonstrates two ways to fabricate model-free implant restorations with the Cerec inLab 4.0 software. Because the patient, a woman with a history of periodontal disease, did not wish to have a removable partial denture, implant therapy was planned for the restoration of her edentulous areas 14/15 and 24/25. In addition, the restoration was to provide functional relief of the natural maxillary anterior teeth. The two implants for the first quadrant were planned as single-tooth restorations. Each was designed as a full contour implant supra-structure using the Cerec Biogeneric abutment design technique. After completing the design phase, each restoration proposal was split into two parts: a zirconia abutment and a lithium disilicate crown. For the restoration of the second quadrant, custom 20-degree-angled abutments were individualized and acquired with the Cerec camera. A block crown was then designed, milled in burn-out acrylic resin, and fabricated from a lithium disilicate glass-ceramic ingot according to the press ceramic technique. Additionally, methods of provisional restoration are discussed.

  20. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidences they provide are indirect. Furthermore, RNA and corresponding protein levels have been known to have poor correlation. On the other hand, MS-based proteomics tend to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomics profile. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.

  1. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge; Nielsen, Mads; Lo, Pechin Chien Pau

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based...... on measured lung function instead of on manually annotated regions of interest (ROIs). A quantitative measure of COPD is obtained by fusing COPD probabilities computed in ROIs within the lung fields where the individual ROI probabilities are computed using a k nearest neighbor (kNN) classifier. The distance...... and subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density...

  2. Watershed-based Morphometric Analysis: A Review

    Science.gov (United States)

    Sukristiyanti, S.; Maria, R.; Lestiana, H.

    2018-02-01

    Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning. Morphometric analysis of a watershed is the best method to identify the relationships of various aspects within the area. Although many technical papers have dealt with this area of study, there is no particular standard classification and implication for each parameter, and it can be confusing to evaluate the value of each morphometric parameter. This paper deals with the meaning of the values of the various morphometric parameters, with adequate contextual information. A critical review is presented on each classification, the range of values, and their implications. Besides classification and its impact, the authors are also concerned with the quality of the input data, both in data preparation and in the scale/detail level of mapping. This review paper hopefully gives a comprehensive explanation to assist upcoming research dealing with morphometric analysis.

  3. A dictionary based informational genome analysis

    Directory of Open Access Journals (Sweden)

    Castellini Alberto

    2012-09-01

    Full Text Available Abstract Background In the post-genomic era several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years accounts for several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes. Results Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces a local sequence analysis. A software prototype applying the methodology outlined here carried out some computations on genomic data. We computed informational indexes and built the genomic dictionaries with different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. The validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of vivid interest in biology (for example, to compute excessively represented functional sequences, such as promoters), was discussed, and suggested a method to define synthetic genetic networks. Conclusions We introduced a methodology based on dictionaries, and an efficient motif-finding software application for comparative genomics. This approach could be extended along many investigation lines, namely exported to other contexts of computational genomics, as a basis for the discrimination of genomic pathologies.
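
    As a concrete illustration of the genomic-dictionary idea, the sketch below counts all k-mers of a toy sequence and computes one simple informational index (the Shannon entropy of the k-mer frequency distribution); the paper's actual indexes and software are not reproduced here.

    ```python
    from collections import Counter
    import math

    def kmer_dictionary(genome, k):
        """Count every length-k factor (k-mer) occurring in the sequence."""
        return Counter(genome[i:i + k] for i in range(len(genome) - k + 1))

    def kmer_entropy(counts):
        """Shannon entropy (bits) of the empirical k-mer frequency distribution."""
        total = sum(counts.values())
        return -sum((c / total) * math.log2(c / total) for c in counts.values())

    seq = "ACGTACGTGGTTACGTAACCGGTTACGT"   # toy sequence; a real analysis would use a whole genome
    d = kmer_dictionary(seq, k=3)
    print(len(d), "distinct 3-mers, entropy =", round(kmer_entropy(d), 3), "bits")
    ```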

  4. Energy-based dynamic reliability analysis

    International Nuclear Information System (INIS)

    Fukushima, S.; Mizutani, M.; Akao, Y.; Katukura, H.

    1993-01-01

    Dynamic reliability analysis (DRA) of structures subjected to strong ground motions is one of the most important parts of the safety assessment of structures. So far many DRA methods based on probabilistic theory have been developed. Generally, DRA methods are divided into three phases: seismic hazard analysis, fragility synthesis, and evaluation of the probability of failure. In most DRA methods, the seismic hazard curve of peak ground acceleration (PGA) or of spectral acceleration at a certain period is evaluated in the seismic hazard analysis to show the occurrence rate of the input motion, and, in the fragility synthesis, random vibration theory is adopted to calculate the conditional probability of failure of structures given the occurrence of the input motion. Since random vibration theory is developed in the frequency domain, PGAs are transformed into information in the frequency domain by means of introducing so-called peak factors and power spectra. This fact suggests that it is more convenient for DRA researchers to define the seismic hazard using indices defined in the frequency domain from the outset. In this paper, DRAs based on two indices, i.e. PGA and total energy, are investigated in detail. Through the study, the features and issues of DRA based on total energy (hereafter called EDRA) are clarified from the viewpoint of the responses of the structure. (author)

  5. Videography-Based Unconstrained Video Analysis.

    Science.gov (United States)

    Li, Kang; Li, Sheng; Oh, Sangmin; Fu, Yun

    2017-05-01

    Video analysis and understanding play a central role in visual intelligence. In this paper, we aim to analyze unconstrained videos, by designing features and approaches to represent and analyze videography styles in the videos. Videography denotes the process of making videos. The unconstrained videos are defined as the long duration consumer videos that usually have diverse editing artifacts and significant complexity of contents. We propose to construct a videography dictionary, which can be utilized to represent every video clip as a sequence of videography words. In addition to semantic features, such as foreground object motion and camera motion, we also incorporate two novel interpretable features to characterize videography, including the scale information and the motion correlations. We then demonstrate that, by using statistical analysis methods, the unique videography signatures extracted from different events can be automatically identified. For real-world applications, we explore the use of videography analysis for three types of applications, including content-based video retrieval, video summarization (both visual and textual), and videography-based feature pooling. In the experiments, we evaluate the performance of our approach and other methods on a large-scale unconstrained video dataset, and show that the proposed approach significantly benefits video analysis in various ways.

  6. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    The letter focuses on an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step proceeds if and only if the current step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting parameters at run time and running dynamically. An analysis coded in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESIII offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are either implemented as standard modules or inherited from modules in BOSS. The interface and its framework have been tested in performing physics analysis. (authors)

  7. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  8. Web-based Analysis Services Report

    CERN Document Server

    AUTHOR|(CDS)2108758; Canali, Luca; Grancher, Eric; Lamanna, Massimo; McCance, Gavin; Mato Vila, Pere; Piparo, Danilo; Moscicki, Jakub; Pace, Alberto; Brito Da Rocha, Ricardo; Simko, Tibor; Smith, Tim; Tejedor Saavedra, Enric; CERN. Geneva. IT Department

    2017-01-01

    Web-based services (cloud services) are an important trend for innovating end-user services while optimising service operational costs. CERN users are constantly proposing new approaches (inspired by services existing on the web, tools used in education or other sciences, or based on their experience with existing computing services). In addition, industry and open source communities have recently made available a large number of powerful and attractive tools and platforms that enable large scale data processing. “Big Data” software stacks notably provide solutions for scalable storage, distributed compute and data analysis engines, data streaming, and web-based interfaces (notebooks). Some of those platforms and tools, typically available as open source products, are experiencing very fast adoption in industry and science, such that they are becoming “de facto” references in several areas of data engineering, data science and machine learning. In parallel to users' requests, WLCG is considering to c...

  9. Fault-based analysis of flexible ciphers

    Directory of Open Access Journals (Sweden)

    V.I.Korjik

    2002-07-01

    Full Text Available We consider the security of some flexible ciphers against differential fault analysis (DFA). We present a description of the fault-based attack on two kinds of flexible ciphers. The first kind is represented by the fast software-oriented cipher based on data-dependent subkey selection (DDSS), in which flexibility corresponds to the use of key-dependent operations. The second kind is represented by a DES-like cryptosystem, GOST, with secret S-boxes. In general, the use of some secret operations and procedures contributes to the security of the cryptosystem; however, the degree of this contribution depends significantly on the structure of the encryption mechanism. It is shown how to attack the DDSS-based flexible cipher using DFA, even though this cipher is secure against standard variants of differential and linear cryptanalysis. We also give an outline of the ciphers RC5 and GOST, showing that they are also insecure against the DFA-based attack. We further suggest a modification of the DDSS mechanism and a variant of the advanced DDSS-based flexible cipher that is secure against attacks based on random hardware faults.

  10. FINANCIAL ANALYSIS BASED ON THE ANNUAL BALANCE

    Directory of Open Access Journals (Sweden)

    ILIE RĂSCOLEAN

    2015-12-01

    Full Text Available This article shows that an important place in the statistical analysis of a company's financial management belongs to the analysis of the balance sheet of assets, which represents a certain state of the capital in terms of its existence, material structure and the financial results obtained. It also represents a consequence of the development of the processes forming the company's object of activity, constituting its financial framework. The balance sheet of assets relies on the liquidity-exigibility analysis and highlights the company's insolvency risk, namely the firm's incapacity to honour its commitments to third parties. The financial analysis aims to examine the company's balance statements and, especially, to measure the financial profitability, starting from the balance sheet and the profit and loss account and from any other information given by the trading company or that could be obtained based on them. The authors of this article believe that, through the financial analysis of the assets, one can determine the net patrimony as well as the accounting value of the shareholders' shares, establish the liquidity and solvency of the company, and evaluate its financial performance, detecting possible situations of financial imbalance that could affect the continuity of the activity.

  11. Risk analysis based on hazards interactions

    Science.gov (United States)

    Rossi, Lauro; Rudari, Roberto; Trasforini, Eva; De Angeli, Silvia; Becker, Joost

    2017-04-01

    Despite an increasing need for open, transparent, and credible multi-hazard risk assessment methods, models, and tools, the availability of the comprehensive risk information needed to inform disaster risk reduction is limited, and the level of interaction across hazards is not systematically analysed. Risk assessment methodologies for different hazards often produce risk metrics that are not comparable. Hazard interactions (the consecutive occurrence of two or more different events) are generally neglected, resulting in strongly underestimated risk in the most exposed areas. This study presents cases of interaction between different hazards, showing how subsidence can affect coastal and river flood risk (Jakarta and Bandung, Indonesia) or how flood risk is modified after a seismic event (Italy). The analysis of well-documented real study cases, based on a combination of Earth Observation and in-situ data, would serve as a basis for the formalisation of a multi-hazard methodology, identifying gaps and research frontiers. Multi-hazard risk analysis is performed through the RASOR platform (Rapid Analysis and Spatialisation Of Risk). A scenario-driven query system allows users to simulate future scenarios based on existing and assumed conditions, to compare them with historical scenarios, and to model multi-hazard risk both before and during an event (www.rasor.eu).

  12. NASA Lunar Base Wireless System Propagation Analysis

    Science.gov (United States)

    Hwu, Shian U.; Upanavage, Matthew; Sham, Catherine C.

    2007-01-01

    There have been many radio wave propagation studies using both experimental and theoretical techniques over recent years. However, most of these studies have been in support of commercial cellular phone wireless applications. The signal frequencies are mostly at the commercial cellular and Personal Communications Service (PCS) bands. The antenna configurations are mostly one on a high tower and one near the ground, to simulate communications between a cellular base station and a mobile unit. There is great interest in wireless communication and sensor systems for NASA lunar missions because of the emerging importance of establishing permanent lunar human exploration bases. Because of the specific lunar terrain geometries and RF frequencies of interest to the NASA missions, much of the published literature for the commercial cellular and PCS bands of 900 and 1800 MHz may not be directly applicable to the lunar base wireless system and environment. There are various communication and sensor configurations required to support all elements of a lunar base, for example, communications between astronauts, between astronauts and lunar vehicles, and between lunar vehicles and satellites in lunar orbit. There are also various wireless sensor systems linking scientific and experimental sensors to data collection ground stations. This presentation illustrates the propagation analysis of the lunar wireless communication and sensor systems, taking into account the three-dimensional terrain multipath effects. It is observed that the propagation characteristics are significantly affected by the presence of the lunar terrain. The obtained results indicate that the lunar surface material, terrain geometry and antenna location are important factors affecting the propagation characteristics of lunar wireless systems. The path loss can be much more severe than in free space propagation and is greatly affected by the antenna height, surface material and operating frequency. The
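
    For orientation only (this relation is standard and is not taken from the cited study), the free-space baseline against which terrain-induced excess loss is usually measured is the Friis free-space path loss, with d the link distance and f the carrier frequency:

    ```latex
    \mathrm{FSPL\,[dB]} \;=\; 20\log_{10} d \;+\; 20\log_{10} f \;+\; 20\log_{10}\!\frac{4\pi}{c}
    \;\approx\; 20\log_{10} d_{\mathrm{km}} \;+\; 20\log_{10} f_{\mathrm{MHz}} \;+\; 32.45 .
    ```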

  13. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Full Text Available Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  14. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  15. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint or iris based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. By a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features’ dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.
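
    As a rough illustration of the pipeline described above (spatio-temporal correlation of shifted silhouettes, DFT features, normalisation, principal component projection), the Python sketch below uses random binary volumes in place of CASIA silhouettes; the shift range, feature length, and the absence of a final classifier are all simplifying assumptions.

        import numpy as np

        def correlation_feature(silhouettes, max_shift=4):
            """silhouettes: (T, H, W) binary array for one walking sequence.

            Correlate the volume with copies shifted along t, y and x, then take
            DFT magnitudes of the correlation profile as the raw gait feature.
            """
            vol = silhouettes.astype(float)
            corrs = []
            for axis in (0, 1, 2):                     # temporal, vertical, horizontal
                for s in range(1, max_shift + 1):
                    corrs.append((vol * np.roll(vol, s, axis=axis)).sum())
            feat = np.abs(np.fft.rfft(np.asarray(corrs)))   # DFT-based features
            return feat / (np.linalg.norm(feat) + 1e-12)    # normalise against noise

        # Toy example: six random "sequences", then a principal component projection.
        rng = np.random.default_rng(0)
        X = np.stack([correlation_feature(rng.integers(0, 2, (20, 32, 24)))
                      for _ in range(6)])
        Xc = X - X.mean(axis=0)
        _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
        print((Xc @ Vt[:3].T).shape)    # reduced feature vectors, one row per sequence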

  16. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    Cooper, S.E.; Lofgren, E.V.; Samanta, P.K.; Wong Seemeng

    1993-01-01

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  17. Online UV-visible spectroscopy and multivariate curve resolution as powerful tool for model-free investigation of laccase-catalysed oxidation.

    Science.gov (United States)

    Kandelbauer, A; Kessler, W; Kessler, R W

    2008-03-01

    The laccase-catalysed transformation of indigo carmine (IC) with and without a redox active mediator was studied using online UV-visible spectroscopy. Deconvolution of the mixture spectra obtained during the reaction was performed on a model-free basis using multivariate curve resolution (MCR). Thereby, the time courses of educts, products, and reaction intermediates involved in the transformation were reconstructed without prior mechanistic assumptions. Furthermore, the spectral signature of a reactive intermediate which could not have been detected by a classical hard-modelling approach was extracted from the chemometric analysis. The findings suggest that the combined use of UV-visible spectroscopy and MCR may lead to unexpectedly deep mechanistic evidence otherwise buried in the experimental data. Thus, although it is a rather unspecific method, UV-visible spectroscopy can prove useful in the monitoring of chemical reactions when combined with MCR. This offers a wide range of chemists a cheap and readily available, highly sensitive tool for chemical reaction online monitoring.
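
    The following Python sketch shows the core of a model-free multivariate curve resolution by simple alternating least squares, assuming the mixture spectra are stacked in a matrix D (time x wavelength) factorised as D ≈ C S^T with non-negative concentration profiles C and pure spectra S; the synthetic two-component data and the plain non-negativity clipping are assumptions, and dedicated MCR packages add the closure and selectivity constraints used in practice.

        import numpy as np

        def mcr_als(D, n_components, n_iter=200, seed=0):
            """Model-free resolution D ~= C @ S.T with non-negative C and S,
            obtained by alternating least squares."""
            rng = np.random.default_rng(seed)
            C = rng.random((D.shape[0], n_components))
            for _ in range(n_iter):
                S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
                C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
            return C, S

        # Synthetic first-order reaction A -> B monitored by UV-visible spectra.
        t = np.linspace(0, 1, 50)[:, None]
        wl = np.linspace(0, 1, 120)[None, :]
        C_true = np.hstack([np.exp(-3 * t), 1 - np.exp(-3 * t)])
        S_true = np.vstack([np.exp(-((wl - 0.3) / 0.05) ** 2),
                            np.exp(-((wl - 0.7) / 0.08) ** 2)])
        D = C_true @ S_true + 0.01 * np.random.default_rng(1).random((50, 120))
        C_est, S_est = mcr_als(D, 2)
        print(np.linalg.norm(D - C_est @ S_est.T) / np.linalg.norm(D))  # reconstruction error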

  18. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW accessible data sets, so you may run Rweb on your own data. Rweb can provide a four window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.

  19. Thermal-Signature-Based Sleep Analysis Sensor

    Directory of Open Access Journals (Sweden)

    Ali Seba

    2017-10-01

    Full Text Available This paper addresses the development of a new technique in the sleep analysis domain. Sleep is defined as a periodic physiological state during which vigilance is suspended and reactivity to external stimulation is diminished. We sleep on average between six and nine hours per night, and our sleep is composed of four to six cycles of about 90 min each. Each of these cycles is composed of a succession of several stages of sleep that vary in depth. Analysis of sleep is usually done via polysomnography. This examination consists of recording, among other things, electrical cerebral activity by electroencephalography (EEG), ocular movements by electrooculography (EOG), and chin muscle tone by electromyography (EMG). Recordings are made mostly in a hospital, more specifically in a service for monitoring the pathologies related to sleep. The readings are then interpreted manually by an expert to generate a hypnogram, a curve showing the succession of sleep stages during the night in 30 s epochs. The proposed method is based on the follow-up of the thermal signature, which makes it possible to classify the activity into three classes: “awakening,” “calm sleep,” and “restless sleep”. The contribution of this non-invasive method is part of the screening of sleep disorders, to be validated by a more complete analysis of sleep. The measure provided by this new system, based on temperature monitoring (patient and ambient), aims to be integrated into the tele-medicine platform developed within the framework of the Smart-EEG project by the SYEL–SYstèmes ELectroniques team. Analysis of the data collected during the first surveys carried out with this method showed a correlation between thermal signature and activity during sleep. The advantage of this method lies in its simplicity and the possibility of carrying out measurements of activity during sleep without direct contact with the patient, at home or in hospitals.

  20. Experimental data base for containment thermalhydraulic analysis

    International Nuclear Information System (INIS)

    Cheng, X.; Bazin, P.; Cornet, P.; Hittner, D.; Jackson, J.D.; Lopez Jimenez, J.; Naviglio, A.; Oriolo, F.; Petzold, H.

    2001-01-01

    This paper describes the joint research project DABASCO, which is supported by the European Community under a cost-shared contract with the participation of nine European institutions. The main objective of the project is to provide a generic experimental data base for the development of physical models and correlations for containment thermalhydraulic analysis. The project consists of seven separate-effects experimental programs which deal with new innovative conceptual features, e.g. passive decay heat removal and spray systems. The results of the various stages of the test programs will be assessed by industrial partners in relation to their applicability to reactor conditions.

  1. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  2. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, where the dominant motion tendency can be seen. The experimental results of a comparison of the new motion estimation methods with other well-known block matching methods (full search, 2D-log, and methods based on the conventional cross correlation (CC) function or the phase correlation (PC) function) for the application of crowd motion estimation are also presented.

  3. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
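
    The sketch below implements the natural visibility criterion that maps a series (or a series segment) to a graph, which is the building block of the method described above; the O(n²) double loop and the toy random-walk series are simplifications, and the segment-wise "network of networks" stage is not reproduced.

        import numpy as np

        def visibility_graph(x):
            """Natural visibility graph: nodes are time points; (i, j) is an edge
            if every intermediate sample lies strictly below the line joining them."""
            n = len(x)
            edges = set()
            for i in range(n - 1):
                for j in range(i + 1, n):
                    if all(x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                           for k in range(i + 1, j)):
                        edges.add((i, j))
            return edges

        # Toy series resembling a random walk, mapped to its visibility graph.
        rng = np.random.default_rng(0)
        series = np.cumsum(rng.standard_normal(100))
        g = visibility_graph(series)
        degrees = np.bincount([i for e in g for i in e], minlength=len(series))
        print(len(g), degrees.mean())   # number of edges and mean node degree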

  4. DTI Analysis of Presbycusis Using Voxel-Based Analysis.

    Science.gov (United States)

    Ma, W; Li, M; Gao, F; Zhang, X; Shi, L; Yu, L; Zhao, B; Chen, W; Wang, G; Wang, X

    2016-07-14

    Presbycusis is the most common sensory deficit in the aging population. A recent study reported using a DTI-based tractography technique to identify a lack of integrity in a portion of the auditory pathway in patients with presbycusis. The aim of our study was to investigate the white matter pathology of patients with presbycusis by using a voxel-based analysis that is highly sensitive to local intensity changes in DTI data. Fifteen patients with presbycusis and 14 age- and sex-matched healthy controls were scanned on a 3T scanner. Fractional anisotropy, mean diffusivity, axial diffusivity, and radial diffusivity were obtained from the DTI data. Intergroup statistics were implemented on these measurements, which were transformed to Montreal Neurological Institute coordinates by using a nonrigid image registration method called large deformation diffeomorphic metric mapping. Increased axial diffusivity, radial diffusivity, and mean diffusivity and decreased fractional anisotropy were found near the right-side hearing-related areas in patients with presbycusis. Increased radial diffusivity and mean diffusivity were also found near a language-related area (Broca area) in patients with presbycusis. Our findings could be important for exploring reliable imaging evidence of presbycusis and could complement an ROI-based approach. © 2016 American Society of Neuroradiology.

  5. Curvelet based offline analysis of SEM images.

    Directory of Open Access Journals (Sweden)

    Syed Hamad Shirazi

    Full Text Available Manual offline analysis of a scanning electron microscopy (SEM) image is a time-consuming process and requires continuous human intervention and efforts. This paper presents an image processing based method for automated offline analyses of SEM images. To this end, our strategy relies on a two-stage process, viz. texture analysis and quantification. The method involves a preprocessing step, aimed at the noise removal, in order to avoid false edges. For texture analysis, the proposed method employs a state-of-the-art Curvelet transform followed by segmentation through a combination of entropy filtering, thresholding and mathematical morphology (MM). The quantification is carried out by the application of a box-counting algorithm, for fractal dimension (FD) calculations, with the ultimate goal of measuring the parameters, like surface area and perimeter. The perimeter is estimated indirectly by counting the boundary boxes of the filled shapes. The proposed method, when applied to a representative set of SEM images, not only showed better results in image segmentation but also exhibited a good accuracy in the calculation of surface area and perimeter. The proposed method outperforms the well-known Watershed segmentation algorithm.
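
    A minimal Python sketch of the quantification stage only (box counting for the fractal dimension, with boundary boxes as a perimeter proxy) is given below; the synthetic disc stands in for the output of the Curvelet/entropy/morphology segmentation, which is not reproduced here.

        import numpy as np

        def box_count(mask, sizes):
            """Number of occupied boxes of each size for a 2D binary mask."""
            counts = []
            for s in sizes:
                h, w = (mask.shape[0] // s) * s, (mask.shape[1] // s) * s
                blocks = mask[:h, :w].reshape(h // s, s, w // s, s)
                counts.append(np.count_nonzero(blocks.any(axis=(1, 3))))
            return np.asarray(counts)

        def fractal_dimension(mask, sizes=(2, 4, 8, 16, 32)):
            """Slope of log N(s) versus log(1/s) gives the box-counting dimension."""
            sizes = np.asarray(sizes)
            slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(box_count(mask, sizes)), 1)
            return slope

        # Toy "segmented particle": a filled disc on a 256 x 256 grid.
        yy, xx = np.mgrid[:256, :256]
        disc = (xx - 128) ** 2 + (yy - 128) ** 2 < 60 ** 2
        boundary = disc & ~np.roll(disc, 1, axis=0)     # crude boundary boxes
        print(round(fractal_dimension(disc), 2), np.count_nonzero(boundary))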

  6. Cluster-based exposure variation analysis

    Science.gov (United States)

    2013-01-01

    Background Static posture, repetitive movements and lack of physical variation are known risk factors for work-related musculoskeletal disorders, and thus need to be properly assessed in occupational studies. The aims of this study were (i) to investigate the effectiveness of a conventional exposure variation analysis (EVA) in discriminating exposure time lines and (ii) to compare it with a new cluster-based method for analysis of exposure variation. Methods For this purpose, we simulated a repeated cyclic exposure varying within each cycle between “low” and “high” exposure levels in a “near” or “far” range, and with “low” or “high” velocities (exposure change rates). The duration of each cycle was also manipulated by selecting a “small” or “large” standard deviation of the cycle time. These parameters reflected three dimensions of exposure variation, i.e. range, frequency and temporal similarity. Each simulation trace included two realizations of 100 concatenated cycles with either low (ρ = 0.1), medium (ρ = 0.5) or high (ρ = 0.9) correlation between the realizations. These traces were analyzed by conventional EVA, and a novel cluster-based EVA (C-EVA). Principal component analysis (PCA) was applied on the marginal distributions of 1) the EVA of each of the realizations (univariate approach), 2) a combination of the EVA of both realizations (multivariate approach) and 3) C-EVA. The least number of principal components describing more than 90% of variability in each case was selected and the projection of marginal distributions along the selected principal component was calculated. A linear classifier was then applied to these projections to discriminate between the simulated exposure patterns, and the accuracy of classified realizations was determined. Results C-EVA classified exposures more correctly than univariate and multivariate EVA approaches; classification accuracy was 49%, 47% and 52% for EVA (univariate

  7. Voxel-Based LIDAR Analysis and Applications

    Science.gov (United States)

    Hagstrom, Shea T.

    One of the greatest recent changes in the field of remote sensing is the addition of high-quality Light Detection and Ranging (LIDAR) instruments. In particular, the past few decades have been greatly beneficial to these systems because of increases in data collection speed and accuracy, as well as a reduction in the costs of components. These improvements allow modern airborne instruments to resolve sub-meter details, making them ideal for a wide variety of applications. Because LIDAR uses active illumination to capture 3D information, its output is fundamentally different from other modalities. Despite this difference, LIDAR datasets are often processed using methods appropriate for 2D images and that do not take advantage of its primary virtue of 3-dimensional data. It is this problem we explore by using volumetric voxel modeling. Voxel-based analysis has been used in many applications, especially medical imaging, but rarely in traditional remote sensing. In part this is because the memory requirements are substantial when handling large areas, but with modern computing and storage this is no longer a significant impediment. Our reason for using voxels to model scenes from LIDAR data is that there are several advantages over standard triangle-based models, including better handling of overlapping surfaces and complex shapes. We show how incorporating system position information from early in the LIDAR point cloud generation process allows radiometrically-correct transmission and other novel voxel properties to be recovered. This voxelization technique is validated on simulated data using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) software, a first-principles based ray-tracer developed at the Rochester Institute of Technology. Voxel-based modeling of LIDAR can be useful on its own, but we believe its primary advantage is when applied to problems where simpler surface-based 3D models conflict with the requirement of realistic geometry. To
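
    A minimal voxelization sketch is shown below: an irregular point cloud is binned onto a regular 3D grid, with the per-voxel return count as the stored property; the random "ground plus box" cloud and the 1 m voxel size are assumptions, and the radiometric and transmission properties discussed in the text are not modelled.

        import numpy as np

        def voxelize(points, voxel_size):
            """Bin an (N, 3) point cloud into a dense per-voxel count grid."""
            origin = points.min(axis=0)
            idx = np.floor((points - origin) / voxel_size).astype(int)
            grid = np.zeros(idx.max(axis=0) + 1, dtype=np.int32)
            np.add.at(grid, tuple(idx.T), 1)     # number of returns in each voxel
            return grid, origin

        # Toy cloud: a noisy planar "ground" with a box-shaped object on top.
        rng = np.random.default_rng(0)
        ground = np.c_[rng.uniform(0, 50, 5000), rng.uniform(0, 50, 5000),
                       rng.normal(0, 0.1, 5000)]
        box = np.c_[rng.uniform(20, 25, 1000), rng.uniform(20, 25, 1000),
                    rng.uniform(0, 3, 1000)]
        grid, origin = voxelize(np.vstack([ground, box]), voxel_size=1.0)
        print(grid.shape, np.count_nonzero(grid), grid.max())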

  8. Joint multifractal analysis based on wavelet leaders

    Science.gov (United States)

    Jiang, Zhi-Qiang; Yang, Yan-Hong; Wang, Gang-Jin; Zhou, Wei-Xing

    2017-12-01

    Mutually interacting components form complex systems and these components usually have long-range cross-correlated outputs. Using wavelet leaders, we propose a method for characterizing the joint multifractal nature of these long-range cross correlations; we call this method joint multifractal analysis based on wavelet leaders (MF-X-WL). We test the validity of the MF-X-WL method by performing extensive numerical experiments on dual binomial measures with multifractal cross correlations and bivariate fractional Brownian motions (bFBMs) with monofractal cross correlations. Both experiments indicate that MF-X-WL is capable of detecting cross correlations in synthetic data with acceptable estimating errors. We also apply the MF-X-WL method to pairs of series from financial markets (returns and volatilities) and online worlds (online numbers of different genders and different societies) and determine intriguing joint multifractal behavior.

  9. Bismuth-based electrochemical stripping analysis

    Science.gov (United States)

    Wang, Joseph

    2004-01-27

    Method and apparatus for trace metal detection and analysis using bismuth-coated electrodes and electrochemical stripping analysis. Both anodic stripping voltammetry and adsorptive stripping analysis may be employed.

  10. Operating cost analysis of anaesthesia: Activity based costing (ABC analysis

    Directory of Open Access Journals (Sweden)

    Majstorović Branislava M.

    2011-01-01

    Full Text Available Introduction. Costs of anaesthesiology represent defined measures for determining a precise profile of the expenditure of surgical treatment, which is important for the planning of healthcare activities, prices and budget. Objective. In order to determine the actual value of anaesthesiological services, we started with an activity based costing (ABC) analysis. Methods. Retrospectively, in 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supplying materials and other: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure, “each cost object (service or unit)”, of the Republican Health-care Insurance. The summary data of the Departments of Anaesthesia were documented in the database of the Clinical Centre of Serbia. The numerical data were processed and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Health-care Insurance. Results. Direct costs showed that 40% of costs were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Healthcare Insurance. Conclusion. During surgery, the costs of anaesthesia increase the surgical treatment cost of patients by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. The fixed elements of direct costs provide the possibility of rationalization of resources in anaesthesia.

  11. [Operating cost analysis of anaesthesia: activity based costing (ABC analysis)].

    Science.gov (United States)

    Majstorović, Branislava M; Kastratović, Dragana A; Vučović, Dragan S; Milaković, Branko D; Miličić, Biljana R

    2011-01-01

    Cost of anaesthesiology represents defined measures for determining a precise profile of the expenditure of surgical treatment, which is important for the planning of healthcare activities, prices and budget. In order to determine the actual value of anaesthesiological services, we started with an activity based costing (ABC) analysis. Retrospectively, in 2005 and 2006, we estimated the direct costs of anaesthesiological services (salaries, drugs, supplying materials and other: analyses and equipment) of the Institute of Anaesthesia and Resuscitation of the Clinical Centre of Serbia. The group included all anesthetized patients of both sexes and all ages. We compared direct costs with the direct expenditure, "each cost object (service or unit)", of the Republican Healthcare Insurance. The summary data of the Departments of Anaesthesia were documented in the database of the Clinical Centre of Serbia. The numerical data were processed and analyzed with Microsoft Office Excel 2003 and SPSS for Windows. Using a linear model, we compared the direct costs with the unit costs of anaesthesiological services from the Costs List of the Republican Healthcare Insurance. Direct costs showed that 40% of costs were spent on salaries, 32% on drugs and supplies, and 28% on other costs, such as analyses and equipment. The direct costs of anaesthesiological services showed a linear correlation with the unit costs of the Republican Healthcare Insurance. During surgery, the costs of anaesthesia increase the surgical treatment cost of patients by about 10%. Regarding the actual costs of drugs and supplies, we do not see any possibility of cost reduction. The fixed elements of direct costs provide the possibility of rationalization of resources in anaesthesia.

  12. OHBM 2017: Practical intensity based meta-analysis

    OpenAIRE

    Maumet, Camille

    2017-01-01

    "Practical intensity-based meta-analysis" slides from my talk in the OHBM 2017 educational talk on Neuroimaging meta-analysis.http://www.humanbrainmapping.org/files/2017/ED Courses/Neuroimaging Meta-Analysis.pdf

  13. Communication Base Station Log Analysis Based on Hierarchical Clustering

    Directory of Open Access Journals (Sweden)

    Zhang Shao-Hua

    2017-01-01

    Full Text Available Communication base stations generate massive amounts of data every day, and these base station logs hold important value for mining business circles. This paper uses data mining technology and a hierarchical clustering algorithm to group base stations by business-circle scope, based on the data recorded by these base stations. By analysing the data of different business circles through feature extraction and comparing the characteristics of different business-circle categories, operators can choose a suitable area for commercial marketing.
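
    A minimal sketch of the grouping step with SciPy's hierarchical clustering is given below; the two synthetic per-station features (mean daily traffic and mean active users) and the Ward linkage with three clusters are assumptions standing in for the log-derived business-circle features used in the paper.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # Hypothetical features extracted from base station logs:
        # column 0 = mean daily traffic (GB), column 1 = mean active users.
        rng = np.random.default_rng(0)
        stations = np.vstack([
            rng.normal([50, 2000], [5, 200], size=(20, 2)),   # busy commercial area
            rng.normal([10, 300], [2, 50], size=(20, 2)),     # residential area
            rng.normal([25, 900], [4, 100], size=(20, 2)),    # mixed area
        ])

        # Standardise, build the dendrogram with Ward linkage, cut into 3 clusters.
        z = (stations - stations.mean(axis=0)) / stations.std(axis=0)
        labels = fcluster(linkage(z, method="ward"), t=3, criterion="maxclust")
        for c in np.unique(labels):
            print(f"business circle {c}: {np.sum(labels == c)} stations, "
                  f"mean traffic {stations[labels == c, 0].mean():.1f} GB/day")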

  14. Scope-Based Method Cache Analysis

    DEFF Research Database (Denmark)

    Huber, Benedikt; Hepp, Stefan; Schoeberl, Martin

    2014-01-01

    The quest for time-predictable systems has led to the exploration of new hardware architectures that simplify analysis and reasoning in the temporal domain, while still providing competitive performance. For the instruction memory, the method cache is a conceptually attractive solution, as it requests memory transfers at well-defined instructions only. In this article, we present a new cache analysis framework that generalizes and improves work on cache persistence analysis. The analysis demonstrates that a global view on the cache behavior permits the precise analyses of caches which are hard…

  15. Canonical analysis based on mutual information

    DEFF Research Database (Denmark)

    Nielsen, Allan Aasbjerg; Vestergaard, Jacob Schack

    2015-01-01

    …combinations with the information theoretical measure mutual information (MI). We term this type of analysis canonical information analysis (CIA). MI allows for the actual joint distribution of the variables involved and not just second order statistics. While CCA is ideal for Gaussian data, CIA facilitates...

  16. Using Willie's Acid-Base Box for Blood Gas Analysis

    Science.gov (United States)

    Dietz, John R.

    2011-01-01

    In this article, the author describes a method developed by Dr. William T. Lipscomb for teaching blood gas analysis of acid-base status and provides three examples using Willie's acid-base box. Willie's acid-base box is constructed using three of the parameters of standard arterial blood gas analysis: (1) pH; (2) bicarbonate; and (3) CO₂…
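
    For readers who want a concrete starting point, the sketch below classifies acid-base status from the same three parameters using generic textbook thresholds; it is not Dr. Lipscomb's box method itself, and the cut-off values are commonly quoted approximations rather than values taken from the article.

        def classify_acid_base(ph, hco3, pco2):
            """Rough primary-disorder classification from pH, bicarbonate (mmol/L)
            and pCO2 (mmHg), using generic rule-of-thumb thresholds."""
            if ph < 7.35:
                if pco2 > 45 and hco3 >= 22:
                    return "respiratory acidosis"
                return "metabolic acidosis"
            if ph > 7.45:
                if pco2 < 35 and hco3 <= 26:
                    return "respiratory alkalosis"
                return "metabolic alkalosis"
            return "pH in normal range (possible compensation or mixed disorder)"

        # Example: low pH with low bicarbonate and a low-normal pCO2.
        print(classify_acid_base(ph=7.28, hco3=16, pco2=32))   # -> metabolic acidosis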

  17. Dynamic Chest Image Analysis: Evaluation of Model-Based Pulmonary Perfusion Analysis With Pyramid Images

    National Research Council Canada - National Science Library

    Liang, Jianming

    2001-01-01

    Dynamic Chest Image Analysis aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence...

  18. Web Based Distributed Coastal Image Analysis System, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project develops a Web-based distributed image analysis system that processes Moderate Resolution Imaging Spectroradiometer (MODIS) data to provide decision...

  19. Description-based and experience-based decisions: individual analysis

    Directory of Open Access Journals (Sweden)

    Andrey Kudryavtsev

    2012-05-01

    Full Text Available We analyze behavior in two basic classes of decision tasks: description-based and experience-based. In particular, we compare the prediction power of a number of decision learning models in both kinds of tasks. Unlike most previous studies, we focus on individual, rather than aggregate, behavioral characteristics. We carry out an experiment involving a battery of both description- and experience-based choices between two mixed binary prospects made by each of the participants, and employ a number of formal models for explaining and predicting participants' choices: Prospect theory (PT) (Kahneman and Tversky, 1979); the Expectancy-Valence model (EVL) (Busemeyer and Stout, 2002); and three combinations of these well-established models. We document that the PT and the EVL models are best for predicting people's decisions in description- and experience-based tasks, respectively, which is not surprising as these two models are designed specially for these kinds of tasks. Furthermore, we find that models involving linear weighting of gains and losses perform better in both kinds of tasks, from the point of view of generalizability and individual parameter consistency. We therefore, conclude that, overall, when both prospects are mixed, the assumption of diminishing sensitivity does not improve models' prediction power for individual decision-makers. Finally, for some of the models' parameters, we document consistency at the individual level between description- and experience-based tasks.

  20. M-X Basing Area Analysis Process

    National Research Council Canada - National Science Library

    1980-01-01

    ... or unreasonable basing areas for the M-X missile in multiple protective structures (MPS). The process began in January 1977 with criteria involving geotechnical, cultural, safety and other concerns...

  1. Environmentally based Cost-Benefit Analysis

    International Nuclear Information System (INIS)

    Magnell, M.

    1993-11-01

    The fundamentals of the basic elements of a new comprehensive economic assessment, MILA, developed in Sweden with inspiration from the Total Cost Assessment model, are presented. The core of the MILA approach is an expanded cost and benefit inventory. But MILA also includes a complementary addition of an internal waste stream analysis, a tool for evaluation of environmental conflicts in monetary terms, an extended time horizon, and direct allocation of costs and revenues to products and processes. However, MILA does not ensure profitability for environmentally sound projects. Essentially, MILA is an approach for refining investment and profitability analysis of a project, investment or product. 109 refs., 38 figs

  2. A Model-Free Diagnosis Approach for Intake Leakage Detection and Characterization in Diesel Engines

    Directory of Open Access Journals (Sweden)

    Ghaleb Hoblos

    2015-07-01

    Full Text Available Feature selection is an essential step for data classification used in fault detection and diagnosis processes. In this work, a new approach is proposed, which combines a feature selection algorithm and a neural network tool for leak detection and characterization tasks in diesel engine air paths. The chi-square classifier is used as the feature selection algorithm, and a neural network based on Levenberg-Marquardt training is used for system behavior modeling. The obtained neural network is used for leak detection and characterization. The model is learned and validated using data generated by xMOD. This tool is used again for testing. The effectiveness of the proposed approach is illustrated in simulation when the system operates at low speed/load and the considered leak affecting the air path is very small.
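
    A minimal sketch of the two-stage pipeline (chi-square feature ranking followed by a neural network) is given below using scikit-learn; note that scikit-learn's MLPClassifier trains with LBFGS or Adam rather than Levenberg-Marquardt, and the synthetic data stand in for the xMOD-generated signals, so both are stated assumptions rather than a reproduction of the paper's setup.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, chi2
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import MinMaxScaler

        # Synthetic stand-in for air-path sensor features labelled leak / no leak.
        X, y = make_classification(n_samples=600, n_features=20, n_informative=6,
                                   random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # chi2 requires non-negative inputs, hence the MinMax scaling step.
        model = make_pipeline(
            MinMaxScaler(),
            SelectKBest(chi2, k=8),                      # feature selection stage
            MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0),
        )
        model.fit(X_tr, y_tr)
        print("held-out leak-detection accuracy:", model.score(X_te, y_te))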

  3. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    It is shown that knowledge of epochs can be used for analysis of speech for glottal activity detection, extraction … The performance of the proposed GAD method was evaluated using the detection error trade-off (DET) curves … the oral closure is released more or less impulsively as the vocal tract moves towards a configuration …

  4. ANALYSIS OF A DATA BASE OF CHALCOPYRITE ...

    African Journals Online (AJOL)

    Université Aboubekr Belkaid de Tlemcen, Faculté des sciences, Département de physique, Unité de Recherche sur les Matériaux Energies Renouvelables, B.P 119,. 13000 Tlemcen Algerie. Received: 15 November 2011 / Accepted: 27 December 2011 / Published online: 31 December 2011. ABSTRACT. The analysis of ...

  5. Competence-based employability: A Rasch analysis

    NARCIS (Netherlands)

    Fröhlich, D.; Liu, M.; van der Heijden, B.

    2015-01-01

    Due to trends such as increasingly inter-organizational careers, the concept of employability is high on the agenda of employees, employers, and policy makers. To improve our understanding of this concept, we need to understand thoroughly its measurement. We performed a Rasch analysis of Van der

  6. Epoch-based analysis of speech signals

    Indian Academy of Sciences (India)

    Epoch sequence is useful to manipulate prosody in speech synthesis applications. Accurate estimation of epochs helps in characterizing voice quality features. Epoch extraction also helps in speech enhancement and multispeaker separation. In this tutorial article, the importance of epochs for speech analysis is discussed, ...

  7. Model Based Analysis of Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Han, Tingting; Kammueller, Florian

    2016-01-01

    In order to detect malicious insider attacks it is important to model and analyse infrastructures and policies of organisations and the insiders acting within them. We extend formal approaches that allow modelling such scenarios by quantitative aspects to enable a precise analysis of security...

  8. Prostate cancer detection from model-free T1-weighted time series and diffusion imaging

    Science.gov (United States)

    Haq, Nandinee F.; Kozlowski, Piotr; Jones, Edward C.; Chang, Silvia D.; Goldenberg, S. Larry; Moradi, Mehdi

    2015-03-01

    The combination of Dynamic Contrast Enhanced (DCE) images with diffusion MRI has shown great potential in prostate cancer detection. The parameterization of DCE images to generate cancer markers is traditionally performed based on pharmacokinetic modeling. However, pharmacokinetic models make simplistic assumptions about the tissue perfusion process, require the knowledge of contrast agent concentration in a major artery, and the modeling process is sensitive to noise and fitting instabilities. We address this issue by extracting features directly from the DCE T1-weighted time course without modeling. In this work, we employed a set of data-driven features generated by mapping the DCE T1 time course to its principal component space, along with diffusion MRI features to detect prostate cancer. The optimal set of DCE features is extracted with sparse regularized regression through a Least Absolute Shrinkage and Selection Operator (LASSO) model. We show that when our proposed features are used within the multiparametric MRI protocol to replace the pharmacokinetic parameters, the area under ROC curve is 0.91 for peripheral zone classification and 0.87 for whole gland classification. We were able to correctly classify 32 out of 35 peripheral tumor areas identified in the data when the proposed features were used with support vector machine classification. The proposed feature set was used to generate cancer likelihood maps for the prostate gland.
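
    A minimal sketch of the model-free feature path described above is given below: project each voxel's T1-weighted time course onto its principal components, use a LASSO fit against the labels to select components, and classify with a support vector machine; the synthetic enhancement curves, the number of components, and the regularisation strength are all assumptions.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import Lasso
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        # Synthetic DCE T1-weighted time courses (one row per voxel, 40 time points);
        # "cancer" voxels are given a faster, stronger enhancement curve.
        rng = np.random.default_rng(0)
        t = np.linspace(0, 5, 40)
        benign = 1 - np.exp(-0.5 * t)
        cancer = 1.4 * (1 - np.exp(-1.5 * t))
        X = np.vstack([benign + 0.1 * rng.standard_normal((100, 40)),
                       cancer + 0.1 * rng.standard_normal((100, 40))])
        y = np.r_[np.zeros(100), np.ones(100)]

        scores = PCA(n_components=10).fit_transform(X)     # model-free DCE features
        kept = np.flatnonzero(Lasso(alpha=0.01).fit(scores, y).coef_)
        kept = kept if kept.size else np.arange(scores.shape[1])   # fallback
        acc = cross_val_score(SVC(), scores[:, kept], y, cv=5).mean()
        print("kept components:", kept, " cross-validated accuracy:", round(acc, 3))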

  9. Model-Free Machine Learning in Biomedicine: Feasibility Study in Type 1 Diabetes.

    Directory of Open Access Journals (Sweden)

    Elena Daskalaki

    Full Text Available Although reinforcement learning (RL) is suitable for highly uncertain systems, the applicability of this class of algorithms to medical treatment may be limited by the patient variability which dictates individualised tuning for their usually multiple algorithmic parameters. This study explores the feasibility of RL in the framework of artificial pancreas development for type 1 diabetes (T1D). In this approach, an Actor-Critic (AC) learning algorithm is designed and developed for the optimisation of insulin infusion for personalised glucose regulation. AC optimises the daily basal insulin rate and insulin:carbohydrate ratio for each patient, on the basis of his/her measured glucose profile. Automatic, personalised tuning of AC is based on the estimation of information transfer (IT) from insulin to glucose signals. Insulin-to-glucose IT is linked to patient-specific characteristics related to total daily insulin needs and insulin sensitivity (SI). The AC algorithm is evaluated using an FDA-accepted T1D simulator on a large patient database under a complex meal protocol, meal uncertainty and diurnal SI variation. The results showed that 95.66% of time was spent in normoglycaemia in the presence of meal uncertainty and 93.02% when meal uncertainty and SI variation were simultaneously considered. The time spent in hypoglycaemia was 0.27% in both cases. The novel tuning method reduced the risk of severe hypoglycaemia, especially in patients with low SI.

  10. Design of a completely model free adaptive control in the presence of parametric, non-parametric uncertainties and random control signal delay.

    Science.gov (United States)

    Tutsoy, Onder; Barkana, Duygun Erol; Tugal, Harun

    2018-03-14

    In this paper, an adaptive controller is developed for discrete-time linear systems that takes into account parametric uncertainty, internal-external non-parametric random uncertainties, and time-varying control signal delay. Additionally, the proposed adaptive control is designed in such a way that it is utterly model free. Even though these properties are studied separately in the literature, they are not taken into account all together in the adaptive control literature. The Q-function is used to estimate the long-term performance of the proposed adaptive controller. The control policy is generated based on the long-term predicted value, and this policy searches for an optimal stabilizing control signal for uncertain and unstable systems. The derived control law does not require an initial stabilizing control assumption, unlike the ones in the recent literature. Learning error, control signal convergence, the minimized Q-function, and instantaneous reward are analyzed to demonstrate the stability and effectiveness of the proposed adaptive controller in a simulation environment. Finally, key insights on the parameter convergence of the learning and control signals are provided. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Movement Pattern Analysis Based on Sequence Signatures

    Directory of Open Access Journals (Sweden)

    Seyed Hossein Chavoshi

    2015-09-01

    Full Text Available Increased affordability and deployment of advanced tracking technologies have led researchers from various domains to analyze the resulting spatio-temporal movement data sets for the purpose of knowledge discovery. Two different approaches can be considered in the analysis of moving objects: quantitative analysis and qualitative analysis. This research focuses on the latter and uses the qualitative trajectory calculus (QTC), a type of calculus that represents qualitative data on moving point objects (MPOs), and establishes a framework to analyze the relative movement of multiple MPOs. A visualization technique called sequence signature (SESI) is used, which enables mapping QTC patterns in a 2D indexed rasterized space in order to evaluate the similarity of relative movement patterns of multiple MPOs. The applicability of the proposed methodology is illustrated by means of two practical examples of interacting MPOs: cars on a highway and body parts of a samba dancer. The results show that the proposed method can be effectively used to analyze interactions of multiple MPOs in different domains.

  12. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    This paper presents a goal based methodology for HAZOP studies in which a functional model of the plant is used to assist in a functional decomposition of the plant starting from the purpose of the plant and continuing down to the function of a single node, e.g. a pipe section. This approach lead...

  13. Automated Video-Based Traffic Count Analysis.

    Science.gov (United States)

    2016-01-01

    The goal of this effort has been to develop techniques that could be applied to the : detection and tracking of vehicles in overhead footage of intersections. To that end we : have developed and published techniques for vehicle tracking based on dete...

  14. Model-free aftershock forecasts constructed from similar sequences in the past

    Science.gov (United States)

    van der Elst, N.; Page, M. T.

    2017-12-01

    The basic premise behind aftershock forecasting is that sequences in the future will be similar to those in the past. Forecast models typically use empirically tuned parametric distributions to approximate past sequences, and project those distributions into the future to make a forecast. While parametric models do a good job of describing average outcomes, they are not explicitly designed to capture the full range of variability between sequences, and can suffer from over-tuning of the parameters. In particular, parametric forecasts may produce a high rate of "surprises" - sequences that land outside the forecast range. Here we present a non-parametric forecast method that cuts out the parametric "middleman" between training data and forecast. The method is based on finding past sequences that are similar to the target sequence, and evaluating their outcomes. We quantify similarity as the Poisson probability that the observed event count in a past sequence reflects the same underlying intensity as the observed event count in the target sequence. Event counts are defined in terms of differential magnitude relative to the mainshock. The forecast is then constructed from the distribution of past sequences outcomes, weighted by their similarity. We compare the similarity forecast with the Reasenberg and Jones (RJ95) method, for a set of 2807 global aftershock sequences of M≥6 mainshocks. We implement a sequence-specific RJ95 forecast using a global average prior and Bayesian updating, but do not propagate epistemic uncertainty. The RJ95 forecast is somewhat more precise than the similarity forecast: 90% of observed sequences fall within a factor of two of the median RJ95 forecast value, whereas the fraction is 85% for the similarity forecast. However, the surprise rate is much higher for the RJ95 forecast; 10% of observed sequences fall in the upper 2.5% of the (Poissonian) forecast range. The surprise rate is less than 3% for the similarity forecast. The similarity
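
    To make the weighting idea concrete, the sketch below uses one simple choice for the "same underlying intensity" probability, obtained by marginalising a common Poisson rate under a flat prior, and forecasts from the similarity-weighted outcomes of a synthetic catalogue; both the marginalisation and the catalogue are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from scipy.special import gammaln

        def log_similarity(n_target, n_past):
            """Log-probability that two Poisson counts share one (unknown) intensity,
            marginalised under a flat prior on that intensity."""
            n = n_target + n_past
            return (gammaln(n + 1) - gammaln(n_target + 1) - gammaln(n_past + 1)
                    - (n + 1) * np.log(2.0))

        # Hypothetical catalogue: first-day counts (relative to mainshock magnitude)
        # and later outcomes for past sequences, plus the target's first-day count.
        rng = np.random.default_rng(0)
        past_early = rng.poisson(8, size=500)
        past_outcome = rng.poisson(past_early * 0.6)      # later counts tied to early rate
        target_early = 15

        w = np.exp(log_similarity(target_early, past_early))
        w /= w.sum()
        forecast = np.average(past_outcome, weights=w)
        sample = rng.choice(past_outcome, size=2000, p=w)  # weighted outcome distribution
        lo, hi = np.percentile(sample, [2.5, 97.5])
        print(f"forecast later count ~ {forecast:.1f} (95% range {lo:.0f}-{hi:.0f})")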

  15. Analysis of Financial Position Based on the Balance Sheet

    OpenAIRE

    Spineanu-Georgescu Luciana

    2011-01-01

    Analysis of financial position based on the balance sheet is mainly aimed at assessing the extent to which the financial structure chosen by the firm, namely its financial resources, covers the needs reflected in the balance sheet. This is done through what is known as horizontal analysis of the balance sheet, which highlights financial imbalances.

  16. Desiccant-Based Preconditioning Market Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J.

    2001-01-11

    A number of important conclusions can be drawn as a result of this broad, first-phase market evaluation. The more important conclusions include the following: (1) A very significant market opportunity will exist for specialized outdoor air-handling units (SOAHUs) as more construction and renovation projects are designed to incorporate the recommendations made by the ASHRAE 62-1989 standard. Based on this investigation, the total potential market is currently $725,000,000 annually (see Table 6, Sect. 3). Based on the market evaluations completed, it is estimated that approximately $398,000,000 (55%) of this total market could be served by DBC systems if they were made cost-effective through mass production. Approximately $306,000,000 (42%) of the total can be served by a non-regenerated, desiccant-based total recovery approach, based on the information provided by this investigation. Approximately $92,000,000 (13%) can be served by a regenerated desiccant-based cooling approach (see Table 7, Sect. 3). (2) A projection of the market selling price of various desiccant-based SOAHU systems was prepared using prices provided by Trane for central-station, air-handling modules currently manufactured. The wheel-component pricing was added to these components by SEMCO. This resulted in projected pricing for these systems that is significantly less than that currently offered by custom suppliers (see Table 4, Sect. 2). Estimated payback periods for all SOAHU approaches were quite short when compared with conventional over-cooling and reheat systems. Actual paybacks may vary significantly depending on site-specific considerations. (3) In comparing cost vs benefit of each SOAHU approach, it is critical that the total system design be evaluated. For example, the cost premium of a DBC system is very significant when compared to a conventional air handling system, yet the reduced chiller, boiler, cooling tower, and other expense often equals or exceeds this premium, resulting in a

  17. Knowledge-based analysis of phenotypes

    KAUST Repository

    Hoendorf, Robert

    2016-01-27

    Phenotypes are the observable characteristics of an organism, and they are widely recorded in biology and medicine. To facilitate data integration, ontologies that formally describe phenotypes are being developed in several domains. I will describe a formal framework to describe phenotypes. A formalized theory of phenotypes is not only useful for domain analysis, but can also be applied to assist in the diagnosis of rare genetic diseases, and I will show how our results on the ontology of phenotypes is now applied in biomedical research.

  18. Discretization analysis of bifurcation based nonlinear amplifiers

    Science.gov (United States)

    Feldkord, Sven; Reit, Marco; Mathis, Wolfgang

    2017-09-01

    Recently, for modeling biological amplification processes, nonlinear amplifiers based on the supercritical Andronov-Hopf bifurcation have been widely analyzed analytically. For technical realizations, digital systems have become the most relevant systems in signal processing applications. The underlying continuous-time systems are transferred to the discrete-time domain using numerical integration methods. Within this contribution, effects on the qualitative behavior of the Andronov-Hopf bifurcation based systems concerning numerical integration methods are analyzed. It is shown exemplarily that explicit Runge-Kutta methods transform the truncated normalform equation of the Andronov-Hopf bifurcation into the normalform equation of the Neimark-Sacker bifurcation. Dependent on the order of the integration method, higher order terms are added during this transformation. A rescaled normalform equation of the Neimark-Sacker bifurcation is introduced that allows a parametric design of a discrete-time system which corresponds to the rescaled Andronov-Hopf system. This system approximates the characteristics of the rescaled Hopf-type amplifier for a large range of parameters. The natural frequency and the peak amplitude are preserved for every set of parameters. The Neimark-Sacker bifurcation based systems avoid large computational effort that would be caused by applying higher order integration methods to the continuous-time normalform equations.
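
    A small numeric sketch of this effect is given below: the truncated Andronov-Hopf normal form, discretised with a forward Euler step (the simplest explicit Runge-Kutta method), becomes a two-dimensional map whose bifurcation point is shifted and whose fixed point loses stability through a Neimark-Sacker-type transition; the step size and parameter values are illustrative assumptions.

        import numpy as np

        def euler_step(z, mu, omega, dt):
            """One forward-Euler step of dz/dt = (mu + i*omega)*z - |z|^2 * z."""
            return z + dt * ((mu + 1j * omega) * z - np.abs(z) ** 2 * z)

        def settled_amplitude(mu, omega=2 * np.pi, dt=0.01, n=20000):
            z = 0.01 + 0j
            for _ in range(n):
                z = euler_step(z, mu, omega, dt)
            return abs(z)

        # The continuous-time system decays to z = 0 for every mu < 0, but the Euler
        # map's Neimark-Sacker point is shifted to roughly mu ~ -dt*omega^2/2 ~ -0.20,
        # so at mu = -0.1 the discretised system already settles on an invariant circle.
        for mu in (-0.3, -0.1, 0.4):
            print(mu, round(settled_amplitude(mu), 3))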

  19. Discretization analysis of bifurcation based nonlinear amplifiers

    Directory of Open Access Journals (Sweden)

    S. Feldkord

    2017-09-01

    Full Text Available Recently, for modeling biological amplification processes, nonlinear amplifiers based on the supercritical Andronov–Hopf bifurcation have been widely analyzed analytically. For technical realizations, digital systems have become the most relevant systems in signal processing applications. The underlying continuous-time systems are transferred to the discrete-time domain using numerical integration methods. Within this contribution, effects on the qualitative behavior of the Andronov–Hopf bifurcation based systems concerning numerical integration methods are analyzed. It is shown exemplarily that explicit Runge–Kutta methods transform the truncated normalform equation of the Andronov–Hopf bifurcation into the normalform equation of the Neimark–Sacker bifurcation. Dependent on the order of the integration method, higher order terms are added during this transformation. A rescaled normalform equation of the Neimark–Sacker bifurcation is introduced that allows a parametric design of a discrete-time system which corresponds to the rescaled Andronov–Hopf system. This system approximates the characteristics of the rescaled Hopf-type amplifier for a large range of parameters. The natural frequency and the peak amplitude are preserved for every set of parameters. The Neimark–Sacker bifurcation based systems avoid large computational effort that would be caused by applying higher order integration methods to the continuous-time normalform equations.

  20. Pathway-based analysis tools for complex diseases: a review.

    Science.gov (United States)

    Jin, Lv; Zuo, Xiao-Yu; Su, Wei-Yang; Zhao, Xiao-Lei; Yuan, Man-Qiong; Han, Li-Zhen; Zhao, Xiang; Chen, Ye-Da; Rao, Shao-Qi

    2014-10-01

    Genetic studies are traditionally based on single-gene analysis. The use of these analyses can pose tremendous challenges for elucidating the complicated genetic interplays involved in complex human diseases. Modern pathway-based analysis provides a technique which allows a comprehensive understanding of the molecular mechanisms underlying complex diseases. Extensive studies utilizing the methods and applications of pathway-based analysis have significantly advanced our capacity to explore large-scale omics data, which have rapidly accumulated in biomedical fields. This article is a comprehensive review of pathway-based analysis methods, powerful methods with the potential to uncover the biological depths of complex diseases. The general concepts and procedures for pathway-based analysis methods are introduced, and then a comprehensive review of the major approaches for this analysis is presented. In addition, a list of available pathway-based analysis software and databases is provided. Finally, future directions and challenges for the methodological development and applications of pathway-based analysis techniques are discussed. This review will provide a useful guide to dissect complex diseases. Copyright © 2014 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.
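
    As a concrete instance of the simplest class of methods reviewed here (over-representation analysis), the sketch below tests whether a gene list is enriched for a pathway's members with a hypergeometric test; the gene universe size, pathway size, and overlap are made-up placeholder numbers.

        from scipy.stats import hypergeom

        # Hypothetical counts: 20,000 genes in the universe, a pathway with 150
        # members, a 300-gene disease list, 12 of which fall in the pathway.
        M, K, n, k = 20000, 150, 300, 12

        # P(overlap >= k) under random draws without replacement.
        p_value = hypergeom.sf(k - 1, M, K, n)
        print(f"expected overlap {n * K / M:.1f}, observed {k}, p = {p_value:.2e}")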

  1. Physical-Mechanisms Based Reliability Analysis For Emerging Technologies

    Science.gov (United States)

    2017-05-05

    AFRL-AFOSR-VA-TR-2017-0095, "Physical-Mechanisms Based Reliability Analysis for Emerging Technologies", Ron Schrimpf, Vanderbilt University, grant FA9550-11-1-0307. … which reliability models can be built. Thus, it is important to develop more predictive reliability models for advanced technologies, based on physical…

  2. Spatial analysis and modelling based on activities

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2010-01-01

    Full Text Available of the year where visibility on the road near sunrise and sunset would be severely impaired, possibly contributing to accidents. One of the advantages of a visual-based simulation is that non-technical role players can visualise problems and it enables them to work with professional, scientific and technical designers. The vector graphic part of the simulation environment can be conveniently built by means of standard CAD programs such as AutoCAD, AutoDesk Revit, 3D Studio Max or MicroGDS. Although...

  3. Sandia National Laboratories analysis code data base

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, C.W.

    1994-11-01

    Sandia National Laboratories' mission is to solve important problems in the areas of national defense, energy security, environmental integrity, and industrial technology. The Laboratories' strategy for accomplishing this mission is to conduct research to provide an understanding of the important physical phenomena underlying any problem, and then to construct validated computational models of the phenomena which can be used as tools to solve the problem. In the course of implementing this strategy, Sandia's technical staff has produced a wide variety of numerical problem-solving tools which they use regularly in the design, analysis, performance prediction, and optimization of Sandia components, systems and manufacturing processes. This report provides the relevant technical and accessibility data on the numerical codes used at Sandia, including information on the technical competency or capability area that each code addresses, code "ownership" and release status, and references describing the physical models and numerical implementation.

  4. Network Anomaly Detection Based on Wavelet Analysis

    Directory of Open Access Journals (Sweden)

    Ali A. Ghorbani

    2008-11-01

    Full Text Available Signal processing techniques have been applied recently for analyzing and detecting network anomalies due to their potential to find novel or unknown intrusions. In this paper, we propose a new network signal modelling technique for detecting network anomalies, combining the wavelet approximation and system identification theory. In order to characterize network traffic behaviors, we present fifteen features and use them as the input signals in our system. We then evaluate our approach with the 1999 DARPA intrusion detection dataset and conduct a comprehensive analysis of the intrusions in the dataset. Evaluation results show that the approach achieves high detection rates in terms of both attack instances and attack types. Furthermore, we conduct a full day's evaluation in a real large-scale WiFi ISP network, where five attack types are successfully detected from over 30 million flows.
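
    A minimal sketch of the general wavelet idea on a single traffic feature is given below: reconstruct a smooth baseline from the coarse approximation coefficients and flag time bins with unusually large residuals; the single synthetic feature, the db4/level-5 decomposition, and the threshold are assumptions, and the paper's fifteen features and system identification stage are not reproduced.

        import numpy as np
        import pywt

        # Synthetic per-minute flow counts with a diurnal pattern and an injected burst.
        rng = np.random.default_rng(0)
        t = np.arange(1440)
        traffic = 200 + 80 * np.sin(2 * np.pi * t / 1440) + rng.normal(0, 10, t.size)
        traffic[700:720] += 300                      # anomalous burst (e.g. flooding attack)

        # Keep only the coarse approximation to model "normal" behaviour.
        coeffs = pywt.wavedec(traffic, "db4", level=5)
        smooth = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
        baseline = pywt.waverec(smooth, "db4")[: traffic.size]

        residual = traffic - baseline
        sigma = np.median(np.abs(residual)) / 0.6745    # robust noise estimate
        anomalies = np.flatnonzero(np.abs(residual) > 4 * sigma)
        print("flagged minutes:", anomalies[:5], "... total:", anomalies.size)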

  5. Network reliability analysis based on percolation theory

    International Nuclear Information System (INIS)

    Li, Daqing; Zhang, Qiong; Zio, Enrico; Havlin, Shlomo; Kang, Rui

    2015-01-01

    In this paper, we propose a new way of looking at the reliability of a network using percolation theory. In this new view, a network failure can be regarded as a percolation process and the critical threshold of percolation can be used as a network failure criterion linked to the operational settings under control. To demonstrate our approach, we consider both random network models and real networks with different node and/or edge lifetime distributions. We study the network reliability numerically and theoretically and find that it can be evaluated as that of a voting system with a threshold given by percolation theory. We then find that the average lifetime of a random network increases linearly with the average lifetime of its nodes when node lifetimes are uniformly distributed. Furthermore, the average lifetime of the network becomes saturated as system size is increased. Finally, we demonstrate our method on the transmission network system of the IEEE 14 bus. - Highlights: • Based on percolation theory, we address questions of practical interest such as “how many failed nodes/edges will break down the whole network?” • The percolation threshold naturally gives a network failure criterion. • The approach based on percolation theory is suited for calculations of large-scale networks
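    The percolation view can be illustrated with a small simulation: nodes of a random graph are removed in order of randomly drawn failure times, and the network is declared failed when its giant connected component collapses. The sketch below, which assumes exponential node lifetimes, an Erdős–Rényi topology and a half-size failure criterion, is only a toy illustration of the idea, not the paper's method.

        # Toy percolation-style reliability estimate: the network "fails" when
        # its largest connected component drops below half the original size.
        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(1)
        G0 = nx.erdos_renyi_graph(n=200, p=0.03, seed=1)

        def network_lifetime(G0, rng, frac_threshold=0.5):
            lifetimes = {v: rng.exponential(scale=100.0) for v in G0}  # assumed node lifetimes
            G = G0.copy()
            for v in sorted(lifetimes, key=lifetimes.get):      # nodes fail in time order
                G.remove_node(v)
                if len(G) == 0:
                    return lifetimes[v]
                giant = max((len(c) for c in nx.connected_components(G)), default=0)
                if giant < frac_threshold * G0.number_of_nodes():
                    return lifetimes[v]                          # failure time of the network
            return max(lifetimes.values())

        samples = [network_lifetime(G0, rng) for _ in range(50)]
        print(f"estimated mean network lifetime: {np.mean(samples):.1f}")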

  6. The Route Analysis Based On Flight Plan

    Science.gov (United States)

    Feriyanto, Nur; Saleh, Chairul; Fauzi, Achmad; Rachman Dzakiyullah, Nur; Riza Iwaputra, Kahfi

    2016-02-01

    Economic development affects the use of air transportation, since business processes have increased in every aspect. Many people these days prefer using airplanes because they can save time and money. This situation also affects flight routes, and many airlines offer new routes to deal with competition. Managing flight routes is one of the problems that must be faced in order to find efficient and effective routes. This paper investigates the best routes based on flight performance by determining the amount of block fuel for the Jakarta-Denpasar flight route. Moreover, this work compares two kinds of aircraft and tracks by calculating flight distance, flight time and block fuel. The result shows that Jakarta-Denpasar on Track II has effective and efficient block fuel consumption when flown by an Airbus 320-200 aircraft. This study can contribute to practice in making effective decisions, especially by helping a company's executive management select the appropriate aircraft and track in the flight plan based on block fuel consumption for business operations.

  7. Model-Based Analysis of Hand Radiographs

    Science.gov (United States)

    Levitt, Tod S.; Hedgcock, Marcus W.

    1989-05-01

    As a step toward computer-assisted imagery interpretation, we are developing algorithms for computed radiography that allow a computer to recognize specific bones and joints, and to identify variations from normal in size, shape and density. In this paper we report on our approach to model-based computer recognition of hands in radiographs. First, image processing generates hypotheses of the imaged bones. Multiple hypotheses of the size and orientation of the imaged anatomy are matched against stored 3D models of the relevant bones, obtained from statistically valid population studies. Probabilities of the hypotheses are accrued using Bayesian inference techniques whose evaluation is guided by the structure of the hand model and the observed image-derived evidence such as anti-parallel edges, local contrast, etc. High probability matches between the hand model and the image data can cue additional image processing-based search for bones, joints and soft tissue to confirm hypotheses of the location of the imaged hand. At this point multiple disease detection techniques, automated bone age identification, etc. can be employed.

  8. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new method of product simulation is described that also employs a real SAR input image. This can be denoted as 'image-based simulation'. Different methods to perform this SAR prediction are presented and advantages and disadvantages discussed. Ascending and descending orbit images from NASA's SIR-B experiment were used for verification of the concept: input images from ascending orbits were converted into images from a descending orbit; the results are compared to the available real imagery to verify that the prediction technique produces meaningful image data.

  9. Network analysis based on large deviation statistics

    Science.gov (United States)

    Miyazaki, Syuji

    2007-07-01

    A chaotic piecewise linear map whose statistical properties are identical to those of a random walk on directed graphs such as the world wide web (WWW) is constructed, and the dynamic quantity is analyzed in the framework of large deviation statistics. Gibbs measures include the weight factor appearing in the weighted average of the dynamic quantity, which can also quantitatively measure the importance of web sites. Currently used levels of importance in the commercial search engines are independent of search terms, which correspond to the stationary visiting frequency of each node obtained from a random walk on the network or equivalent chaotic dynamics. Levels of importance based on the Gibbs measure depend on each search term which is specified by the searcher. The topological conjugate transformation between one dynamical system with a Gibbs measure and another dynamical system whose standard invariant probability measure is identical to the Gibbs measure is also discussed.
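    The contrast between a stationary, term-independent ranking and a term-dependent, Gibbs-measure-like weighting can be sketched with a small power iteration: the same link structure is reweighted by per-page relevance to a search term before the stationary distribution is computed. The weighting scheme and the tiny graph below are assumptions for illustration, not the paper's construction.

        # Sketch: term-independent vs. term-weighted ranking on a tiny link graph.
        import numpy as np

        links = np.array([[0, 1, 1, 0],     # row i -> columns j that page i links to
                          [1, 0, 1, 1],
                          [0, 1, 0, 1],
                          [1, 0, 1, 0]], dtype=float)
        relevance = np.array([0.1, 0.1, 1.0, 0.5])   # assumed per-page relevance to a term

        def stationary(weights):
            P = links * weights                       # bias transitions toward relevant pages
            P = P / P.sum(axis=1, keepdims=True)      # row-normalize to a stochastic matrix
            pi = np.full(len(P), 1.0 / len(P))
            for _ in range(200):                      # power iteration
                pi = pi @ P
            return pi / pi.sum()

        print("term-independent ranking:", stationary(np.ones(4)).round(3))
        print("term-weighted ranking:   ", stationary(relevance).round(3))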

  10. Network-based analysis of complex diseases.

    Science.gov (United States)

    Liu, Z-P; Wang, Y; Zhang, X-S; Chen, L

    2012-02-01

    Complex diseases are commonly believed to be caused by the breakdown of several correlated genes rather than individual genes. The availability of genome-wide data of high-throughput experiments provides us with new opportunity to explore this hypothesis by analysing the disease-related biomolecular networks, which are expected to bridge genotypes and disease phenotypes and further reveal the biological mechanisms of complex diseases. In this study, the authors review the existing network biology efforts to study complex diseases, such as breast cancer, diabetes and Alzheimer's disease, using high-throughput data and computational tools. Specifically, the authors categorise these existing methods into several classes based on the research topics, that is, disease genes, dysfunctional pathways, network signatures and drug-target networks. The authors also summarise the pros and cons of those methods from both computation and application perspectives, and further discuss research trends and future topics of this promising field.

  11. Radiometer Design Analysis Based Upon Measurement Uncertainty

    Science.gov (United States)

    Racette, Paul E.; Lang, Roger H.

    2004-01-01

    This paper introduces a method for predicting the performance of a radiometer design based on calculating the measurement uncertainty. The variety in radiometer designs and the demand for improved radiometric measurements justify the need for a more general and comprehensive method to assess system performance. Radiometric resolution, or sensitivity, is a figure of merit that has been commonly used to characterize the performance of a radiometer. However when evaluating the performance of a calibration design for a radiometer, the use of radiometric resolution has limited application. These limitations are overcome by considering instead the measurement uncertainty. A method for calculating measurement uncertainty for a generic radiometer design including its calibration algorithm is presented. The result is a generalized technique by which system calibration architectures and design parameters can be studied to optimize instrument performance for given requirements and constraints. Example applications demonstrate the utility of using measurement uncertainty as a figure of merit.
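    For a simple two-point (hot/cold) calibration, measurement uncertainty can be estimated by propagating the noise on the calibration and scene readings through the calibration equation. The Monte Carlo sketch below, with invented reference temperatures and noise levels, is one generic way to do this; it is not the paper's formalism.

        # Monte Carlo propagation of uncertainty through a two-point radiometer
        # calibration: T_scene = T_c + (V - V_c) / (V_h - V_c) * (T_h - T_c).
        # All numbers are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(2)
        T_hot, T_cold = 300.0, 77.0            # reference temperatures (K)
        gain_true, offset_true = 0.01, 0.5     # assumed "true" instrument response
        sigma_v = 0.002                        # voltage noise (V), assumed

        def volts(T):
            return offset_true + gain_true * T

        n = 100_000
        V_h = volts(T_hot) + rng.normal(0, sigma_v, n)
        V_c = volts(T_cold) + rng.normal(0, sigma_v, n)
        V_s = volts(150.0) + rng.normal(0, sigma_v, n)   # scene at 150 K

        T_est = T_cold + (V_s - V_c) / (V_h - V_c) * (T_hot - T_cold)
        print(f"mean estimate {T_est.mean():.2f} K, "
              f"measurement uncertainty (1-sigma) {T_est.std():.2f} K")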

  12. Analysis of Vehicle-Based Security Operations

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Jason M [ORNL; Paul, Nate R [ORNL

    2015-01-01

    Vehicle-to-vehicle (V2V) communications promises to increase roadway safety by providing each vehicle with 360 degree situational awareness of other vehicles in proximity, and by complementing onboard sensors such as radar or camera in detecting imminent crash scenarios. In the United States, approximately three hundred million automobiles could participate in a fully deployed V2V system if Dedicated Short-Range Communication (DSRC) device use becomes mandatory. The system's reliance on continuous communication, however, provides a potential means for unscrupulous persons to transmit false data in an attempt to cause crashes, create traffic congestion, or simply render the system useless. V2V communications must be highly scalable while retaining robust security and privacy preserving features to meet the intra-vehicle and vehicle-to-infrastructure communication requirements for a growing vehicle population. Oak Ridge National Laboratory is investigating a Vehicle-Based Security System (VBSS) to provide security and privacy for a fully deployed V2V and V2I system. In the VBSS an On-board Unit (OBU) generates short-term certificates and signs Basic Safety Messages (BSM) to preserve privacy and enhance security. This work outlines a potential VBSS structure and its operational concepts; it examines how a vehicle-based system might feasibly provide security and privacy, highlights remaining challenges, and explores potential mitigations to address those challenges. Certificate management alternatives that attempt to meet V2V security and privacy requirements have been examined previously by the research community including privacy-preserving group certificates, shared certificates, and functional encryption. Due to real-world operational constraints, adopting one of these approaches for VBSS V2V communication is difficult. Timely misbehavior detection and revocation are still open problems for any V2V system. We explore the alternative approaches that may be

  13. Model Based Analysis and Test Generation for Flight Software

    Science.gov (United States)

    Pasareanu, Corina S.; Schumann, Johann M.; Mehlitz, Peter C.; Lowry, Mike R.; Karsai, Gabor; Nine, Harmon; Neema, Sandeep

    2009-01-01

    We describe a framework for model-based analysis and test case generation in the context of a heterogeneous model-based development paradigm that uses and combines MathWorks and UML 2.0 models and the associated code generation tools. This paradigm poses novel challenges to analysis and test case generation that, to the best of our knowledge, have not been addressed before. The framework is based on a common intermediate representation for different modeling formalisms and leverages and extends model checking and symbolic execution tools for model analysis and test case generation, respectively. We discuss the application of our framework to software models for a NASA flight mission.

  14. Systematic risk analysis: first steps towards a new definition of beta

    OpenAIRE

    Michel Fliess; Cédric Join

    2009-01-01

    International audience; We suggest a new model-free definition of the beta coefficient, which plays an important rôle in systematic risk management. This setting, which is based on the existence of trends for financial time series via nonstandard analysis (Fliess M., Join C.: A mathematical proof of the existence of trends in financial time series, Proc. Int. Conf. Systems Theory: Modelling, Analysis and Control, Fes, 2009, online: http://hal.inria.fr/inria-00352834/en/) leads to convincing c...

  15. Model-Based Reasoning in Humans Becomes Automatic with Training.

    Directory of Open Access Journals (Sweden)

    Marcos Economides

    2015-09-01

    Full Text Available Model-based and model-free reinforcement learning (RL have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load--a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders.

  16. Model-Based Reasoning in Humans Becomes Automatic with Training

    Science.gov (United States)

    Lübbert, Annika; Guitart-Masip, Marc; Dolan, Raymond J.

    2015-01-01

    Model-based and model-free reinforcement learning (RL) have been suggested as algorithmic realizations of goal-directed and habitual action strategies. Model-based RL is more flexible than model-free but requires sophisticated calculations using a learnt model of the world. This has led model-based RL to be identified with slow, deliberative processing, and model-free RL with fast, automatic processing. In support of this distinction, it has recently been shown that model-based reasoning is impaired by placing subjects under cognitive load—a hallmark of non-automaticity. Here, using the same task, we show that cognitive load does not impair model-based reasoning if subjects receive prior training on the task. This finding is replicated across two studies and a variety of analysis methods. Thus, task familiarity permits use of model-based reasoning in parallel with other cognitive demands. The ability to deploy model-based reasoning in an automatic, parallelizable fashion has widespread theoretical implications, particularly for the learning and execution of complex behaviors. It also suggests a range of important failure modes in psychiatric disorders. PMID:26379239

  17. Analysis of composition-based metagenomic classification.

    Science.gov (United States)

    Higashi, Susan; Barreto, André da Motta Salles; Cantão, Maurício Egidio; de Vasconcelos, Ana Tereza Ribeiro

    2012-01-01

    An essential step of a metagenomic study is the taxonomic classification, that is, the identification of the taxonomic lineage of the organisms in a given sample. The taxonomic classification process involves a series of decisions. Currently, in the context of metagenomics, such decisions are usually based on empirical studies that consider one specific type of classifier. In this study we propose a general framework for analyzing the impact that several decisions can have on the classification problem. Instead of focusing on any specific classifier, we define a generic score function that provides a measure of the difficulty of the classification task. Using this framework, we analyze the impact of the following parameters on the taxonomic classification problem: (i) the length of n-mers used to encode the metagenomic sequences, (ii) the similarity measure used to compare sequences, and (iii) the type of taxonomic classification, which can be conventional or hierarchical, depending on whether the classification process occurs in a single shot or in several steps according to the taxonomic tree. We defined a score function that measures the degree of separability of the taxonomic classes under a given configuration induced by the parameters above. We conducted an extensive computational experiment and found out that reasonable values for the parameters of interest could be (i) intermediate values of n, the length of the n-mers; (ii) any similarity measure, because all of them resulted in similar scores; and (iii) the hierarchical strategy, which performed better in all of the cases. As expected, short n-mers generate lower configuration scores because they give rise to frequency vectors that represent distinct sequences in a similar way. On the other hand, large values for n result in sparse frequency vectors that represent differently metagenomic fragments that are in fact similar, also leading to low configuration scores. Regarding the similarity measure, in
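    The core objects in this framework, n-mer frequency vectors and a similarity measure between them, can be sketched in a few lines. The fragments, the choice of n, and the use of cosine similarity below are illustrative assumptions rather than the study's configuration.

        # Sketch: encode sequences as n-mer frequency vectors and compare them.
        from itertools import product
        import numpy as np

        def nmer_vector(seq, n=3):
            """Frequency vector over all 4**n possible n-mers of A, C, G, T."""
            kmers = ["".join(p) for p in product("ACGT", repeat=n)]
            index = {k: i for i, k in enumerate(kmers)}
            v = np.zeros(len(kmers))
            for i in range(len(seq) - n + 1):
                k = seq[i:i + n]
                if k in index:
                    v[index[k]] += 1
            return v / max(v.sum(), 1)

        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        frag1 = "ACGTACGTGGCCTTAACGGT"       # made-up metagenomic fragments
        frag2 = "ACGTACGAGGCCTTAACGGA"
        frag3 = "TTTTTTTTTTAAAAAAAAAA"
        print(cosine(nmer_vector(frag1), nmer_vector(frag2)))   # similar fragments
        print(cosine(nmer_vector(frag1), nmer_vector(frag3)))   # dissimilar fragments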

  18. A model-free temperature-dependent conformational study of n-pentane in nematic liquid crystals

    International Nuclear Information System (INIS)

    Burnell, E. Elliott; Weber, Adrian C. J.; Dong, Ronald Y.; Meerts, W. Leo; Lange, Cornelis A. de

    2015-01-01

    The proton NMR spectra of n-pentane orientationally ordered in two nematic liquid-crystal solvents are studied over a wide temperature range and analysed using covariance matrix adaptation evolutionary strategy. Since alkanes possess small electrostatic moments, their anisotropic intermolecular interactions are dominated by short-range size-and-shape effects. As we assumed for n-butane, the anisotropic energy parameters of each n-pentane conformer are taken to be proportional to those of ethane and propane, independent of temperature. The observed temperature dependence of the n-pentane dipolar couplings allows a model-free separation between conformer degrees of order and conformer probabilities, which cannot be achieved at a single temperature. In this way, for n-pentane 13 anisotropic energy parameters (two for trans-trans (tt), five for trans-gauche (tg), and three for each of gauche+gauche+ (pp) and gauche+gauche− (pm)), the isotropic trans-gauche energy difference E_tg and its temperature coefficient E_tg′ are obtained. The value obtained for the extra energy associated with the proximity of the two methyl groups in the gauche+gauche− conformers (the pentane effect) is sensitive to minute details of other assumptions and is thus fixed in the calculations. Conformer populations are affected by the environment. In particular, anisotropic interactions increase the trans probability in the ordered phase

  19. A modified principal component analysis-based utility theory ...

    African Journals Online (AJOL)

    user

    modified PCA-based utility theory (UT) approach for optimization of correlated responses. ... Keywords: EDM; Correlated responses; Optimization; Principal component analysis; Proportion of quality loss reduction; ...... On stochastic optimization: Taguchi MethodsTM demystified; its limitations and fallacy clarified.

  20. A modified principal component analysis-based utility theory ...

    African Journals Online (AJOL)

    traditional machining processes having multiple performance characteristics, some of which are usually correlated. So, ideally, use of principal component analysis (PCA)-based approaches that take into account the possible correlations ...

  1. Synthesis and spectroscopic analysis of Schiff Bases of Imesatin ...

    African Journals Online (AJOL)

    Synthesis and spectroscopic analysis of Schiff Bases of Imesatin and Isatin derivatives. Olubunmi S. Oguntoye, Abdulmumeen A. Hamid, Gabriel S. Iloka, Sunday O. Bodede, Samson O. Owalude, Adedibu C. Tella ...

  2. Risk-based decision analysis for groundwater operable units

    International Nuclear Information System (INIS)

    Chiaramonte, G.R.

    1995-01-01

    This document proposes a streamlined approach and methodology for performing risk assessment in support of interim remedial measure (IRM) decisions involving the remediation of contaminated groundwater on the Hanford Site. This methodology, referred to as "risk-based decision analysis," also supports the specification of target cleanup volumes and provides a basis for design and operation of the groundwater remedies. The risk-based decision analysis can be completed within a short time frame and concisely documented. The risk-based decision analysis is more versatile than the qualitative risk assessment (QRA), because it not only supports the need for IRMs, but also provides criteria for defining the success of the IRMs and provides the risk-basis for decisions on final remedies. For these reasons, it is proposed that, for groundwater operable units, the risk-based decision analysis should replace the more elaborate, costly, and time-consuming QRA

  3. Adding trend data to Depletion-Based Stock Reduction Analysis

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — A Bayesian model of Depletion-Based Stock Reduction Analysis (DB-SRA), informed by a time series of abundance indexes, was developed, using the Sampling Importance...

  4. An Artificial Intelligence-Based Environment Quality Analysis System

    OpenAIRE

    Oprea , Mihaela; Iliadis , Lazaros

    2011-01-01

    Part 20: Informatics and Intelligent Systems Applications for Quality of Life information Services (ISQLIS) Workshop; International audience; The paper describes an environment quality analysis system based on a combination of some artificial intelligence techniques, artificial neural networks and rule-based expert systems. Two case studies of the system use are discussed: air pollution analysis and flood forecasting with their impact on the environment and on the population health. The syste...

  5. Content Analysis of a Computer-Based Faculty Activity Repository

    Science.gov (United States)

    Baker-Eveleth, Lori; Stone, Robert W.

    2013-01-01

    The research presents an analysis of faculty opinions regarding the introduction of a new computer-based faculty activity repository (FAR) in a university setting. The qualitative study employs content analysis to better understand the phenomenon underlying these faculty opinions and to augment the findings from a quantitative study. A web-based…

  6. PYTHON-based Physics Analysis Environment for LHCb

    CERN Document Server

    Belyaev, I; Mato, P; Barrand, G; Tsaregorodtsev, A; de Oliveira, E; Tsaregorodtsev, A Yu; De Oliveira, E

    2004-01-01

    BENDER is the PYTHON based physics analysis application for LHCb. It combines the best features of the underlying GAUDI software architecture with the flexibility of the PYTHON scripting language and provides end-users with a friendly physics analysis oriented environment.

  7. Product Lifecycle Management Architecture: A Model Based Systems Engineering Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Noonan, Nicholas James [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-07-01

    This report is an analysis of the Product Lifecycle Management (PLM) program. The analysis is centered on a need statement generated by a Nuclear Weapons (NW) customer. The need statement captured in this report creates an opportunity for the PLM to provide a robust service as a solution. Lifecycles for both the NW and PLM are analyzed using Model Based System Engineering (MBSE).

  8. Geographic Object-Based Image Analysis: Towards a new paradigm

    NARCIS (Netherlands)

    Blaschke, T.; Hay, G.J.; Kelly, M.; Lang, S.; Hofmann, P.; Addink, E.A.; Queiroz Feitosa, R.; van der Meer, F.D.; van der Werff, H.M.A.; van Coillie, F.; Tiede, A.

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis – GEOBIA has been and still is sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature

  9. chemical analysis and base-promoted hydrolysis of locally ...

    African Journals Online (AJOL)

    Wara

    ABSTRACT. The study was on the chemical analysis and base-promoted hydrolysis of extracted shea nut fat. The local method of extraction of the shea nut oil was employed in comparison with literature report. A simple cold-process alkali hydrolysis of the shea nut oil was used in producing the soap. The chemical analysis ...

  10. Science Based Governance? EU Food Regulation Submitted to Risk Analysis

    NARCIS (Netherlands)

    Szajkowska, A.; Meulen, van der B.M.J.

    2014-01-01

    Anna Szajkowska and Bernd van der Meulen analyse in their contribution, Science Based Governance? EU Food Regulation Submitted to Risk Analysis, the scope of application of risk analysis and the precautionary principle in EU food safety regulation. To what extent does this technocratic,

  11. Phylogenetic Analysis of Tibetan Mastiffs Based on Mitochondrial ...

    Indian Academy of Sciences (India)

    Navya

    RESEARCH ARTICLE. Phylogenetic Analysis of Tibetan Mastiffs Based on Mitochondrial Hypervariable Region I. Zhanjun Ren*, Huiling Chen, Xuejiao Yang and Chengdong Zhang. College of Animal Science and Technology, Northwest A&F University, Yangling, Shaanxi.

  12. Chemical analysis and base-promoted hydrolysis of locally ...

    African Journals Online (AJOL)

    The study was on the chemical analysis and base- promoted hydrolysis of extracted shea nut fat. The local method of extraction of the shea nut oil was employed in comparison with literature report. A simple cold-process alkali hydrolysis of the shea nut oil was used in producing the soap. The chemical analysis of the oil ...

  13. Some Linguistic-based and temporal analysis on Wikipedia

    International Nuclear Information System (INIS)

    Yasseri, T.

    2010-01-01

    Wikipedia as a web-based, collaborative, multilingual encyclopaedia project is a very suitable field to carry out research on social dynamics and to investigate the complex concepts of conflict, collaboration, competition, dispute, etc. in a large community (∼26 million) of Wikipedia users. The other face of Wikipedia, as a productive society, is its output, consisting of ∼17 million articles written, unsupervised, by non-professional editors in more than 270 different languages. In this talk we report some analysis performed on Wikipedia in two different approaches: temporal analysis to characterize disputes and controversies among users and linguistic-based analysis to characterize linguistic features of English texts in Wikipedia. (author)

  14. Differential Regulatory Analysis Based on Coexpression Network in Cancer Research

    Directory of Open Access Journals (Sweden)

    Junyi Li

    2016-01-01

    Full Text Available With rapid development of high-throughput techniques and accumulation of big transcriptomic data, plenty of computational methods and algorithms such as differential analysis and network analysis have been proposed to explore genome-wide gene expression characteristics. These efforts aim to transform underlying genomic information into valuable knowledge in biological and medical research fields. Recently, numerous integrative research methods have been dedicated to interpreting the development and progression of neoplastic diseases, and differential regulatory analysis (DRA) based on gene coexpression networks (GCN) increasingly serves as a robust complement to regular differential expression analysis in revealing the regulatory functions of cancer-related genes, such as evading growth suppressors and resisting cell death. Differential regulatory analysis based on GCN is promising and plays an essential role in discovering the system-level properties of carcinogenesis. Here we briefly review the paradigm of differential regulatory analysis based on GCN. We also focus on the applications of differential regulatory analysis based on GCN in cancer research and point out that DRA is necessary and exceptionally useful for revealing the underlying molecular mechanisms in large-scale carcinogenesis studies.

  15. A scenario-based procedure for seismic risk analysis

    International Nuclear Information System (INIS)

    Kluegel, J.-U.; Mualchin, L.; Panza, G.F.

    2006-12-01

    A new methodology for seismic risk analysis based on probabilistic interpretation of deterministic or scenario-based hazard analysis, in full compliance with the likelihood principle and therefore meeting the requirements of modern risk analysis, has been developed. The proposed methodology can easily be adjusted to deliver its output in a format required for safety analysts and civil engineers. The scenario-based approach allows the incorporation of all available information collected in a geological, seismotectonic and geotechnical database of the site of interest as well as advanced physical modelling techniques to provide a reliable and robust deterministic design basis for civil infrastructures. The robustness of this approach is of special importance for critical infrastructures. At the same time a scenario-based seismic hazard analysis allows the development of the required input for probabilistic risk assessment (PRA) as required by safety analysts and insurance companies. The scenario-based approach removes the ambiguity in the results of probabilistic seismic hazard analysis (PSHA) which relies on the projections of Gutenberg-Richter (G-R) equation. The problems in the validity of G-R projections, because of incomplete to total absence of data for making the projections, are still unresolved. Consequently, the information from G-R must not be used in decisions for design of critical structures or critical elements in a structure. The scenario-based methodology is strictly based on observable facts and data and complemented by physical modelling techniques, which can be submitted to a formalised validation process. By means of sensitivity analysis, knowledge gaps related to lack of data can be dealt with easily, due to the limited amount of scenarios to be investigated. The proposed seismic risk analysis can be used with confidence for planning, insurance and engineering applications. (author)

  16. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges for creating program analyses... for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs...
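    For readers unfamiliar with the event-based style that the analysis targets, the short Python sketch below shows the SAX pattern: a handler receives start/end/character events and keeps only minimal state, never building a full document tree. The element names and the tiny document are invented for illustration.

        # Minimal SAX-style streaming handler: counts elements and collects
        # the text of <title> elements without building a DOM tree.
        import io
        import xml.sax

        class TitleHandler(xml.sax.ContentHandler):
            def __init__(self):
                super().__init__()
                self.element_count = 0
                self.titles = []
                self._in_title = False
                self._buf = []

            def startElement(self, name, attrs):
                self.element_count += 1
                if name == "title":
                    self._in_title, self._buf = True, []

            def characters(self, content):
                if self._in_title:
                    self._buf.append(content)

            def endElement(self, name):
                if name == "title":
                    self.titles.append("".join(self._buf))
                    self._in_title = False

        xml_doc = "<library><book><title>Static Analysis</title></book></library>"
        handler = TitleHandler()
        # Raises SAXParseException if the input is not well-formed.
        xml.sax.parse(io.BytesIO(xml_doc.encode("utf-8")), handler)
        print(handler.element_count, handler.titles)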

  17. Bayesian-network-based safety risk analysis in construction projects

    International Nuclear Information System (INIS)

    Zhang, Limao; Wu, Xianguo; Skibniewski, Miroslaw J.; Zhong, Jingbing; Lu, Yujie

    2014-01-01

    This paper presents a systemic decision support approach for safety risk analysis under uncertainty in tunnel construction. Fuzzy Bayesian Networks (FBN) is used to investigate causal relationships between tunnel-induced damage and its influential variables based upon the risk/hazard mechanism analysis. Aiming to overcome limitations on the current probability estimation, an expert confidence indicator is proposed to ensure the reliability of the surveyed data for fuzzy probability assessment of basic risk factors. A detailed fuzzy-based inference procedure is developed, which has a capacity of implementing deductive reasoning, sensitivity analysis and abductive reasoning. The “3σ criterion” is adopted to calculate the characteristic values of a triangular fuzzy number in the probability fuzzification process, and the α-weighted valuation method is adopted for defuzzification. The construction safety analysis progress is extended to the entire life cycle of risk-prone events, including the pre-accident, during-construction continuous and post-accident control. A typical hazard concerning the tunnel leakage in the construction of Wuhan Yangtze Metro Tunnel in China is presented as a case study, in order to verify the applicability of the proposed approach. The results demonstrate the feasibility of the proposed approach and its application potential. A comparison of advantages and disadvantages between FBN and fuzzy fault tree analysis (FFTA) as risk analysis tools is also conducted. The proposed approach can be used to provide guidelines for safety analysis and management in construction projects, and thus increase the likelihood of a successful project in a complex environment. - Highlights: • A systemic Bayesian network based approach for safety risk analysis is developed. • An expert confidence indicator for probability fuzzification is proposed. • Safety risk analysis progress is extended to entire life cycle of risk-prone events. • A typical
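    The fuzzification step mentioned in the record can be illustrated with a small helper: expert estimates of a basic-event probability are turned into a triangular fuzzy number using the 3σ criterion and then collapsed back to a crisp value. The centroid defuzzification used below is a simple stand-in for the paper's α-weighted valuation method, and all numbers are invented.

        # Sketch: 3-sigma fuzzification of expert probability estimates into a
        # triangular fuzzy number (a, m, b), then a simple centroid defuzzification.
        import numpy as np

        expert_estimates = np.array([0.02, 0.03, 0.025, 0.04, 0.03])  # invented survey data

        mu, sigma = expert_estimates.mean(), expert_estimates.std(ddof=1)
        a, m, b = max(mu - 3 * sigma, 0.0), mu, min(mu + 3 * sigma, 1.0)  # 3-sigma criterion
        print(f"triangular fuzzy probability: ({a:.4f}, {m:.4f}, {b:.4f})")

        # Centroid defuzzification (stand-in for the alpha-weighted valuation method).
        crisp = (a + m + b) / 3.0
        print(f"crisp basic-event probability: {crisp:.4f}")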

  18. Crystallographically-based analysis of the NMR spectra of maghemite

    International Nuclear Information System (INIS)

    Spiers, K.M.; Cashion, J.D.

    2012-01-01

    All possible iron environments with respect to nearest neighbour vacancies in vacancy-ordered and vacancy-disordered maghemite have been evaluated and used as the foundation for a crystallographically-based analysis of the published NMR spectra of maghemite. The spectral components have been assigned to particular configurations and excellent agreement obtained in comparing predicted spectra with published spectra taken in applied magnetic fields. The broadness of the published NMR lines has been explained by calculations of the magnetic dipole fields at the various iron sites and consideration of the supertransferred hyperfine fields. - Highlights: ► Analysis of 57 Fe NMR of maghemite based on vacancy ordering and nearest neighbour vacancies. ► Assignment of NMR spectral components based on crystallographic analysis of unique iron sites. ► Strong agreement between predicted spectra and published spectra taken in applied magnetic fields. ► Maghemite NMR spectral broadening due to various iron sites and supertransferred hyperfine field.

  19. Matrix-based introduction to multivariate data analysis

    CERN Document Server

    Adachi, Kohei

    2016-01-01

    This book enables readers who may not be familiar with matrices to understand a variety of multivariate analysis procedures in matrix forms. Another feature of the book is that it emphasizes what model underlies a procedure and what objective function is optimized for fitting the model to data. The author believes that the matrix-based learning of such models and objective functions is the fastest way to comprehend multivariate data analysis. The text is arranged so that readers can intuitively capture the purposes for which multivariate analysis procedures are utilized: plain explanations of the purposes with numerical examples precede mathematical descriptions in almost every chapter. This volume is appropriate for undergraduate students who already have studied introductory statistics. Graduate students and researchers who are not familiar with matrix-intensive formulations of multivariate data analysis will also find the book useful, as it is based on modern matrix formulations with a special emphasis on ...

  20. European Climate - Energy Security Nexus. A model based scenario analysis

    International Nuclear Information System (INIS)

    Criqui, Patrick; Mima, Silvana

    2011-01-01

    In this research, we have provided an overview of the climate-security nexus in the European sector through a model-based scenario analysis with the POLES model. The analysis underlines that, under stringent climate policies, Europe takes advantage of a double dividend: its capacity to develop a new, cleaner energy model and its lower vulnerability to potential shocks on the international energy markets. (authors)

  1. European Climate - Energy Security Nexus. A model based scenario analysis

    Energy Technology Data Exchange (ETDEWEB)

    Criqui, Patrick; Mima, Silvana

    2011-01-15

    In this research, we have provided an overview of the climate-security nexus in the European sector through a model-based scenario analysis with the POLES model. The analysis underlines that, under stringent climate policies, Europe takes advantage of a double dividend: its capacity to develop a new, cleaner energy model and its lower vulnerability to potential shocks on the international energy markets. (authors)

  2. Managerial Methods Based on Analysis, Recommended to a Boarding House

    Directory of Open Access Journals (Sweden)

    Solomia Andreş

    2015-06-01

    Full Text Available The paper presents a few theoretical and practical contributions regarding the implementation of analysis-based methods, namely a SWOT analysis and an economic analysis, from the perspective of the demands of managing a firm that operates profitably through the activity of a boarding house. The two types of managerial methods recommended to the firm offer real and comprehensive information necessary for understanding the firm's status and for elaborating predictions to maintain the viability of the business.

  3. Fast Template-based Shape Analysis using Diffeomorphic Iterative Centroid

    OpenAIRE

    Cury , Claire; Glaunès , Joan Alexis; Chupin , Marie; Colliot , Olivier

    2014-01-01

    International audience; A common approach for the analysis of anatomical variability relies on the estimation of a representative template of the population, followed by the study of this population based on the parameters of the deformations going from the template to the population. The Large Deformation Diffeomorphic Metric Mapping framework is widely used for shape analysis of anatomical structures, but computing a template with such framework is computationally expensive. In this paper w...

  4. Using the DOE Knowledge Base for Special Event Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, H.M.; Harris, J.M.; Young, C.J.

    1998-10-20

    The DOE Knowledge Base is a library of detailed information whose purpose is to support the United States National Data Center (USNDC) in its mission to monitor compliance with the Comprehensive Test Ban Treaty (CTBT). One of the important tasks which the USNDC must accomplish is to periodically perform detailed analysis of events of high interest, so-called "Special Events", to provide the national authority with information needed to make policy decisions. In this paper we investigate some possible uses of the Knowledge Base for Special Event Analysis (SEA), and make recommendations for improving Knowledge Base support for SEA. To analyze an event in detail, there are two basic types of data which must be used: sensor-derived data (waveforms, arrivals, events, etc.) and regionalized contextual data (known sources, geological characteristics, etc.). Currently there is no single package which can provide full access to both types of data, so for our study we use a separate package for each: MatSeis, the Sandia Labs-developed MATLAB-based seismic analysis package, for waveform data analysis, and ArcView, an ESRI product, for contextual data analysis. Both packages are well-suited to prototyping because they provide a rich set of currently available functionality and yet are also flexible and easily extensible. Using these tools and Phase I Knowledge Base data sets, we show how the Knowledge Base can improve both the speed and the quality of SEA. Empirically-derived interpolated correction information can be accessed to improve both location estimates and associated error estimates. This information can in turn be used to identify any known nearby sources (e.g. mines, volcanoes), which may then trigger specialized processing of the sensor data. Based on the location estimate, preferred magnitude formulas and discriminants can be retrieved, and any known blockages can be identified to prevent miscalculations. Relevant historic events can be identified either by

  5. Digital Printing Quality Detection and Analysis Technology Based on CCD

    Science.gov (United States)

    He, Ming; Zheng, Liping

    2017-12-01

    With the help of CCD digital printing quality detection and analysis technology, rapid evaluation and objective detection of printing quality can be carried out, which exerts a certain control effect on printing quality. The rational application of CCD digital printing quality testing and analysis technology plays a very positive role in improving the quality of digital printing and printing materials across a variety of printing equipment. In this paper, we present an in-depth study and discussion of CCD-based digital print quality testing and analysis technology.

  6. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, intuitive user-friendly interface, remote access to high-power computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources, and some applications is given.

  7. Reliability-Based Robustness Analysis for a Croatian Sports Hall

    DEFF Research Database (Denmark)

    Čizmar, Dean; Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    2011-01-01

    This paper presents a probabilistic approach for structural robustness assessment for a timber structure built a few years ago. The robustness analysis is based on a structural reliability-based framework for robustness and a simplified mechanical system modelling of a timber truss system... is expressed and evaluated by a robustness index. Next, the robustness is assessed using system reliability indices where the probabilistic failure model is modelled by a series system of parallel systems.

  8. Open access for ALICE analysis based on virtualization technology

    CERN Document Server

    Buncic, P; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modi...

  9. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    DEFF Research Database (Denmark)

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna

    2013-01-01

    kinds of crop-based biodiesel including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options are studied by emergy analysis; soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. DEA method is used...... to evaluate the sustainability efficiencies of these options, and the biodiesel production systems based on soybean, sunflower, and palm are considered as DEA efficient, whereas rapeseed-based and jatropha-based scenarios are needed to be improved, and the improved methods have also been specified....
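    The DEA efficiency scores referred to in the record come from solving one small linear program per production option. The sketch below solves the classical input-oriented CCR multiplier model with SciPy, using made-up input/output data for five hypothetical biodiesel options rather than the study's emergy-based indicators.

        # Sketch: input-oriented CCR DEA efficiency via the multiplier model:
        # maximize u.y_o  subject to  v.x_o = 1  and  u.y_j - v.x_j <= 0 for all j.
        import numpy as np
        from scipy.optimize import linprog

        # Invented data: rows = decision-making units (biodiesel options),
        # columns = inputs (e.g., land, energy) and outputs (e.g., fuel yield).
        X = np.array([[1.0, 2.0], [1.5, 1.8], [2.0, 3.0], [1.2, 2.5], [0.9, 1.6]])  # inputs
        Y = np.array([[3.0], [3.2], [3.5], [2.8], [2.9]])                            # outputs

        def ccr_efficiency(o):
            n_out, n_in = Y.shape[1], X.shape[1]
            c = np.concatenate([-Y[o], np.zeros(n_in)])          # maximize u.y_o
            A_eq = [np.concatenate([np.zeros(n_out), X[o]])]     # v.x_o = 1
            A_ub = np.hstack([Y, -X])                            # u.y_j - v.x_j <= 0
            res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(X)),
                          A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")
            return -res.fun

        for o in range(len(X)):
            print(f"option {o}: efficiency = {ccr_efficiency(o):.3f}")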

  10. Estimation of scientific publications’ structure in accounting and analysis based on the international data bases

    OpenAIRE

    Олійник, Оксана Вікторівна

    2016-01-01

    A comprehensive analysis of the dynamics and structure of publications in accounting and analysis has been carried out, based on flows of foreign scientific publications in economics presented in the subject classification of the American Association of Economists, the abstracting journal “Journal of Economic Literature” (JEL), and the Social Science Research Network.

  11. Web-based visual analysis for high-throughput genomics.

    Science.gov (United States)

    Goecks, Jeremy; Eberhard, Carl; Too, Tomithy; Nekrutenko, Anton; Taylor, James

    2013-06-13

    Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high

  12. Nuclear power company activity based costing management analysis

    International Nuclear Information System (INIS)

    Xu Dan

    2012-01-01

    With the development of the nuclear energy industry, nuclear power companies face continual pressure to improve internal management in support of sustainable market operation. In view of this, it is urgent that a nuclear power company raise its cost management level and build a nuclear-safety-based low-cost competitive advantage. Activity based costing management (ABCM) transfers the emphasis of cost management from the 'product' to the 'activity', using value chain analysis, cost driver analysis and related methods. By analysing detailed activities and value chains, unnecessary activities can be cancelled, the resource consumption of necessary activities can be lowered, and cost can be managed at its source, achieving the purpose of reducing cost, boosting efficiency and realizing management value. The paper draws its conclusions from a detailed analysis of nuclear power company procedures and activities, together with a selected 'piece analysis' of important cost-related projects in the nuclear power company. The conclusion is that the activities of the nuclear power company show obvious performance characteristics and that the ABC management method can be applied; with management of procedures and activities, it helps to realize a nuclear-safety-based low-cost competitive advantage in the nuclear power company. (author)

  13. Modeling and Grid impedance Variation Analysis of Parallel Connected Grid Connected Inverter based on Impedance Based Harmonic Analysis

    DEFF Research Database (Denmark)

    Kwon, JunBum; Wang, Xiongfei; Bak, Claus Leth

    2014-01-01

    This paper addresses the harmonic compensation error problem that exists with parallel-connected inverters under the same grid interface conditions by means of impedance-based analysis and modeling. Unlike the single grid-connected inverter case, it is found that multiple parallel-connected inverters and the grid impedance can influence each other if they each have a harmonic compensation function. The analysis method proposed in this paper is based on the relationship between the overall output impedance and input impedance of the parallel-connected inverters, where a controller gain design method, which can

  14. Single base pair mutation analysis by PNA directed PCR clamping

    DEFF Research Database (Denmark)

    Ørum, H.; Nielsen, P.E.; Egholm, M.

    1993-01-01

    A novel method that allows direct analysis of single base mutation by the polymerase chain reaction (PCR) is described. The method utilizes the finding that PNAs (peptide nucleic acids) recognize and bind to their complementary nucleic acid sequences with higher thermal stability and specificity...

  15. Advancing School-Based Interventions through Economic Analysis

    Science.gov (United States)

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  16. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Nunes Leal Franqueira, V.; Tun, Thein Tan; Wieringa, Roelf J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about

  17. Graph- versus Vector-Based Analysis of a Consensus Protocol

    NARCIS (Netherlands)

    Delzanno, Giorgio; Rensink, Arend; Traverso, Riccardo; Bošnački, Dragan; Edelkamp, Stefan; Lluch Lafuente, Alberto; Wijs, Anton

    The Paxos distributed consensus algorithm is a challenging case-study for standard, vector-based model checking techniques. Due to asynchronous communication, exhaustive analysis may generate very large state spaces already for small model instances. In this paper, we show the advantages of graph

  18. LES based POD analysis of Jet in Cross Flow

    DEFF Research Database (Denmark)

    Cavar, Dalibor; Meyer, Knud Erik; Jakirlic, S.

    2010-01-01

    The paper presents results of a POD investigation of the LES based numerical simulation of the jet-in-crossflow (JICF) flowfield. LES results are firstly compared to the pointwise LDA measurements. 2D POD analysis is then used as a comparison basis for PIV measurements and LES, and finally 3D POD...
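    For readers unfamiliar with POD, the snapshot form used for this kind of comparison reduces to a singular value decomposition of the mean-subtracted snapshot matrix. The sketch below, on random stand-in data rather than LES or PIV fields, shows how the spatial modes, temporal coefficients and modal energy content are obtained.

        # Snapshot POD via SVD: columns of S are flow-field snapshots (flattened).
        import numpy as np

        rng = np.random.default_rng(3)
        n_points, n_snapshots = 2000, 100
        S = rng.normal(size=(n_points, n_snapshots))       # stand-in for LES/PIV snapshots

        S = S - S.mean(axis=1, keepdims=True)              # subtract the mean field
        U, sigma, Vt = np.linalg.svd(S, full_matrices=False)

        energy = sigma**2 / np.sum(sigma**2)               # relative energy of each POD mode
        modes = U                                          # spatial POD modes (columns)
        coeffs = np.diag(sigma) @ Vt                       # temporal coefficients

        print("energy captured by first 5 modes:", energy[:5].sum().round(3))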

  19. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  20. Intelligent assembly time analysis, using a digital knowledge based approach

    NARCIS (Netherlands)

    Jin, Y.; Curran, R.; Butterfield, J.; Burke, R.; Welch, B.

    2009-01-01

    The implementation of effective time analysis methods fast and accurately in the era of digital manufacturing has become a significant challenge for aerospace manufacturers hoping to build and maintain a competitive advantage. This paper proposes a structure oriented, knowledge-based approach for

  1. Potential density and tree survival: an analysis based on South ...

    African Journals Online (AJOL)

    Finally, we present a tree survival analysis, based on the Weibull distribution function, for the Nelshoogte replicated CCT study, which has been observed for almost 40 years after planting and provides information about tree survival in response to planting espacements ranging from 494 to 2 965 trees per hectare.
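    A tree survival analysis of the kind described typically works with a Weibull survival function S(t) = exp(-(t/λ)^k). The sketch below evaluates survival over time for two hypothetical planting espacements with invented shape and scale parameters, purely to show the functional form; it does not use the Nelshoogte data.

        # Sketch: Weibull survival curves for two hypothetical planting espacements.
        import numpy as np

        def weibull_survival(t, shape_k, scale_lam):
            """S(t) = exp(-(t / lambda)**k): probability a tree survives past age t."""
            return np.exp(-(t / scale_lam) ** shape_k)

        ages = np.arange(0, 41, 5)                       # years after planting
        dense  = weibull_survival(ages, shape_k=1.8, scale_lam=55.0)   # invented parameters
        sparse = weibull_survival(ages, shape_k=1.4, scale_lam=90.0)

        for age, s1, s2 in zip(ages, dense, sparse):
            print(f"age {age:2d}:  dense stand {s1:.2f}   sparse stand {s2:.2f}")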

  2. Discrete Discriminant analysis based on tree-structured graphical models

    DEFF Research Database (Denmark)

    Perez de la Cruz, Gonzalo; Eslava, Guillermina

    The purpose of this paper is to illustrate the potential use of discriminant analysis based on tree-structured graphical models for discrete variables. This is done by comparing its empirical performance using estimated error rates for real and simulated data. The results show that discriminant

  3. Synthesis and Spectroscopic Analysis of Schiff Bases of Imesatin ...

    African Journals Online (AJOL)

    ADOWIE PERE

    Keywords: Schiff bases, isatin, imesatin, spectroscopic analysis, biological activity. Isatin (1H-indole-2,3-dione) was first synthesized by Erdman (1840) and established by Laurent (1841) as a product from the oxidation of indigo by nitric and chromic acids. The synthetic versatility of isatin has led to the wide applications of ...

  4. Study and analysis of wavelet based image compression techniques ...

    African Journals Online (AJOL)

    This paper presents a comprehensive study with performance analysis of very recent wavelet-transform-based image compression techniques. Image compression is one of the necessities for such communication. The goals of image compression are to minimize the storage requirement and communication bandwidth.

  5. Spinoza II: Conceptual Case-Based Natural Language Analysis.

    Science.gov (United States)

    Schank, Roger C.; And Others

    This paper presents the theoretical changes that have developed in Conceptual Dependency Theory and their ramifications in computer analysis of natural language. The major items of concern are: the elimination of reliance on "grammar rules" for parsing with the emphasis given to conceptual rule based parsing; the development of a…

  6. Analysis of a Lorentz force based vibration exciter using permanent ...

    Indian Academy of Sciences (India)

    This work presents performance analysis of a Lorentz force based noncontact vibration exciter by mounting a couple of permanent magnets on a piezoelectric stack. A conductor is attached to the structure to be excited and is placed midway between unlike poles of a couple of permanent magnets. The permanent magnets ...

  7. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    NARCIS (Netherlands)

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated

  8. Likelihood-based confidence intervals in exploratory factor analysis

    NARCIS (Netherlands)

    Oort, F.J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated

  9. Likelihood-Based Confidence Intervals in Exploratory Factor Analysis

    Science.gov (United States)

    Oort, Frans J.

    2011-01-01

    In exploratory or unrestricted factor analysis, all factor loadings are free to be estimated. In oblique solutions, the correlations between common factors are free to be estimated as well. The purpose of this article is to show how likelihood-based confidence intervals can be obtained for rotated factor loadings and factor correlations, by…

  10. Knowledge-based analysis of functional impacts of mutations in ...

    Indian Academy of Sciences (India)

    2015-09-28

    Sep 28, 2015 ... pathway' and 'Epidermal growth factor receptor signaling pathway' were significantly and differentially enriched by the two sets of allele-specific target genes of miRNA hsa-miR-96. [Bhattacharya A and Cui Y 2015 Knowledge-based analysis of functional impacts of mutations in microRNA seed regions.

  11. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  12. Vision-based human motion analysis: An overview

    NARCIS (Netherlands)

    Poppe, Ronald Walter

    2007-01-01

    Markerless vision-based human motion analysis has the potential to provide an inexpensive, non-obtrusive solution for the estimation of body poses. The significant research effort in this domain has been motivated by the fact that many application areas, including surveillance, Human-Computer

  13. A Gradient Analysis-Based Study of Aeromagnetic Anomalies of ...

    African Journals Online (AJOL)

    An aeromagnetic intensity contour map of a part of Nupe Basin of Nigeria was acquired, digitized and analysed. This work was carried out for a better understanding of the study area using the Gradient analysis-based technique to calculate depth to basement and to interpret the aeromagnetic anomaly map of the area.

  14. Modal Analysis Based on the Random Decrement Technique

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Brincker, Rune

    1998-01-01

    This article describes the work carried out within the project: Modal Analysis Based on the Random Decrement Technique - Application to Civil Engineering Structures. The project is part of the research programme: Dynamics of Structures, sponsored by the Danish Technical Research Council. The planned...
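
    The random decrement technique itself is simple enough to sketch: segments of an ambient response that start at a common trigger condition are averaged, so the random part cancels and a free-decay-like signature remains for modal identification. The following is a minimal illustration on synthetic data, not the project's own implementation.

```python
import numpy as np
from scipy import signal

fs = 200.0
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(1)

# Ambient-like response: white noise driven through a lightly damped 2 Hz resonance.
wn, zeta = 2 * np.pi * 2.0, 0.02
sys = signal.TransferFunction([wn ** 2], [1, 2 * zeta * wn, wn ** 2])
_, x, _ = signal.lsim(sys, rng.normal(size=t.size), t)

def random_decrement(x, trigger_level, segment_length):
    """Average segments that start at upward crossings of the trigger level."""
    starts = np.where((x[:-1] < trigger_level) & (x[1:] >= trigger_level))[0]
    starts = starts[starts + segment_length <= len(x)]
    segments = np.stack([x[s:s + segment_length] for s in starts])
    return segments.mean(axis=0)

rd_signature = random_decrement(x, trigger_level=np.std(x), segment_length=int(2 * fs))
print("random decrement signature length:", rd_signature.size, "samples")
```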

  15. Analysis of Computer Network Information Based on "Big Data"

    Science.gov (United States)

    Li, Tianli

    2017-11-01

    In the current era, computer networks and big data have gradually become part of people's lives. People use computers for convenience in their daily lives, but at the same time many network information security problems demand attention. This paper analyzes the information security of computer networks based on "big data" analysis and puts forward some solutions.

  16. System of gait analysis based on ground reaction force assessment

    Directory of Open Access Journals (Sweden)

    František Vaverka

    2015-12-01

    Full Text Available Background: Biomechanical analysis of gait employs various methods used in kinematic and kinetic analysis, EMG, and others. One of the most frequently used methods is kinetic analysis based on the assessment of the ground reaction forces (GRF) recorded on two force plates. Objective: The aim of the study was to present a method of gait analysis based on the assessment of the GRF recorded during the stance phase of two steps. Methods: The GRF recorded with a force plate on one leg during the stance phase has three components acting in directions: Fx - mediolateral, Fy - anteroposterior, and Fz - vertical. A custom-written MATLAB script was used for gait analysis in this study. This software displays instantaneous force data for both legs as Fx(t), Fy(t) and Fz(t) curves, automatically determines the extremes of the functions and sets the visual markers defining the individual points of interest. Positions of these markers can be easily adjusted by the rater, which may be necessary if the GRF has an atypical pattern. The analysis is fully automated and analyzing one trial takes only 1-2 minutes. Results: The method allows quantification of temporal variables of the extremes of the Fx(t), Fy(t), Fz(t) functions, durations of the braking and propulsive phases, duration of the double support phase, the magnitudes of reaction forces at the extremes of the measured functions, impulses of force, and indices of symmetry. The analysis results in a standardized set of 78 variables (temporal, force, indices of symmetry) which can serve as a basis for further research and diagnostics. Conclusions: The resulting set of variables offers a wide choice for selecting a specific group of variables with consideration to a particular research topic. The advantage of this method is the standardization of the GRF analysis, low time requirements allowing rapid analysis of a large number of trials in a short time, and comparability of the variables obtained during different research measurements.
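
    A few of the variables listed in the record (peak forces, impulses, symmetry indices) are straightforward to compute once the stance-phase GRF components are available. The sketch below is a simplified Python analogue of such a script, run on synthetic data; the function names are hypothetical and it makes no attempt to reproduce the authors' full set of 78 variables.

```python
import numpy as np
from scipy.signal import find_peaks

def stance_variables(fz, fy, fs):
    """A small, hypothetical subset of the stance-phase variables described above."""
    t = np.arange(fz.size) / fs
    peaks, _ = find_peaks(fz)                            # local maxima of Fz(t)
    braking_impulse = np.sum(np.minimum(fy, 0)) / fs     # rectangular integration, Fy < 0
    propulsive_impulse = np.sum(np.maximum(fy, 0)) / fs  # Fy > 0
    return {"stance_time_s": t[-1],
            "fz_peak_N": float(fz[peaks].max()) if peaks.size else float(fz.max()),
            "braking_impulse_Ns": braking_impulse,
            "propulsive_impulse_Ns": propulsive_impulse}

def symmetry_index(left, right):
    """Classical symmetry index (%) between the same variable on both legs."""
    return 100.0 * (left - right) / (0.5 * (left + right))

# Synthetic single-leg stance phase (0.6 s at 1 kHz) just to exercise the code.
fs = 1000.0
t = np.linspace(0, 0.6, int(0.6 * fs))
fz = 700 * np.sin(np.pi * t / 0.6) + 100 * np.sin(3 * np.pi * t / 0.6)
fy = -120 * np.sin(2 * np.pi * t / 0.6)                  # braking then propulsion
print(stance_variables(fz, fy, fs))
print("symmetry of peak Fz:", round(symmetry_index(790.0, 810.0), 2))
```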

  17. Flow cytometry-based DNA hybridization and polymorphism analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cai, H.; Kommander, K.; White, P.S.; Nolan, J.P.

    1998-07-01

    Functional analysis of the human genome, including the quantification of differential gene expression and the identification of polymorphic sites and disease genes, is an important element of the Human Genome Project. Current methods of analysis are mainly gel-based assays that are not well-suited to rapid genome-scale analyses. To analyze DNA sequence on a large scale, robust and high throughput assays are needed. The authors are developing a suite of microsphere-based approaches employing fluorescence detection to screen and analyze genomic sequence. The approaches include competitive DNA hybridization to measure DNA or RNA targets in unknown samples, and oligo ligation or extension assays to analyze single-nucleotide polymorphisms. Apart from the advantages of sensitivity, simplicity, and low sample consumption, these flow cytometric approaches have the potential for high throughput multiplexed analysis using multicolored microspheres and automated sample handling.

  18. Control volume based hydrocephalus research; analysis of human data

    Science.gov (United States)

    Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer

    2010-11-01

    Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms - qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed along with the data processing techniques used to extract terms in the conservation equation. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.

  19. Teaching-Learning Activity Modeling Based on Data Analysis

    Directory of Open Access Journals (Sweden)

    Kyungrog Kim

    2015-03-01

    Full Text Available Numerous studies are currently being carried out on personalized services based on data analysis to find and provide valuable information amid information overload. Furthermore, the number of studies on data analysis of teaching-learning activities for personalized services in the field of teaching-learning is increasing, too. This paper proposes a learning style recency-frequency-durability (LS-RFD) model for quantified analysis of the level of activities of learners, to provide the elements of teaching-learning activities according to the learning style of the learner among various parameters for personalized service. This is to measure preferences as to teaching-learning activity according to recency, frequency and durability of such activities. Based on the results, user characteristics can be classified into groups for teaching-learning activity by categorizing the level of preference and activity of the learner.
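
    The recency-frequency-durability idea can be illustrated with a small activity log: recency is the time since the learner last used an activity type, frequency is how often it was used, and durability is how long. The pandas sketch below uses hypothetical log fields and is not the authors' LS-RFD implementation.

```python
import pandas as pd

# Hypothetical activity log: one row per learner interaction with an activity type.
log = pd.DataFrame({
    "learner":  ["a", "a", "a", "b", "b", "c"],
    "activity": ["video", "quiz", "video", "forum", "video", "quiz"],
    "start":    pd.to_datetime(["2015-03-01 10:00", "2015-03-02 09:00", "2015-03-05 11:00",
                                "2015-03-01 08:00", "2015-03-04 16:00", "2015-03-03 12:00"]),
    "minutes":  [12, 8, 20, 5, 15, 9],
})
now = pd.Timestamp("2015-03-06")

# Recency (days since last use), frequency (count) and durability (total minutes)
# per learner and activity type.
rfd = (log.groupby(["learner", "activity"])
          .agg(recency_days=("start", lambda s: (now - s.max()).days),
               frequency=("start", "size"),
               durability_min=("minutes", "sum"))
          .reset_index())
print(rfd)
```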

  20. Quantitative seafloor characterization using angular backscatter data of the multi-beam echo-sounding system - Use of models and model free techniques

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.

    , International Conference on Coastal and Ocean Technology, pp. 293-300. Bishwajit Chakraborty, National Institute... of the seafloor features, including textural parameters [1]. Presently available multi-beam echo-sounding techniques can provide bathymetric data with higher coverage, due to the use of faster, high-resolution signal processing techniques employed in the beam...

  1. Content-based analysis and indexing of sports video

    Science.gov (United States)

    Luo, Ming; Bai, Xuesheng; Xu, Guang-you

    2001-12-01

    An explosion of on-line image and video data in digital form is already well underway. With the exponential rise in interactive information exploration and dissemination through the World-Wide Web, the major inhibitors of rapid access to on-line video data are the management of capture and storage, and content-based intelligent search and indexing techniques. This paper proposes an approach for content-based analysis and event-based indexing of sports video. It includes a novel method to organize shots - classifying shots as close shots and far shots, an original idea of blur extent-based event detection, and an innovative local mutation-based algorithm for caption detection and retrieval. Results on extensive real TV programs demonstrate the applicability of our approach.
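
    The paper's blur-extent event detector is not specified in the record; a common stand-in for a per-frame blur measure is the variance of the Laplacian, which drops sharply for defocused or motion-blurred frames. The OpenCV sketch below uses that substitute measure with a hypothetical threshold and a placeholder video path.

```python
import cv2

def blur_extent(frame_bgr):
    """Variance of the Laplacian: low values indicate a blurred frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def detect_blur_events(video_path, threshold=50.0):
    """Return the indices of frames whose sharpness drops below the threshold."""
    cap = cv2.VideoCapture(video_path)
    events, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if blur_extent(frame) < threshold:
            events.append(idx)
        idx += 1
    cap.release()
    return events

# Usage (path is a placeholder): print(detect_blur_events("match.mp4"))
```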

  2. Laser-Based Lighting: Experimental Analysis and Perspectives.

    Science.gov (United States)

    Trivellin, Nicola; Yushchenko, Maksym; Buffolo, Matteo; De Santi, Carlo; Meneghini, Matteo; Meneghesso, Gaudenzio; Zanoni, Enrico

    2017-10-11

    This paper presents an extensive analysis of the operating principles, theoretical background, advantages and limitations of laser-based lighting systems. In the first part of the paper we discuss the main advantages and issues of laser-based lighting, and present a comparison with conventional LED-lighting technology. In the second part of the paper, we present original experimental data on the stability and reliability of phosphor layers for laser lighting, based on high light-intensity and high-temperature degradation tests. In the third part of the paper (for the first time) we present a detailed comparison between three different solutions for laser lighting, based on (i) transmissive phosphor layers; (ii) a reflective/angled phosphor layer; and (iii) a parabolic reflector, by discussing the advantages and drawbacks of each approach. The results presented within this paper can be used as a guideline for the development of advanced lighting systems based on laser diodes.

  3. Laser-Based Lighting: Experimental Analysis and Perspectives

    Directory of Open Access Journals (Sweden)

    Nicola Trivellin

    2017-10-01

    Full Text Available This paper presents an extensive analysis of the operating principles, theoretical background, advantages and limitations of laser-based lighting systems. In the first part of the paper we discuss the main advantages and issues of laser-based lighting, and present a comparison with conventional LED-lighting technology. In the second part of the paper, we present original experimental data on the stability and reliability of phosphor layers for laser lighting, based on high light-intensity and high-temperature degradation tests. In the third part of the paper (for the first time) we present a detailed comparison between three different solutions for laser lighting, based on (i) transmissive phosphor layers; (ii) a reflective/angled phosphor layer; and (iii) a parabolic reflector, by discussing the advantages and drawbacks of each approach. The results presented within this paper can be used as a guideline for the development of advanced lighting systems based on laser diodes.

  4. Environmental Assessment: General Plan-Based Environmental Impact Analysis Process, Laughlin Air Force Base

    Science.gov (United States)

    2007-05-01

    GENERAL PLAN-BASED ENVIRONMENTAL IMPACT ANALYSIS PROCESS, LAUGHLIN AIR FORCE BASE, TEXAS. AGENCY: 47th Flying Training Wing (FTW), Laughlin Air Force Base (AFB), Texas ... in their decision-making process. The President's Council on Environmental Quality (CEQ) has issued regulations to implement NEPA that include ... Executive Order (EO) 12898, Federal Actions to Address Environmental Justice in Minority Populations and Low-Income Populations, was issued by the President on

  5. Effects of Interventions Based in Behavior Analysis on Motor Skill Acquisition: A Meta-Analysis

    Science.gov (United States)

    Alstot, Andrew E.; Kang, Minsoo; Alstot, Crystal D.

    2013-01-01

    Techniques based in applied behavior analysis (ABA) have been shown to be useful across a variety of settings to improve numerous behaviors. Specifically within physical activity settings, several studies have examined the effect of interventions based in ABA on a variety of motor skills, but the overall effects of these interventions are unknown.…

  6. Intelligent data analysis based on rough correlativity matrix

    Science.gov (United States)

    Geng, Zhiqiang; Zhu, Qunxiong

    2003-09-01

    This paper proposes a new data analysis method based on rough sets using a rough correlativity matrix. In rough set theory, a table called an information system or database is used as a special kind of formal language to represent knowledge, and a rough correlativity matrix (RCM) can be seen as an internal representation of equivalence relations. Furthermore, this paper provides a new heuristic attribute reduction algorithm based on matrix computation, using matrix operations in place of set-relation computations. Finally, the paper adopts the information transition matrix (ITM) of information theory to represent certain or uncertain decision rules based on probability theory: the information matrix composed of certainty factors gives the degree of belief of the decision rules, while the "inverse" ITM composed of coverage factors gives the interpretation of the decision rules. The results of an instance analysis show that this is an efficient and feasible method for dealing with decision information tables.

  7. Geographical classification of Epimedium based on HPLC fingerprint analysis combined with multi-ingredients quantitative analysis.

    Science.gov (United States)

    Xu, Ning; Zhou, Guofu; Li, Xiaojuan; Lu, Heng; Meng, Fanyun; Zhai, Huaqiang

    2017-05-01

    A reliable and comprehensive method for identifying the origin and assessing the quality of Epimedium has been developed. The method is based on analysis of HPLC fingerprints, combined with similarity analysis, hierarchical cluster analysis (HCA), principal component analysis (PCA) and multi-ingredient quantitative analysis. Nineteen batches of Epimedium, collected from different areas in the western regions of China, were used to establish the fingerprints and 18 peaks were selected for the analysis. Similarity analysis, HCA and PCA all classified the 19 areas into three groups. Simultaneous quantification of the five major bioactive ingredients in the Epimedium samples was also carried out to confirm the consistency of the quality tests. These methods were successfully used to identify the geographical origin of the Epimedium samples and to evaluate their quality. Copyright © 2016 John Wiley & Sons, Ltd.
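
    The chemometric chain described here (similarity to a reference fingerprint, hierarchical clustering, PCA) is easy to reproduce in outline once the peak-area matrix is available. The sketch below runs on a random stand-in matrix of 19 batches x 18 peaks; it illustrates the general workflow, not the paper's validated procedure.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist
from sklearn.decomposition import PCA

# Hypothetical peak-area matrix: 19 batches x 18 common fingerprint peaks.
rng = np.random.default_rng(0)
peaks = rng.gamma(shape=2.0, scale=1.0, size=(19, 18))

# Similarity analysis: cosine similarity of each batch to the mean fingerprint.
reference = peaks.mean(axis=0)
similarity = peaks @ reference / (np.linalg.norm(peaks, axis=1) * np.linalg.norm(reference))

# Hierarchical cluster analysis into three groups (Ward linkage).
groups = fcluster(linkage(pdist(peaks), method="ward"), t=3, criterion="maxclust")

# PCA scores for visual inspection of the grouping.
scores = PCA(n_components=2).fit_transform(peaks)
print(similarity.round(3), groups, scores[:3].round(2), sep="\n")
```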

  8. Thermodynamics-based Metabolite Sensitivity Analysis in metabolic networks.

    Science.gov (United States)

    Kiparissides, A; Hatzimanikatis, V

    2017-01-01

    The increasing availability of large metabolomics datasets enhances the need for computational methodologies that can organize the data in a way that can lead to the inference of meaningful relationships. Knowledge of the metabolic state of a cell and how it responds to various stimuli and extracellular conditions can offer significant insight in the regulatory functions and how to manipulate them. Constraint based methods, such as Flux Balance Analysis (FBA) and Thermodynamics-based flux analysis (TFA), are commonly used to estimate the flow of metabolites through genome-wide metabolic networks, making it possible to identify the ranges of flux values that are consistent with the studied physiological and thermodynamic conditions. However, unless key intracellular fluxes and metabolite concentrations are known, constraint-based models lead to underdetermined problem formulations. This lack of information propagates as uncertainty in the estimation of fluxes and basic reaction properties such as the determination of reaction directionalities. Therefore, knowledge of which metabolites, if measured, would contribute the most to reducing this uncertainty can significantly improve our ability to define the internal state of the cell. In the present work we combine constraint based modeling, Design of Experiments (DoE) and Global Sensitivity Analysis (GSA) into the Thermodynamics-based Metabolite Sensitivity Analysis (TMSA) method. TMSA ranks metabolites comprising a metabolic network based on their ability to constrain the gamut of possible solutions to a limited, thermodynamically consistent set of internal states. TMSA is modular and can be applied to a single reaction, a metabolic pathway or an entire metabolic network. This is, to our knowledge, the first attempt to use metabolic modeling in order to provide a significance ranking of metabolites to guide experimental measurements. Copyright © 2016 International Metabolic Engineering Society. Published by Elsevier
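
    As background for the constraint-based part of TMSA, a flux balance analysis problem is just a linear program: maximise an objective flux subject to the steady-state constraint S v = 0 and flux bounds. The toy network below is invented for illustration; TFA and TMSA additionally impose thermodynamic constraints that are not shown here.

```python
import numpy as np
from scipy.optimize import linprog

# Tiny toy network: A_ext -> A -> B -> biomass precursor -> biomass.
# Stoichiometric matrix S (rows = metabolites, columns = reactions):
#              v_uptake  v1   v2   v_biomass
S = np.array([[ 1,      -1,   0,    0],    # A
              [ 0,       1,  -1,    0],    # B
              [ 0,       0,   1,   -1]])   # precursor consumed by biomass
lb = [0, 0, 0, 0]
ub = [10, 1000, 1000, 1000]                # uptake capped at 10 flux units

# FBA: maximise the biomass flux subject to S v = 0 and the bounds
# (linprog minimises, so the objective coefficient is negated).
c = np.zeros(4)
c[3] = -1.0
res = linprog(c, A_eq=S, b_eq=np.zeros(3), bounds=list(zip(lb, ub)), method="highs")
print("optimal biomass flux:", -res.fun, "flux vector:", res.x.round(3))
```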

  9. Kronecker Algebra-based Deadlock Analysis for Railway Systems

    Directory of Open Access Journals (Sweden)

    Robert Mittermayr

    2012-09-01

    Full Text Available Deadlock analysis for railway systems differs in several aspects from deadlock analysis in computer science. While the problem of deadlock analysis for standard computer systems is well-understood, multi-threaded embedded computer systems pose new challenges. A novel approach in this area can easily be applied to deadlock analysis in the domain of railway systems. The approach is based on Kronecker algebra. A lazy implementation of the matrix operations even allows analysing exponentially sized systems in a very efficient manner. The running time of the algorithm does not depend on the problem size but on the size of the solution. While other approaches suffer from the fact that additional constraints make the problem and its solution harder, our approach delivers its results faster if constraints are added. In addition, our approach is complete and sound for railway systems, i.e., it generates neither false positives nor false negatives.
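
    The core matrix operation can be illustrated with plain NumPy: the Kronecker product models the synchronous composition of two automata, the Kronecker sum models their interleaving, and composite states without outgoing edges are deadlock candidates. The real approach works with labelled matrices over a semiring and a lazy implementation, which this small dense sketch does not attempt.

```python
import numpy as np

# Adjacency matrices of two toy automata (e.g. simplified train route graphs).
A = np.array([[0, 1],
              [1, 0]])
B = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]])

# Kronecker product: both automata move together (synchronous composition).
kron_product = np.kron(A, B)
# Kronecker sum: exactly one automaton moves per step (interleaving composition).
kron_sum = np.kron(A, np.eye(B.shape[0], dtype=int)) + np.kron(np.eye(A.shape[0], dtype=int), B)

# A composite state with no outgoing edge would indicate a potential deadlock.
deadlocks = np.where(kron_sum.sum(axis=1) == 0)[0]
print("composite states:", kron_sum.shape[0], "possible deadlock states:", deadlocks)
```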

  10. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore, complements other approaches that use large corpus (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation that uses OWL and OWL-S. Lessons learnt can be transmitted to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  11. A computational network analysis based on targets of antipsychotic agents.

    Science.gov (United States)

    Gao, Lei; Feng, Shuo; Liu, Zhao-Yuan; Wang, Jiu-Qiang; Qi, Ke-Ke; Wang, Kai

    2018-03-01

    Currently, numerous antipsychotic agents have been developed for the pharmacological treatment of schizophrenia. However, the molecular mechanisms underlying the multiple targets of antipsychotics are yet to be explored. In this study we performed a computational network analysis based on the targets of antipsychotic agents. We retrieved a total of 96 targets from 56 antipsychotic agents. By expression enrichment analysis, we identified that the expression of antipsychotic target genes was significantly enriched in liver, brain, blood and corpus striatum. By protein-protein interaction (PPI) network analysis, a PPI network with 77 significantly interconnected target genes was generated. By historeceptomics analysis, significant brain-region-specific target-drug interactions were identified in targets of dopamine receptors (DRD1-olanzapine in caudate nucleus and pons; P-value ...) ... antipsychotic targets and insights into the molecular mechanism of antipsychotic agents. Copyright © 2017 Elsevier B.V. All rights reserved.
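
    The PPI-network step of such an analysis can be sketched with networkx: build a graph over the target genes, keep the connected core, and rank targets by centrality. The edge list below is hypothetical and far smaller than the 77-gene network reported in the record.

```python
import networkx as nx

# Hypothetical protein-protein interactions among a few antipsychotic target genes.
edges = [("DRD1", "DRD2"), ("DRD2", "HTR2A"), ("HTR2A", "HTR1A"),
         ("DRD2", "COMT"), ("COMT", "MAOA"), ("DRD1", "ADRA1A")]
g = nx.Graph(edges)

# Keep only the largest interconnected component and rank targets by degree centrality.
giant = g.subgraph(max(nx.connected_components(g), key=len))
centrality = nx.degree_centrality(giant)
print(sorted(centrality.items(), key=lambda kv: -kv[1]))
```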

  12. Applications of methacrylate-based monolithic supports for speciation analysis.

    Science.gov (United States)

    Scancar, Janez; Milacic, Radmila

    2009-08-01

    Liquid chromatography combined with element specific detection is commonly applied in speciation analysis. In these analyses, to obtain reliable data chemical species should not be transformed. To preserve chemical species during the separation step, fast chromatographic procedures and mild separation conditions are required. Monolithic supports that enable rapid chromatographic separations have rarely been used in speciation analysis. Methacrylate-based anion- and cation-exchange monolithic supports offer separation of charged chemical species of elements and can be used as a complementary tool to particle-packed liquid chromatographic columns. The present paper presents an overview of successful applications of methacrylate-based monolithic supports in speciation of zinc (Zn), chromium (Cr), and aluminium (Al) in environmental, occupational health, and biological samples. Measures of analytical performance of convective interaction media (CIM) monolithic chromatographic supports, namely selectivity, sensitivity, and time of analysis, are compared to those of particle-packed columns. The potential of CIM monolithic chromatography in speciation analysis is critically discussed. Direct comparison of the experimental data in speciation of elements by ion-exchange monolithic and fast protein liquid chromatography is reported for the first time. Finally, some recommendations are given for further investigations of the potential of monolithic chromatography and its implementations in different fields of element speciation analysis.

  13. Barcode server: a visualization-based genome analysis system.

    Directory of Open Access Journals (Sweden)

    Fenglou Mao

    Full Text Available We have previously developed a computational method for representing a genome as a barcode image, which makes various genomic features visually apparent. We have demonstrated that this visual capability has made some challenging genome analysis problems relatively easy to solve. We have applied this capability to a number of challenging problems, including (a) identification of horizontally transferred genes, (b) identification of genomic islands with special properties and (c) binning of metagenomic sequences, and achieved highly encouraging results. These application results inspired us to develop this barcode-based genome analysis server for public service, which supports the following capabilities: (a) calculation of the k-mer based barcode image for a provided DNA sequence; (b) detection of sequence fragments in a given genome with distinct barcodes from those of the majority of the genome; (c) clustering of provided DNA sequences into groups having similar barcodes; and (d) homology-based search using Blast against a genome database for any selected genomic regions deemed to have interesting barcodes. The barcode server provides a job management capability, allowing processing of a large number of analysis jobs for barcode-based comparative genome analyses. The barcode server is accessible at http://csbl1.bmb.uga.edu/Barcode.
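
    The k-mer based barcode underlying the server can be sketched as a per-window k-mer frequency profile; windows whose profiles deviate strongly from the genome-wide average are the candidates for horizontal transfer, genomic islands or binning. The window size and k below are illustrative defaults, not the server's settings.

```python
from collections import Counter
from itertools import product

def kmer_barcode(sequence, k=4, window=5000):
    """k-mer frequency profile per window; the rows form the 'barcode' image."""
    kmers = ["".join(p) for p in product("ACGT", repeat=k)]
    rows = []
    for start in range(0, max(len(sequence) - window + 1, 1), window):
        chunk = sequence[start:start + window]
        counts = Counter(chunk[i:i + k] for i in range(len(chunk) - k + 1))
        total = sum(counts.values()) or 1
        rows.append([counts.get(km, 0) / total for km in kmers])
    return rows  # one frequency vector (barcode row) per genomic window

# Tiny demonstration on a synthetic sequence.
rows = kmer_barcode("ACGT" * 3000, k=3, window=4000)
print(len(rows), "windows,", len(rows[0]), "k-mer frequencies per window")
```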

  14. Data analysis using a data base driven graphics animation system

    International Nuclear Information System (INIS)

    Schwieder, D.H.; Stewart, H.D.; Curtis, J.N.

    1985-01-01

    A graphics animation system has been developed at the Idaho National Engineering Laboratory (INEL) to assist engineers in the analysis of large amounts of time series data. Most prior attempts at computer animation of data involve the development of large and expensive problem-specific systems. This paper discusses a generalized interactive computer animation system designed to be used in a wide variety of data analysis applications. By using relational data base storage of graphics and control information, considerable flexibility in design and development of animated displays is achieved

  15. A microprocessor based picture analysis system for automatic track measurements

    International Nuclear Information System (INIS)

    Heinrich, W.; Trakowski, W.; Beer, J.; Schucht, R.

    1982-01-01

    In the last few years picture analysis became a powerful technique for measurements of nuclear tracks in plastic detectors. For this purpose rather expensive commercial systems are available. Two inexpensive microprocessor based systems with different resolution were developed. The video pictures of particles seen through a microscope are digitized in real time and the picture analysis is done by software. The microscopes are equipped with stages driven by stepping motors, which are controlled by separate microprocessors. A PDP 11/03 supervises the operation of all microprocessors and stores the measured data on its mass storage devices. (author)

  16. FARO base case post-test analysis by COMETA code

    Energy Technology Data Exchange (ETDEWEB)

    Annunziato, A.; Addabbo, C. [Joint Research Centre, Ispra (Italy)

    1995-09-01

    The paper analyzes the COMETA (Core Melt Thermal-Hydraulic Analysis) post-test calculations of FARO Test L-11, the so-called Base Case Test. The FARO Facility, located at JRC Ispra, is used to simulate the consequences of severe accidents in nuclear power plants under a variety of conditions. The COMETA code has a six-equation two-phase flow field and a three-phase corium field: the jet, the droplets and the fused-debris bed. The analysis showed that the code is able to pick up all the major phenomena occurring during the pre-mixing phase of the fuel-coolant interaction.

  17. Model-based analysis and simulation of regenerative heat wheel

    DEFF Research Database (Denmark)

    Wu, Zhuang; Melnik, Roderick V. N.; Borup, F.

    2006-01-01

    The rotary regenerator (also called the heat wheel) is an important component of energy-intensive sectors, which is used in many heat recovery systems. In this paper, a model-based analysis of a rotary regenerator is carried out with a major emphasis given to the development and implementation of mathematical models for the thermal analysis of the fluid and wheel matrix. The effect of heat conduction in the direction of the fluid flow is taken into account, as is the influence of variations in the rotating speed of the wheel as well as other characteristics (ambient temperature, airflow and geometric size)...

  18. Fatigue Analysis of Automobile Control Arm Based on Ncode

    Directory of Open Access Journals (Sweden)

    Ren Huanmei

    2016-01-01

    Full Text Available In order to improve the durability of the vehicle chassis structure, fatigue analysis and optimization design of the lower control arm (LCA) were carried out. A finite element model was established. Using this model, the stress distribution and the location of lowest fatigue life of the control arm under fatigue load were calculated. Based on the results of the analysis, an optimization scheme according to the structural characteristics of the component was presented, and a solution to improve the fatigue life of the control arm was given. The research provides a reference for the engineering calculation and optimization of the fatigue life of chassis components.

  19. Parametric Design and Mechanical Analysis of Beams based on SINOVATION

    Science.gov (United States)

    Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.

    2017-07-01

    In engineering practice, engineers need to carry out complicated calculations when the loads on a beam are complex. These analysis and calculation processes take a lot of time and the results can be unreliable. Therefore, VS2005 and ADK were used to develop software for beam design, based on the 3D CAD software SINOVATION, in the C++ programming language. The software can perform the mechanical analysis and parameterized design of various types of beams and output the design report in HTML format. The efficiency and reliability of beam design are improved.

  20. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given.

  1. Noodle based analytical devices for cost effective green chemical analysis.

    Science.gov (United States)

    Kiwfo, Kanokwan; Wongwilai, Wasin; Paengnakorn, Pathinan; Boonmapa, Sasithorn; Sateanchok, Suphasinee; Grudpan, Kate

    2018-05-01

    Noodle based analytical devices are proposed for cost effective green chemical analysis. Two noodle based analytical platforms have been examined. Conditions for flow with laminar behavior could be established. Detection may be via a webcam camera or a flatbed scanner. Acid-base reactions were chosen as a model study. The assays of acetic acid and sodium hydroxide were investigated. Apart from bromothymol blue, a simple aqueous extract of butterfly pea flower was used as a natural reagent. Another model was the assay of copper (Cu2+), which was based on the redox reaction of copper (Cu2+) with iodide to produce tri-iodide, which forms a brown/black product with the starch already present in the noodle platform. A demonstration applying the noodle platforms to real samples was made. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. YersiniaBase: a genomic resource and analysis platform for comparative analysis of Yersinia.

    Science.gov (United States)

    Tan, Shi Yang; Dutta, Avirup; Jakubovics, Nicholas S; Ang, Mia Yang; Siow, Cheuk Chuen; Mutha, Naresh Vr; Heydari, Hamed; Wee, Wei Yee; Wong, Guat Jah; Choo, Siew Woh

    2015-01-16

    Yersinia is a genus of Gram-negative bacteria that includes serious pathogens such as Yersinia pestis, which causes plague, Yersinia pseudotuberculosis and Yersinia enterocolitica. The remaining species are generally considered non-pathogenic to humans, although there is evidence that at least some of these species can cause occasional infections using mechanisms distinct from those of the more pathogenic species. With the advances in sequencing technologies, many genomes of Yersinia have been sequenced. However, there is currently no specialized platform to hold the rapidly-growing Yersinia genomic data and to provide analysis tools, particularly for comparative analyses, which are required to provide improved insights into their biology, evolution and pathogenicity. To facilitate ongoing and future research on Yersinia, especially the species generally considered non-pathogenic, a well-defined repository and analysis platform is needed to hold the Yersinia genomic data and analysis tools for the Yersinia research community. Hence, we have developed YersiniaBase, a robust and user-friendly Yersinia resource and analysis platform for the analysis of Yersinia genomic data. YersiniaBase has a total of twelve species and 232 genome sequences, of which the majority are Yersinia pestis. In order to smooth the process of searching genomic data in a large database, we implemented an Asynchronous JavaScript and XML (AJAX)-based real-time searching system in YersiniaBase. Besides incorporating existing tools, which include the JavaScript-based genome browser (JBrowse) and the Basic Local Alignment Search Tool (BLAST), YersiniaBase also has in-house developed tools: (1) Pairwise Genome Comparison tool (PGC) for comparing two user-selected genomes; (2) Pathogenomics Profiling Tool (PathoProT) for comparative pathogenomics analysis of Yersinia genomes; (3) YersiniaTree for constructing phylogenetic trees of Yersinia. We ran analyses based on the tools and genomic data in YersiniaBase and the

  3. Dynamic Chest Image Analysis: Model-Based Perfusion Analysis in Dynamic Pulmonary Imaging

    Directory of Open Access Journals (Sweden)

    Kiuru Aaro

    2003-01-01

    Full Text Available The "Dynamic Chest Image Analysis" project aims to develop model-based computer analysis and visualization methods for showing focal and general abnormalities of lung ventilation and perfusion based on a sequence of digital chest fluoroscopy frames collected with the dynamic pulmonary imaging technique. We have proposed and evaluated a multiresolutional method with an explicit ventilation model for ventilation analysis. This paper presents a new model-based method for pulmonary perfusion analysis. According to perfusion properties, we first devise a novel mathematical function to form a perfusion model. A simple yet accurate approach is further introduced to extract cardiac systolic and diastolic phases from the heart, so that this cardiac information may be utilized to accelerate the perfusion analysis and improve its sensitivity in detecting pulmonary perfusion abnormalities. This makes perfusion analysis not only fast but also robust in computation; consequently, perfusion analysis becomes computationally feasible without using contrast media. Our clinical case studies with 52 patients show that this technique is effective for pulmonary embolism even without using contrast media, demonstrating consistent correlations with computed tomography (CT) and nuclear medicine (NM) studies. This fluoroscopic examination takes only about 2 seconds for a perfusion study with only a low radiation dose to the patient, involving no preparation, no radioactive isotopes, and no contrast media.

  4. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Fangqin Ren

    2013-02-01

    Full Text Available This work presents a heart sound biometric system based on marginal spectrum analysis, which is a new feature extraction technique for identification purposes. This heart sound identification system is comprised of signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients result in a significant increase in the recognition rate of 94.40% compared with that of the traditional Fourier spectrum (84.32%) based on a database of 280 heart sounds from 40 participants.

  5. Heart sound biometric system based on marginal spectrum analysis.

    Science.gov (United States)

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-02-18

    This work presents a heart sound biometric system based on marginal spectrum analysis, which is a new feature extraction technique for identification purposes. This heart sound identification system is comprised of signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients result in a significant increase in the recognition rate of 94.40% compared with that of the traditional Fourier spectrum (84.32%) based on a database of 280 heart sounds from 40 participants. 

  6. Heart Sound Biometric System Based on Marginal Spectrum Analysis

    Science.gov (United States)

    Zhao, Zhidong; Shen, Qinqin; Ren, Fangqin

    2013-01-01

    This work presents a heart sound biometric system based on marginal spectrum analysis, which is a new feature extraction technique for identification purposes. This heart sound identification system is comprised of signal acquisition, pre-processing, feature extraction, training, and identification. Experiments on the selection of the optimal values for the system parameters are conducted. The results indicate that the new spectrum coefficients result in a significant increase in the recognition rate of 94.40% compared with that of the traditional Fourier spectrum (84.32%) based on a database of 280 heart sounds from 40 participants. PMID:23429515
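
    The marginal spectrum used as the feature in these records comes from the Hilbert(-Huang) framework: instantaneous amplitude is accumulated over instantaneous frequency. The sketch below applies the Hilbert transform directly to the raw signal for brevity; a faithful implementation would first decompose the heart sound into IMFs with EMD and sum their contributions.

```python
import numpy as np
from scipy.signal import hilbert

def marginal_spectrum(x, fs, n_bins=128):
    """Accumulate instantaneous amplitude over instantaneous frequency (simplified)."""
    analytic = hilbert(x)
    amplitude = np.abs(analytic)[1:]
    inst_freq = np.diff(np.unwrap(np.angle(analytic))) * fs / (2 * np.pi)
    bins = np.linspace(0, fs / 2, n_bins + 1)
    spectrum, _ = np.histogram(inst_freq, bins=bins, weights=amplitude)
    return 0.5 * (bins[:-1] + bins[1:]), spectrum

# Toy "heart sound": two superposed tones; the normalised marginal spectrum would
# then be fed to the training and identification stages.
fs = 2000.0
t = np.arange(0, 1.0, 1 / fs)
x = np.sin(2 * np.pi * 30 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)
freqs, ms = marginal_spectrum(x, fs)
print("marginal spectrum peak near %.1f Hz" % freqs[np.argmax(ms)])
```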

  7. Transit Traffic Analysis Zone Delineating Method Based on Thiessen Polygon

    Directory of Open Access Journals (Sweden)

    Shuwei Wang

    2014-04-01

    Full Text Available A green transportation system composed of rail transit, buses and bicycles could play a significant role in alleviating traffic congestion. However, the inaccuracy of current transit ridership forecasting methods is having a negative impact on the development of urban transit systems. Traffic Analysis Zone (TAZ) delineation is a fundamental and essential step in ridership forecasting, and the existing delineation methods in four-step models have some problems in reflecting the travel characteristics of urban transit. This paper aims to propose a Transit Traffic Analysis Zone (TTAZ) delineation method as a supplement to traditional TAZs in transit service analysis. The deficiencies of current TAZ delineation methods were analyzed, and the requirements of a TTAZ were summarized. Considering these requirements, Thiessen polygons were introduced into TTAZ delineation. In order to validate its feasibility, Beijing was then taken as an example to delineate TTAZs, followed by a spatial analysis of office buildings within a TTAZ and transit station departure passengers. The analysis result shows that TTAZs based on Thiessen polygons can reflect transit travel characteristics and merit further in-depth research.
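
    The geometric core of the method is the Thiessen (Voronoi) construction around transit stations, under which every location belongs to the zone of its nearest station. The SciPy sketch below uses made-up station and building coordinates to show both the polygon construction and the nearest-station assignment.

```python
import numpy as np
from scipy.spatial import Voronoi, cKDTree

# Hypothetical transit station coordinates (projected, in metres).
stations = np.array([[0, 0], [1200, 300], [400, 1500], [1800, 1600], [900, 900]], float)

# Thiessen (Voronoi) polygons around the stations define the TTAZ boundaries;
# vor.vertices and vor.regions hold the polygon geometry.
vor = Voronoi(stations)
print("stations:", len(stations), "Voronoi regions:", len(vor.regions))

# Assigning any location (e.g. an office building) to its TTAZ is a
# nearest-station query, which is exactly what the Thiessen construction encodes.
buildings = np.array([[100, 200], [1500, 1400], [950, 850]], float)
_, ttaz_index = cKDTree(stations).query(buildings)
print("building -> TTAZ (station index):", ttaz_index)
```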

  8. Open access for ALICE analysis based on virtualization technology

    International Nuclear Information System (INIS)

    Buncic, P; Gheata, M; Schutz, Y

    2015-01-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system. (paper)

  9. Open access for ALICE analysis based on virtualization technology

    Science.gov (United States)

    Buncic, P.; Gheata, M.; Schutz, Y.

    2015-12-01

    Open access is one of the important leverages for long-term data preservation for a HEP experiment. To guarantee the usability of data analysis tools beyond the experiment lifetime it is crucial that third party users from the scientific community have access to the data and associated software. The ALICE Collaboration has developed a layer of lightweight components built on top of virtualization technology to hide the complexity and details of the experiment-specific software. Users can perform basic analysis tasks within CernVM, a lightweight generic virtual machine, paired with an ALICE specific contextualization. Once the virtual machine is launched, a graphical user interface is automatically started without any additional configuration. This interface allows downloading the base ALICE analysis software and running a set of ALICE analysis modules. Currently the available tools include fully documented tutorials for ALICE analysis, such as the measurement of strange particle production or the nuclear modification factor in Pb-Pb collisions. The interface can be easily extended to include an arbitrary number of additional analysis modules. We present the current status of the tools used by ALICE through the CERN open access portal, and the plans for future extensions of this system.

  10. Job optimization in ATLAS TAG-based distributed analysis

    Science.gov (United States)

    Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.

    2010-04-01

    The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.

  11. HIV/AIDS counseling: analysis based on Paulo Freire.

    Science.gov (United States)

    Miranda, Karla Corrêa Lima; Barroso, Maria Grasiela Teixeira

    2007-01-01

    The study aimed to investigate the strategies health professionals use in HIV/AIDS counseling. This is a qualitative study, based on Paulo Freire's theory and practice. Bardin's content analysis was used as the analysis technique. For the studied group, counseling is focused on cognition, although new concepts permeating this subject are emerging. The main difficulties in counseling are related to the clients and the institution. The main facilitating factor is related to the team, which according to the group has a good relationship. Counseling represents a moment of distress, especially because it brings up existential questions for the counselor. It can be inferred that counseling is a special moment, but it does not yet constitute an educational moment. To reach this goal, a counseling methodology is proposed, based on Paulo Freire's principles and concepts.

  12. Controller design based on μ analysis and PSO algorithm.

    Science.gov (United States)

    Lari, Ali; Khosravi, Alireza; Rajabi, Farshad

    2014-03-01

    In this paper an evolutionary algorithm is employed to address the controller design problem based on μ analysis. Conventional solutions to the μ synthesis problem, such as the D-K iteration method, often lead to high-order, impractical controllers. In the proposed approach, a constrained optimization problem based on μ analysis is defined and an evolutionary approach is then employed to solve it. The goal is to achieve a more practical controller of lower order. A benchmark system, the two-tank system, is considered to evaluate the performance of the proposed approach. Simulation results show that the proposed controller performs more effectively than the high-order H(∞) controller and has responses close to those of the high-order D-K iteration controller, the common solution to the μ synthesis problem. © 2013 ISA. Published by ISA. All rights reserved.
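
    The evolutionary optimiser referred to in the abstract is particle swarm optimisation; a bare-bones PSO over box bounds is shown below. The cost function here is a stand-in quadratic; in the paper's setting it would evaluate a μ-analysis-based closed-loop criterion for the candidate controller parameters.

```python
import numpy as np

def pso(cost, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimiser over box bounds [(lo, hi), ...]."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[pbest_f.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()]
    return gbest, pbest_f.min()

# Stand-in cost over hypothetical PID gains [kp, ki, kd]; a real run would
# evaluate the mu upper bound (or an H-infinity norm) of the closed loop.
cost = lambda k: (k[0] - 2.0) ** 2 + (k[1] - 0.5) ** 2 + (k[2] - 0.1) ** 2
gains, best = pso(cost, bounds=[(0, 10), (0, 5), (0, 1)])
print("best gains:", gains.round(3), "cost:", round(best, 4))
```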

  13. Fusing Quantitative Requirements Analysis with Model-based Systems Engineering

    Science.gov (United States)

    Cornford, Steven L.; Feather, Martin S.; Heron, Vance A.; Jenkins, J. Steven

    2006-01-01

    A vision is presented for fusing quantitative requirements analysis with model-based systems engineering. This vision draws upon and combines emergent themes in the engineering milieu. "Requirements engineering" provides means to explicitly represent requirements (both functional and non-functional) as constraints and preferences on acceptable solutions, and emphasizes early-lifecycle review, analysis and verification of design and development plans. "Design by shopping" emphasizes revealing the space of options available from which to choose (without presuming that all selection criteria have previously been elicited), and provides means to make understandable the range of choices and their ramifications. "Model-based engineering" emphasizes the goal of utilizing a formal representation of all aspects of system design, from development through operations, and provides powerful tool suites that support the practical application of these principles. A first step prototype towards this vision is described, embodying the key capabilities. Illustrations, implications, further challenges and opportunities are outlined.

  14. Image-Analysis Based on Seed Phenomics in Sesame

    Directory of Open Access Journals (Sweden)

    Prasad R.

    2014-10-01

    Full Text Available The seed coat (testa) structure of twenty-three cultivated (Sesamum indicum L.) and six wild sesame (S. occidentale Regel & Heer., S. mulayanum Nair, S. prostratum Retz., S. radiatum Schumach. & Thonn., S. angustifolium (Oliv.) Engl. and S. schinzianum Asch.) germplasm accessions was analyzed from digital and Scanning Electron Microscopy (SEM) images with dedicated software, using the descriptors for computer-based seed image analysis, to understand the diversity of seed morphometric traits, which can later be extended to screen and evaluate improved genotypes of sesame. Seeds of wild sesame species could conveniently be distinguished from cultivated varieties based on shape and architectural analysis. Results indicated discrete 'cut-off' values to identify the definite shape and contour of seed for a desirable sesame genotype, along with the conventional practice of selecting lighter-colored testa.

  15. Fuzzy-Set Based Sentiment Analysis of Big Social Data

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    Computational approaches to social media analytics are largely limited to graph theoretical approaches such as social network analysis (SNA) informed by the social philosophical approach of relational sociology. There are no other unified modelling approaches to social data that integrate the conceptual, formal, software, analytical and empirical realms. In this paper, we first present and discuss a theory and conceptual model of social data. Second, we outline a formal model based on fuzzy set theory and describe the operational semantics of the formal model with a real-world social data example from Facebook. Third, we briefly present and discuss the Social Data Analytics Tool (SODATO) that realizes the conceptual model in software and provisions social data analysis based on the conceptual and formal models. Fourth, we use SODATO to fetch social data from the facebook wall of a global brand...

  16. GNSS Spoofing Detection Based on Signal Power Measurements: Statistical Analysis

    Directory of Open Access Journals (Sweden)

    V. Dehghanian

    2012-01-01

    Full Text Available A threat to GNSS receivers is posed by a spoofing transmitter that emulates authentic signals but with randomized code phase and Doppler values over a small range. Such spoofing signals can result in large navigational solution errors that are passed onto the unsuspecting user with potentially dire consequences. An effective spoofing detection technique is developed in this paper, based on signal power measurements and that can be readily applied to present consumer grade GNSS receivers with minimal firmware changes. An extensive statistical analysis is carried out based on formulating a multihypothesis detection problem. Expressions are developed to devise a set of thresholds required for signal detection and identification. The detection processing methods developed are further manipulated to exploit incidental antenna motion arising from user interaction with a GNSS handheld receiver to further enhance the detection performance of the proposed algorithm. The statistical analysis supports the effectiveness of the proposed spoofing detection technique under various multipath conditions.

  17. AB019. Erectile dysfunction: analysis based on age stratification

    OpenAIRE

    Kai, Zhang

    2015-01-01

    Objective: To establish the profile of erectile dysfunction in different age groups, and to analyse the effect of sildenafil based on age stratification. Subjects and Methods: From 2007 to 2008, a total of 4,507 men diagnosed with erectile dysfunction (ED) were enrolled from 46 centers in China; 4,039 of these patients were treated with sildenafil and asked to complete the Erectile Function domain of the International Index of Erectile Function, the Erection Hardness Score, and the Quality of Erection Qu...

  18. Semantic Indexing and Retrieval based on Formal Concept Analysis

    OpenAIRE

    Codocedo, Victor; Lykourentzou, Ioanna; Napoli, Amedeo

    2012-01-01

    Semantic indexing and retrieval has become an important research area, as the available amount of information on the Web is growing more and more. In this paper, we introduce an original approach to semantic indexing and retrieval based on Formal Concept Analysis. The concept lattice is used as a semantic index and we propose an original algorithm for traversing the lattice and answering user queries. This framework has been used and evaluated on song datasets.

  19. Research on Air Quality Evaluation based on Principal Component Analysis

    Science.gov (United States)

    Wang, Xing; Wang, Zilin; Guo, Min; Chen, Wei; Zhang, Huan

    2018-01-01

    Economic growth has led to environmental capacity decline and the deterioration of air quality. Air quality evaluation as a fundamental of environmental monitoring and air pollution control has become increasingly important. Based on the principal component analysis (PCA), this paper evaluates the air quality of a large city in Beijing-Tianjin-Hebei Area in recent 10 years and identifies influencing factors, in order to provide reference to air quality management and air pollution control.
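
    A minimal version of such a PCA-based evaluation standardises the pollutant matrix, extracts principal components, and ranks years by a variance-weighted composite score. The data below are random stand-ins; the pollutant list and weighting scheme are assumptions for illustration only.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical annual means of six pollutants over 10 years (rows = years).
rng = np.random.default_rng(0)
pollutants = ["PM2.5", "PM10", "SO2", "NO2", "CO", "O3"]
X = rng.normal(size=(10, len(pollutants)))

pca = PCA()
scores = pca.fit_transform(StandardScaler().fit_transform(X))

# Composite index: variance-weighted sum of component scores per year;
# the loadings (pca.components_) identify the dominant pollutants.
weights = pca.explained_variance_ratio_
composite = scores @ weights
print("explained variance ratios:", weights.round(2))
print("composite air-quality index by year:", composite.round(2))
```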

  20. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    Simola, K.

    2000-01-01

    This paper describes three successive studies on the ageing of protection automation in nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis and applying it to identify the most critical components from the ageing and safety points of view. The analyses also resulted in suggestions for improving data collection systems for the purpose of further ageing analyses. (author)

  1. Comparative analysis of some brushless motors based on catalog data

    Directory of Open Access Journals (Sweden)

    Anton Kalapish

    2005-10-01

    Full Text Available Brushless motors (polyphase AC induction, synchronous and brushless DC motors) have no alternatives in modern electric drives. They are highly efficient and offer a very wide range of speeds. The objective of this paper is to present some relations between the basic parameters and magnitudes of electrical machines. This allows a comparative analysis and a choice of motor to be made for each particular case, based not only on catalogue data or sale price.

  2. A LITERATURE SURVEY ON RECOMMENDATION SYSTEM BASED ON SENTIMENTAL ANALYSIS

    OpenAIRE

    Achin Jain; Vanita Jain; Nidhi Kapoor

    2016-01-01

    Recommender systems have grown to be a critical research subject since the emergence of the first paper on collaborative filtering in the 1990s. Although academic studies on recommender systems have expanded extensively over the last 10 years, there are deficiencies in the comprehensive literature review and classification of that research. Because of this, we reviewed articles on recommender systems and then classified those based on sentiment analysis. The articles are...

  3. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer based system for data acquisition, analysis and graphic displays relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  4. Knowledge-based analysis of functional impacts of mutations in ...

    Indian Academy of Sciences (India)

    Knowledge-based analysis of functional impacts of mutations in microRNA seed regions. Supplementary figure 1. Summary of predicted miRNA targets from TargetScan and miRanda for all the SNPs in miRNA seeds. Percentages of miRNA targets found from only TargetScan, from only miRanda and from both TargetScan ...

  5. Fuzzy-Set Based Sentiment Analysis of Big Social Data

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao; Hussain, Abid; Vatrapu, Ravi

    Computational approaches to social media analytics are largely limited to graph theoretical approaches such as social network analysis (SNA) informed by the social philosophical approach of relational sociology. There are no other unified modelling approaches to social data that integrate...... and actors on the facebook page. Sixth and last, we discuss the analytical method and conclude with a discussion of the benefits of set theoretical approaches based on the social philosophical approach of associational sociology....

  6. Diagnosis methods based on noise analysis at Cernavoda NPP, Romania

    International Nuclear Information System (INIS)

    Banica, Constantin; Dobrea, Dumitru

    1999-01-01

    This paper describes a recent noise analysis of the neutronic signals provided by in-core flux detectors (ICFD) and ion chambers (IC). This analysis is part of an on-going program developed for Unit 1 of the Cernavoda NPP, Romania, with the following main objectives: - prediction of detector failures based on pattern recognition; - determination of fast excursions from steady states; - detection of abnormal mechanical vibrations in the reactor core. The introduction briefly presents the reactor and the location of the ICFDs and ICs. The second section presents the data acquisition systems and their capabilities. The paper continues with a brief presentation of the numerical methods used for the analysis (section 3). The most significant results can be found in section 4, while section 5 draws conclusions about the useful information that can be obtained from the neutronic signals during high-power steady-state operation. (authors)

  7. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis was a prerequisite for future biofuels system modeling, and represented a valuable resource for researchers and policy makers.

  8. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed

  9. Facial Affect Recognition Using Regularized Discriminant Analysis-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yuan Shih

    2010-01-01

    Full Text Available This paper presents a novel and effective method for facial expression recognition, covering happiness, disgust, fear, anger, sadness, surprise, and the neutral state. The proposed method utilizes a regularized discriminant analysis-based boosting algorithm (RDAB) with effective Gabor features to recognize the facial expressions. An entropy criterion is applied to select the effective Gabor features, which are a subset of informative and nonredundant Gabor features. The proposed RDAB algorithm uses RDA as a learner in the boosting algorithm. RDA combines the strengths of linear discriminant analysis (LDA) and quadratic discriminant analysis (QDA): it solves the small sample size and ill-posed problems suffered by QDA and LDA through a regularization technique. Additionally, this study uses the particle swarm optimization (PSO) algorithm to estimate optimal parameters in RDA. Experiment results demonstrate that our approach can accurately and robustly recognize facial expressions.
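    As a hedged illustration of the regularization idea behind RDA (not the paper's full RDAB pipeline with Gabor features, boosting and PSO), the sketch below compares plain and regularized discriminant classifiers on synthetic data; scikit-learn's reg_param shrinks each class covariance estimate, which is one way to mitigate the small-sample and ill-posed estimation problems mentioned above.

```python
# Minimal sketch of covariance regularization in discriminant analysis
# (synthetic data; not the RDAB facial-expression pipeline from the paper).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import (
    LinearDiscriminantAnalysis, QuadraticDiscriminantAnalysis)

X, y = make_classification(n_samples=150, n_features=30, n_informative=10,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

models = {
    "LDA": LinearDiscriminantAnalysis(),
    "QDA (no regularization)": QuadraticDiscriminantAnalysis(reg_param=0.0),
    "QDA (regularized)": QuadraticDiscriminantAnalysis(reg_param=0.5),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: mean CV accuracy = {acc:.3f}")
```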

  10. Analysis Of Japans Economy Based On 2014 From Macroeconomics Prospects

    Directory of Open Access Journals (Sweden)

    Dr Mohammad Rafiqul Islam

    2015-02-01

    Full Text Available Abstract Japan is the world's third largest economy, but its current economic situation is not stable and growth is not increasing as expected. As of 2013 it was the world's second largest economy, but Japan lost that place to China in 2014 due to the slow growth of important economic indicators. Using the basic Keynesian model, we provide a detailed analysis of the short- and long-run impacts of these changes on Japan's real GDP, rate of unemployment and inflation rate. We demonstrate a detailed use of the 45-degree diagram (the AD-IA model) and other economic analysis of the macroeconomic principles that underlie the model and concepts. Finally, we recommend a change in fiscal policy to the government based on this analysis, considering what might be achieved with a fiscal policy response and the extent to which any impact on the stock of public debt might be a consideration.

  11. SVM-based glioma grading. Optimization by feature reduction analysis

    International Nuclear Information System (INIS)

    Zoellner, Frank G.; Schad, Lothar R.; Emblem, Kyrre E.; Harvard Medical School, Boston, MA; Oslo Univ. Hospital

    2012-01-01

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. The best classification accuracy was found using PCA, at 85% (sensitivity = 89%, specificity = 84%), when reducing the feature vector from 101 (100-bin rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided classification accuracy similar to literature values (~87%) while reducing the number of features by up to 98%. (orig.)
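    A minimal sketch of the feature-reduction-plus-SVM idea is given below, assuming synthetic stand-ins for the 100-bin rCBV histograms and patient age rather than the study's clinical data.

```python
# Hedged sketch of PCA feature reduction followed by SVM grading
# (synthetic features and labels; not the 101-patient data set).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_patients, n_bins = 101, 100
X_hist = rng.random((n_patients, n_bins))            # stand-in for 100-bin rCBV histograms
age = rng.integers(20, 80, size=(n_patients, 1))     # patient age as an extra feature
X = np.hstack([X_hist, age])
y = rng.integers(0, 2, size=n_patients)              # 0 = low grade, 1 = high grade (synthetic)

clf = make_pipeline(StandardScaler(),
                    PCA(n_components=3),             # reduce 101 features to 3 components
                    SVC(kernel="rbf", C=1.0))
print("mean CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```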

  12. Dynamic network-based epistasis analysis: Boolean examples

    Directory of Open Access Journals (Sweden)

    Eugenio eAzpeitia

    2011-12-01

    Full Text Available In this review we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the topologies of the gene interactions inferred. This has been acknowledged in several previous papers and reviews, but here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson, it is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets are becoming available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, access to relevant information and the correct gene interaction topologies are hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from the classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our review complements previous accounts, not
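    The toy sketch below illustrates, under invented regulatory rules, the kind of Boolean network reasoning discussed above: a repressor blocks an activator's effect on a target gene, and exhaustively simulating all input combinations recovers the blocking (epistatic) pattern. The genes and update rules are hypothetical, not drawn from the review.

```python
# Toy Boolean network sketch (hypothetical two-input regulatory rules):
# gene B is activated by A unless repressor R is on, which is the kind of
# blocking interaction classical epistasis describes.
from itertools import product

def step(state):
    a, r = state["A"], state["R"]
    return {
        "A": a,                 # A is an input, held constant
        "R": r,                 # R is an input, held constant
        "B": a and not r,       # B turns on only if A is on and R is off
    }

def attractor(state, max_steps=10):
    # Iterate the synchronous update until the state stops changing.
    for _ in range(max_steps):
        nxt = step(state)
        if nxt == state:
            return state
        state = nxt
    return state

# Exhaustively simulate all "genotypes" (on/off combinations of A and R).
for a, r in product([True, False], repeat=2):
    final = attractor({"A": a, "R": r, "B": False})
    print(f"A={'on' if a else 'off'}, R={'on' if r else 'off'} -> "
          f"B={'on' if final['B'] else 'off'}")
```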

  13. Dynamic Network-Based Epistasis Analysis: Boolean Examples

    Science.gov (United States)

    Azpeitia, Eugenio; Benítez, Mariana; Padilla-Longoria, Pablo; Espinosa-Soto, Carlos; Alvarez-Buylla, Elena R.

    2011-01-01

    In this article we focus on how the hierarchical and single-path assumptions of epistasis analysis can bias the inference of gene regulatory networks. Here we emphasize the critical importance of dynamic analyses, and specifically illustrate the use of Boolean network models. Epistasis in a broad sense refers to gene interactions; however, as originally proposed by Bateson, epistasis is defined as the blocking of a particular allelic effect due to the effect of another allele at a different locus (herein, classical epistasis). Classical epistasis analysis has proven powerful and useful, allowing researchers to infer and assign directionality to gene interactions. As larger data sets are becoming available, the analysis of classical epistasis is being complemented with computer science tools and systems biology approaches. We show that when the hierarchical and single-path assumptions are not met in classical epistasis analysis, access to relevant information and the correct inference of gene interaction topologies is hindered, and it becomes necessary to consider the temporal dynamics of gene interactions. The use of dynamical networks can overcome these limitations. We particularly focus on the use of Boolean networks that, like classical epistasis analysis, rely on logical formalisms, and hence can complement classical epistasis analysis and relax its assumptions. We develop a couple of theoretical examples and analyze them from a dynamic Boolean network model perspective. Boolean networks could help to guide additional experiments and discern among alternative regulatory schemes that would be impossible or difficult to infer without the elimination of these assumptions from the classical epistasis analysis. We also use examples from the literature to show how a Boolean network-based approach has resolved ambiguities and guided epistasis analysis. Our article complements previous accounts, not only by focusing on the implications of the hierarchical and

  14. Geographic Object-Based Image Analysis - Towards a new paradigm.

    Science.gov (United States)

    Blaschke, Thomas; Hay, Geoffrey J; Kelly, Maggi; Lang, Stefan; Hofmann, Peter; Addink, Elisabeth; Queiroz Feitosa, Raul; van der Meer, Freek; van der Werff, Harald; van Coillie, Frieke; Tiede, Dirk

    2014-01-01

    The amount of scientific literature on (Geographic) Object-based Image Analysis - GEOBIA has been, and still is, sharply increasing. These approaches to analysing imagery have antecedents in earlier research on image segmentation and use GIS-like spatial analysis within classification and feature extraction approaches. This article investigates these developments and their implications, and asks whether or not this is a new paradigm in remote sensing and Geographic Information Science (GIScience). We first discuss several limitations of prevailing per-pixel methods when applied to high resolution images. Then we explore the paradigm concept developed by Kuhn (1962) and discuss whether GEOBIA can be regarded as a paradigm according to this definition. We crystallize core concepts of GEOBIA, including the role of objects, of ontologies and the multiplicity of scales, and we discuss how these conceptual developments support important methods in remote sensing such as change detection and accuracy assessment. The ramifications of the different theoretical foundations between the 'per-pixel paradigm' and GEOBIA are analysed, as are some of the challenges along this path from pixels, to objects, to geo-intelligence. Based on several paradigm indications as defined by Kuhn and based on an analysis of peer-reviewed scientific literature, we conclude that GEOBIA is a new and evolving paradigm.

  15. Improved nowcasting of precipitation based on convective analysis fields

    Directory of Open Access Journals (Sweden)

    T. Haiden

    2007-04-01

    Full Text Available The high-resolution analysis and nowcasting system INCA (Integrated Nowcasting through Comprehensive Analysis) developed at the Austrian national weather service provides three-dimensional fields of temperature, humidity, and wind on an hourly basis, and two-dimensional fields of precipitation rate in 15 min intervals. The system operates on a horizontal resolution of 1 km and a vertical resolution of 100–200 m. It combines surface station data, remote sensing data (radar, satellite), forecast fields of the numerical weather prediction model ALADIN, and high-resolution topographic data. An important application of the INCA system is nowcasting of convective precipitation. Based on fine-scale temperature, humidity, and wind analyses a number of convective analysis fields are routinely generated. These fields include convective boundary layer (CBL) flow convergence and specific humidity, lifted condensation level (LCL), convective available potential energy (CAPE), convective inhibition (CIN), and various convective stability indices. Based on the verification of areal precipitation nowcasts it is shown that the pure translational forecast of convective cells can be improved by using a decision algorithm which is based on a subset of the above fields, combined with satellite products.

  16. Management of Microbiologically Influenced Corrosion in Risk Based Inspection analysis

    DEFF Research Database (Denmark)

    Skovhus, Torben Lund; Hillier, Elizabeth; Andersen, Erlend S.

    2016-01-01

    Operating offshore oil and gas production facilities is often associated with high risk. In order to manage the risk, operators commonly use aids to support decision making in the establishment of a maintenance and inspection strategy. Risk Based Inspection (RBI) analysis is widely used in the offshore industry as a means to justify the inspection strategy adopted. The RBI analysis is a decision-making technique that enables asset managers to identify the risk related to failure of their most critical systems and components, with an effect on safety, environmental and business related issues... and an extensive up-to-date literature study. The parameters are discussed and subsequently combined in a novel procedure that allows assessment of MIC in a RBI analysis. The procedure is sub-divided into one screening step and a detailed assessment, which fits the recommended approach to assess risk in a RBI...

  17. Sensitivity analysis practices: Strategies for model-based inference

    Energy Technology Data Exchange (ETDEWEB)

    Saltelli, Andrea [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)]. E-mail: andrea.saltelli@jrc.it; Ratto, Marco [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Tarantola, Stefano [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy); Campolongo, Francesca [Institute for the Protection and Security of the Citizen (IPSC), European Commission, Joint Research Centre, TP 361, 21020 Ispra (VA) (Italy)

    2006-10-15

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review anything other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and are easy to implement. These methods also allow the concept of factor importance to be defined rigorously, thus making the factor importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA.

  18. Sensitivity analysis practices: Strategies for model-based inference

    International Nuclear Information System (INIS)

    Saltelli, Andrea; Ratto, Marco; Tarantola, Stefano; Campolongo, Francesca

    2006-01-01

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having 'sensitivity analysis' as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of existing guidelines for SA issued on both sides of the Atlantic, we could not find in our review anything other than very primitive SA tools, based on 'one-factor-at-a-time' (OAT) approaches. In the context of model corroboration or falsification, we demonstrate that this use of OAT methods is illicit and unjustified, unless the model under analysis is proved to be linear. We show that available good practices, such as variance based measures and others, are able to overcome OAT shortcomings and are easy to implement. These methods also allow the concept of factor importance to be defined rigorously, thus making the factor importance ranking univocal. We analyse the requirements of SA in the context of modelling, and present best available practices on the basis of an elementary model. We also point the reader to available recipes for a rigorous SA.
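    As a hedged illustration of the variance-based measures advocated here, the sketch below estimates first-order Sobol' indices for a toy linear model with a standard pick-freeze estimator; the model and sample sizes are generic textbook choices, not taken from the paper.

```python
# Minimal sketch of a variance-based (Sobol' first-order) sensitivity analysis
# on a toy linear model, illustrating the kind of measure the authors advocate
# over one-factor-at-a-time scans.
import numpy as np

def model(x):
    # Toy model: Y = X1 + 2*X2 + 3*X3; analytic first-order indices are i**2 / 14.
    return x[:, 0] + 2 * x[:, 1] + 3 * x[:, 2]

def first_order_indices(model, k, n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    A = rng.random((n, k))          # two independent input samples
    B = rng.random((n, k))
    yA = model(A)
    f0, var = yA.mean(), yA.var()
    S = np.empty(k)
    for i in range(k):
        Ci = B.copy()
        Ci[:, i] = A[:, i]          # Ci shares only column i with A
        # Var(E[Y|Xi]) / Var(Y), estimated from the product of correlated runs.
        S[i] = (np.mean(yA * model(Ci)) - f0**2) / var
    return S

print("estimated S_i:", first_order_indices(model, k=3))
print("analytic  S_i:", [i**2 / 14 for i in (1, 2, 3)])
```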

  19. A Gene-Based Analysis of Acoustic Startle Latency

    Science.gov (United States)

    Smith, Alicia K.; Jovanovic, Tanja; Kilaru, Varun; Lori, Adriana; Gensler, Lauren; Lee, Samuel S.; Norrholm, Seth Davin; Massa, Nicholas; Cuthbert, Bruce; Bradley, Bekh; Ressler, Kerry J.; Duncan, Erica

    2017-01-01

    Latency of the acoustic startle response is the time required from the presentation of a startling auditory stimulus until the startle response is elicited, and provides an index of neural processing speed. Latency is prolonged in subjects with schizophrenia compared to controls in some but not all studies and is 68–90% heritable in baseline startle trials. In order to determine the genetic association with latency as a potential inroad into genetically based vulnerability to psychosis, we conducted a gene-based study of latency followed by an independent replication study of significant gene findings with a single-nucleotide polymorphism (SNP)-based analysis of schizophrenia and control subjects. 313 subjects from an urban population of low socioeconomic status with mixed psychiatric diagnoses were included in the gene-based study. Startle testing was conducted using a Biopac M150 system according to our published methods. Genotyping was performed with the Omni-Quad 1M or the Omni Express BeadChip. The replication study was conducted on 154 schizophrenia subjects and 123 psychiatric controls. Genetic analyses were conducted with Illumina Human Omni1-Quad and OmniExpress BeadChips. Twenty-nine SNPs were selected from four genes that were significant in the gene-based analysis and also associated with startle and/or schizophrenia in the literature. Linear regressions on latency were conducted, controlling for age, race, and diagnosis as a dichotomous variable. In the gene-based study, 2,870 genes demonstrated evidence of association after correction for multiple comparisons (false discovery rate < 0.05). Pathway analysis of these genes revealed enrichment for relevant biological processes including neural transmission (p = 0.0029), synaptic transmission (p = 0.0032), and neuronal development (p = 0.024). The subsequent SNP-based replication analysis revealed a strong association of onset latency with the SNP rs901561 on the neuregulin gene (NRG1

  20. Ontology-based specification, identification and analysis of perioperative risks.

    Science.gov (United States)

    Uciteli, Alexandr; Neumann, Juliane; Tahar, Kais; Saleh, Kutaiba; Stucke, Stephan; Faulbrück-Röhr, Sebastian; Kaeding, André; Specht, Martin; Schmidt, Tobias; Neumuth, Thomas; Besting, Andreas; Stegemann, Dominik; Portheine, Frank; Herre, Heinrich

    2017-09-06

    Medical personnel in hospitals often work under great physical and mental strain. In medical decision-making, errors can never be completely ruled out. Several studies have shown that between 50 and 60% of adverse events could have been avoided through better organization, more attention or more effective security procedures. Critical situations especially arise during interdisciplinary collaboration and the use of complex medical technology, for example during surgical interventions and in perioperative settings (the period of time before, during and after surgical intervention). In this paper, we present an ontology and an ontology-based software system, which can identify risks across medical processes and support the avoidance of errors, in particular in the perioperative setting. We developed a practicable definition of the risk notion, which is easily understandable by the medical staff and is usable for the software tools. Based on this definition, we developed a Risk Identification Ontology (RIO) and used it for the specification and the identification of perioperative risks. An agent system was developed, which gathers risk-relevant data during the whole perioperative treatment process from various sources and provides it for risk identification and analysis in a centralized fashion. The results of such an analysis are provided to the medical personnel in the form of context-sensitive hints and alerts. For the identification of the ontologically specified risks, we developed an ontology-based software module, called Ontology-based Risk Detector (OntoRiDe). About 20 risks relating to cochlear implantation (CI) have already been implemented. Comprehensive testing has indicated the correctness of the data acquisition, risk identification and analysis components, as well as the web-based visualization of results.

  1. Performance of Koyna dam based on static and dynamic analysis

    Science.gov (United States)

    Azizan, Nik Zainab Nik; Majid, Taksiah A.; Nazri, Fadzli Mohamed; Maity, Damodar

    2017-10-01

    This paper discusses the performance of Koyna dam based on static pushover analysis (SPO) and incremental dynamic analysis (IDA). The SPO in this study considered two types of lateral load: inertial load and hydrodynamic load. The structure was analysed until damage appeared on the dam body. The IDA curves were developed based on 7 ground motions with the following characteristics: (i) the distance from the epicenter is less than 15 km, (ii) the magnitude is equal to or greater than 5.5 and (iii) the PGA is equal to or greater than 0.15 g. All the ground motions were converted to response spectra and scaled according to the developed elastic response spectrum in order to match the characteristics of the ground motions to the soil type. The elastic response spectrum was developed for soil type B using Eurocode 8. The SPO and IDA methods make it possible to determine the limit states of the dam. The limit states proposed in this study are the yielding and ultimate states, which are identified based on the crack pattern that forms on the structural model. The maximum crest displacements from both methods are compared to define the limit states of the dam. The yielding-state displacement for Koyna dam is 23.84 mm and the ultimate-state displacement is 44.91 mm. The results can be used as a guideline for monitoring Koyna dam under seismic loading, considering both static and dynamic analysis.

  2. Phosphoproteomics-based systems analysis of signal transduction networks

    Directory of Open Access Journals (Sweden)

    Hiroko eKozuka-Hata

    2012-01-01

    Full Text Available Signal transduction systems coordinate complex cellular information to regulate biological events such as cell proliferation and differentiation. Although the accumulating evidence on widespread association of signaling molecules has revealed essential contribution of phosphorylation-dependent interaction networks to cellular regulation, their dynamic behavior is mostly yet to be analyzed. Recent technological advances regarding mass spectrometry-based quantitative proteomics have enabled us to describe the comprehensive status of phosphorylated molecules in a time-resolved manner. Computational analyses based on the phosphoproteome dynamics accelerate generation of novel methodologies for mathematical analysis of cellular signaling. Phosphoproteomics-based numerical modeling can be used to evaluate regulatory network elements from a statistical point of view. Integration with transcriptome dynamics also uncovers regulatory hubs at the transcriptional level. These omics-based computational methodologies, which have firstly been applied to representative signaling systems such as the epidermal growth factor receptor pathway, have now opened up a gate for systems analysis of signaling networks involved in immune response and cancer.

  3. Analysis of Hylocereus spp. diversity based on phenetic method

    Science.gov (United States)

    Hamidah, Tsawab, Husnus; Rosmanida

    2017-06-01

    This study aimed to determine the number of distinguishing characters, the most dominant characters in dragonfruit (Hylocereus) classification, and the classification relationships of dragonfruit based on their morphological characters. Sampling was performed in Bhakti Alam Agrotourism, Pasuruan. A total of 63 characters were observed, including stem/branch segment, areola, flower, fruit, and seed characters. These characters were analyzed using descriptive and phenetic methods. Based on the descriptive results, there were 59 distinguishing characters that affected the classification of five dragonfruit species: white dragonfruit, pink dragonfruit, red dragonfruit, purplish-red dragonfruit, and yellow dragonfruit. Based on the phenetic analysis, a dendrogram was obtained showing the classification relationships of the dragonfruit. Purplish-red and red dragonfruit were closely related, with a similarity value of 50.7%; this group is referred to as group VI. Pink dragonfruit and group VI were closely related, with a similarity value of 43.3%; this group is referred to as group IV. White dragonfruit and group IV were closely related, with a similarity value of 21.5%; this group is referred to as group II. Meanwhile, yellow dragonfruit and group II were closely related, with a similarity value of 8.5%. Based on principal component analysis, 34 characters strongly influenced the dragonfruit classification. The two most dominant characters affecting the classification were stem curvature and the number of fruit bracteole remnants, with a component value of 0.955.
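    A minimal sketch of such a phenetic workflow is given below: an invented binary character matrix (standing in for the 63 scored characters) is converted to pairwise distances and clustered with UPGMA linkage to produce dendrogram data.

```python
# Hedged sketch of a phenetic clustering workflow (the 0/1 character matrix
# below is invented; the study scored 63 morphological characters).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, dendrogram

species = ["white", "pink", "red", "purplish-red", "yellow"]
# Rows: species; columns: a handful of binary morphological characters.
characters = np.array([
    [1, 0, 0, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 0, 1, 1, 0],
    [0, 1, 1, 1, 0, 1, 1, 0],
    [0, 1, 1, 1, 1, 1, 1, 0],
    [1, 0, 0, 0, 0, 0, 0, 1],
])

# For binary characters, Hamming distance equals 1 - simple-matching similarity;
# UPGMA ("average") linkage is a common choice for phenetic dendrograms.
dist = pdist(characters, metric="hamming")
tree = linkage(dist, method="average")

dn = dendrogram(tree, labels=species, no_plot=True)
print("linkage matrix (cluster i, cluster j, distance, size):")
print(tree)
print("leaf order:", dn["ivl"])
```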

  4. Web-Based Virtual Laboratory for Food Analysis Course

    Science.gov (United States)

    Handayani, M. N.; Khoerunnisa, I.; Sugiarti, Y.

    2018-02-01

    The implementation of learning in the food analysis course of the Agro-industrial Technology Education study program faced several problems. These include laboratory space and equipment that are not commensurate with the number of students, as well as a lack of interactive learning tools. On the other hand, the information technology literacy of the students is quite high and the internet network is easily accessible on campus. This is a challenge as well as an opportunity for the development of learning media that can help optimize learning in the laboratory. This study aims to develop a web-based virtual laboratory as an alternative learning medium for the food analysis course. This research is R & D (research and development) following the Borg & Gall model. Expert assessment of the developed web-based virtual laboratory, in terms of software engineering, visual communication, material relevance, usefulness and the language used, showed that it is feasible as a learning medium. The results of the limited-scale and wide-scale tests show that students strongly agree with the development of the web-based virtual laboratory, and their response to it was positive. Suggestions from the students provide further opportunities for improving the web-based virtual laboratory and should be considered in further research.

  5. Climate policy decisions require policy-based lifecycle analysis.

    Science.gov (United States)

    Bento, Antonio M; Klotz, Richard

    2014-05-20

    Lifecycle analysis (LCA) metrics of greenhouse gas emissions are increasingly being used to select technologies supported by climate policy. However, LCAs typically evaluate the emissions associated with a technology or product, not the impacts of policies. Here, we show that policies supporting the same technology can lead to dramatically different emissions impacts per unit of technology added, due to multimarket responses to the policy. Using a policy-based consequential LCA, we find that the lifecycle emissions impacts of four US biofuel policies range from a reduction of 16.1 gCO2e to an increase of 24.0 gCO2e per MJ corn ethanol added by the policy. The differences between these results and representative technology-based LCA measures, which do not account for the policy instrument driving the expansion in the technology, illustrate the need for policy-based LCA measures when informing policy decision making.

  6. Graph- versus Vector-Based Analysis of a Consensus Protocol

    Directory of Open Access Journals (Sweden)

    Giorgio Delzanno

    2014-07-01

    Full Text Available The Paxos distributed consensus algorithm is a challenging case-study for standard, vector-based model checking techniques. Due to asynchronous communication, exhaustive analysis may generate very large state spaces already for small model instances. In this paper, we show the advantages of graph transformation as an alternative modelling technique. We model Paxos in a rich declarative transformation language, featuring (among other things) nested quantifiers, and we validate our model using the GROOVE model checker, a graph-based tool that exploits isomorphism as a natural way to prune the state space via symmetry reductions. We compare the results with those obtained by the standard model checker Spin on the basis of a vector-based encoding of the algorithm.

  7. Economical analysis based on a simplified model of micro distillery

    International Nuclear Information System (INIS)

    Bristoti, A.; Adams, R.

    1987-01-01

    The investment costs of a hydrated alcohol distillery, as well as the energy balance of its production, have made its diffusion on a small scale inviable. The present economic analysis is based on a simplified model of a micro distillery in which the reduction in investment costs rests on technical data and on the possibility of using a hydrated alcohol with a higher water content than usual, i.e. around 85°GL. The engineering design of this plant eliminates all pumps; all the liquids (water, sugar cane, syrup and wine) are fed by gravity. The bagasse is considered a noble by-product and is utilized in the poultry industry as a substitute for wood dust. All the heat needs of the micro distillery are supplied by wood. (author)

  8. Adaptive Human aware Navigation based on Motion Pattern Analysis

    DEFF Research Database (Denmark)

    Tranberg, Søren; Svenstrup, Mikael; Andersen, Hans Jørgen

    2009-01-01

    Respecting people’s social spaces is an important prerequisite for acceptable and natural robot navigation in human environments. In this paper, we describe an adaptive system for mobile robot navigation based on estimates of whether a person seeks to interact with the robot or not. The estimates are based on run-time motion pattern analysis compared to stored experience in a database. Using a potential field centered around the person, the robot positions itself at the most appropriate place relative to the person and the interaction status. The system is validated through qualitative tests in a real world setting. The results demonstrate that the system is able to learn to navigate based on past interaction experiences, and to adapt to different behaviors over time.
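    The sketch below illustrates the person-centred potential field idea in a hedged form: the robot performs gradient descent on a field that attracts it toward a preferred interaction distance and repels it at very close range. The gains and distances are illustrative assumptions, not parameters of the cited system.

```python
# Hypothetical sketch of a person-centred potential field for robot positioning.
import numpy as np

def potential(robot, person, preferred=1.5, k_att=1.0, k_rep=0.5):
    d = np.linalg.norm(robot - person)
    attract = 0.5 * k_att * (d - preferred) ** 2       # pull toward the preferred distance
    repulse = k_rep / d if d > 1e-6 else np.inf        # keep out of the intimate zone
    return attract + repulse

def gradient_step(robot, person, step=0.05, eps=1e-4):
    # Numerical gradient descent on the potential field.
    grad = np.zeros(2)
    for i in range(2):
        e = np.zeros(2)
        e[i] = eps
        grad[i] = (potential(robot + e, person) - potential(robot - e, person)) / (2 * eps)
    return robot - step * grad

person = np.array([0.0, 0.0])
robot = np.array([4.0, 3.0])
for _ in range(300):
    robot = gradient_step(robot, person)
print("final distance to person: %.2f m" % np.linalg.norm(robot - person))
```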

  9. Cost analysis of simulated base-catalyzed biodiesel production processes

    International Nuclear Information System (INIS)

    Tasić, Marija B.; Stamenković, Olivera S.; Veljković, Vlada B.

    2014-01-01

    Highlights: • Two semi-continuous biodiesel production processes from sunflower oil are simulated. • Simulations were based on the kinetics of base-catalyzed methanolysis reactions. • The total energy consumption was influenced by the kinetic model. • Heterogeneous base-catalyzed process is a preferable industrial technology. - Abstract: The simulation and economic feasibility evaluation of semi-continuous biodiesel production from sunflower oil were based on the kinetics of homogeneously (Process I) and heterogeneously (Process II) base-catalyzed methanolysis reactions. The annual plant’s capacity was determined to be 8356 tonnes of biodiesel. The total energy consumption was influenced by the unit model describing the methanolysis reaction kinetics. The energy consumption of the Process II was more than 2.5 times lower than that of the Process I. Also, the simulation showed the Process I had more and larger process equipment units, compared with the Process II. Based on lower total capital investment costs and biodiesel selling price, the Process II was economically more feasible than the Process I. Sensitivity analysis was conducted using variable sunflower oil and biodiesel prices. Using a biodiesel selling price of 0.990 $/kg, Processes I and II were shown to be economically profitable if the sunflower oil price was 0.525 $/kg and 0.696 $/kg, respectively

  10. Validation Database Based Thermal Analysis of an Advanced RPS Concept

    Science.gov (United States)

    Balint, Tibor S.; Emis, Nickolas D.

    2006-01-01

    Advanced RPS concepts can be conceived, designed and assessed using high-end computational analysis tools. These predictions may provide an initial insight into the potential performance of these models, but verification and validation are necessary and required steps to gain confidence in the numerical analysis results. This paper discusses the findings from a numerical validation exercise for a small advanced RPS concept, based on a thermal analysis methodology developed at JPL and on a validation database obtained from experiments performed at Oregon State University. Both the numerical and experimental configurations utilized a single GPHS module enabled design, resembling a Mod-RTG concept. The analysis focused on operating and environmental conditions during the storage phase only. This validation exercise helped to refine key thermal analysis and modeling parameters, such as heat transfer coefficients, and conductivity and radiation heat transfer values. Improved understanding of the Mod-RTG concept through validation of the thermal model allows for future improvements to this power system concept.

  11. Emergy Analysis and Sustainability Efficiency Analysis of Different Crop-Based Biodiesel in Life Cycle Perspective

    DEFF Research Database (Denmark)

    Ren, Jingzheng; Manzardo, Alessandro; Mazzi, Anna

    2013-01-01

    Biodiesel, as a promising alternative energy resource, has become a hot topic in chemical engineering, but there is also debate about its sustainability. In order to analyze the sustainability of biodiesel production systems and select the most sustainable scenario, various kinds of crop-based biodiesel, including soybean-, rapeseed-, sunflower-, jatropha- and palm-based biodiesel production options, are studied by emergy analysis; the soybean-based scenario is recognized as the most sustainable scenario that should be chosen for further study in China. The DEA method is used to evaluate the sustainability efficiencies of these options, and the biodiesel production systems based on soybean, sunflower, and palm are considered DEA efficient, whereas the rapeseed-based and jatropha-based scenarios need to be improved, and the methods for improvement have also been specified.

  12. A linear mixture analysis-based compression for hyperspectral image analysis

    Energy Technology Data Exchange (ETDEWEB)

    C. I. Chang; I. W. Ginsberg

    2000-06-30

    In this paper, the authors present a fully constrained least squares linear spectral mixture analysis-based compression technique for hyperspectral image analysis, particularly, target detection and classification. Unlike most compression techniques that directly deal with image gray levels, the proposed compression approach generates the abundance fractional images of potential targets present in an image scene and then encodes these fractional images so as to achieve data compression. Since the vital information used for image analysis is generally preserved and retained in the abundance fractional images, the loss of information may have very little impact on image analysis. On some occasions, it even improves analysis performance. Airborne visible infrared imaging spectrometer (AVIRIS) data experiments demonstrate that it can effectively detect and classify targets while achieving very high compression ratios.
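    The sketch below shows one common way to realize fully constrained (nonnegative, sum-to-one) abundance estimation for a single pixel, using an NNLS solve with an appended, heavily weighted sum-to-one row; the endmember spectra are synthetic and the trick is a generic one, not necessarily the authors' implementation.

```python
# Hedged sketch of fully constrained linear spectral unmixing for one pixel.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(1)
n_bands, n_endmembers = 50, 3
E = rng.random((n_bands, n_endmembers))       # columns: endmember spectra (synthetic)
true_a = np.array([0.6, 0.3, 0.1])            # true abundances (sum to one)
pixel = E @ true_a + 0.01 * rng.standard_normal(n_bands)

delta = 1e3                                   # weight of the sum-to-one constraint
E_aug = np.vstack([E, delta * np.ones((1, n_endmembers))])
pixel_aug = np.append(pixel, delta)

abundances, _ = nnls(E_aug, pixel_aug)        # nonnegativity from NNLS,
print("estimated abundances:", abundances)    # sum-to-one enforced (approximately)
print("sum of abundances:", abundances.sum())
```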

  13. Signal/noise analysis of FRET-based sensors.

    Science.gov (United States)

    Woehler, Andrew; Wlodarczyk, Jakub; Neher, Erwin

    2010-10-06

    Molecular sensors based on intramolecular Förster resonance energy transfer (FRET) have become versatile tools to monitor regulatory molecules in living tissue. However, their use is often compromised by low signal strength and excessive noise. We analyzed signal/noise (SNR) aspects of spectral FRET analysis methods, with the following conclusions: The most commonly used method (measurement of the emission ratio after a single short wavelength excitation) is optimal in terms of signal/noise, if only relative changes of this uncalibrated ratio are of interest. In the case that quantitative data on FRET efficiencies are required, these can be calculated from the emission ratio and some calibration parameters, but at reduced SNR. Lux-FRET, a recently described method for spectral analysis of FRET data, allows one to do so in three different ways, each based on a ratio of two out of three measured fluorescence signals (the donor and acceptor signal during a short-wavelength excitation and the acceptor signal during long wavelength excitation). Lux-FRET also allows for calculation of the total abundance of donor and acceptor fluorophores. The SNR for all these quantities is lower than that of the plain emission ratio due to unfavorable error propagation. However, if ligand concentration is calculated either from lux-FRET values or else, after its calibration, from the emission ratio, SNR for both analysis modes is very similar. Likewise, SNR values are similar, if the noise of these quantities is related to the expected dynamic range. We demonstrate these relationships based on data from an Epac-based cAMP sensor and discuss how the SNR changes with the FRET efficiency and the number of photons collected. Copyright © 2010 Biophysical Society. Published by Elsevier Inc. All rights reserved.
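    As a hedged numerical illustration of the shot-noise argument, the sketch below propagates Poisson photon noise through the plain emission ratio and shows the SNR growing with the square root of the collected photons; the channel split and photon counts are assumptions, not values from the paper.

```python
# Minimal sketch of shot-noise error propagation for the emission ratio
# R = F_A / F_D (acceptor over donor emission, single short-wavelength
# excitation), assuming Poisson-distributed photon counts.
import numpy as np

def ratio_snr(F_D, F_A):
    """Return the emission ratio and its shot-noise-limited SNR."""
    R = F_A / F_D
    # Relative variances add for a quotient of independent Poisson counts.
    rel_sigma = np.sqrt(1.0 / F_A + 1.0 / F_D)
    return R, 1.0 / rel_sigma

for photons in (1_000, 10_000, 100_000):       # total collected photons (assumed)
    F_D, F_A = 0.6 * photons, 0.4 * photons    # assumed split between donor and acceptor channels
    R, snr = ratio_snr(F_D, F_A)
    print(f"N={photons:>7d}: ratio={R:.3f}, SNR={snr:.1f}")
```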

  14. Model-Based Evaluation Of System Scalability: Bandwidth Analysis For Smartphone-Based Biosensing Applications

    DEFF Research Database (Denmark)

    Patou, François; Madsen, Jan; Dimaki, Maria

    2016-01-01

    -engineering efforts for scaling a system specification efficaciously. We demonstrate the value of our methodology by investigating a smartphone-based biosensing instrumentation platform. Specifically, we carry out scalability analysis for the system’s bandwidth specification: the maximum analog voltage waveform...

  15. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train, and local field potentials analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided

  16. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    Science.gov (United States)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  17. Complexity analysis based on generalized deviation for financial markets

    Science.gov (United States)

    Li, Chao; Shang, Pengjian

    2018-03-01

    In this paper, a new modified method is proposed as a measure to investigate the correlation between past price and future volatility for financial time series, known as complexity analysis based on generalized deviation. In comparison with the former retarded volatility model, the new approach is both simple and computationally efficient. The method based on the generalized deviation function presents an exhaustive way of quantifying the rules of the financial market. The robustness of this method is verified by numerical experiments with both artificial and financial time series. Results show that the generalized deviation complexity analysis method not only identifies the volatility of financial time series, but also provides a comprehensive way of distinguishing the different characteristics of stock indices and individual stocks. Exponential functions can be used to successfully fit the volatility curves and quantify the changes of complexity for stock market data. We then study the influence of the negative domain of the deviation coefficient and the differences between volatile and calm periods. After the data analysis of the experimental model, we found that the generalized deviation model has definite advantages in exploring the relationship between historical returns and future volatility.
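    A minimal sketch of the underlying question, correlating past returns with future realized volatility on a synthetic price series, is given below; the generalized-deviation statistic itself is the paper's own construction and is not reproduced here, and the leverage-like effect injected into the data is purely illustrative.

```python
# Hedged sketch: how do past returns relate to future volatility?
import numpy as np

rng = np.random.default_rng(0)
n = 5000
returns = rng.standard_normal(n) * 0.01
# Inject a simple leverage-like effect: negative returns inflate later volatility.
for t in range(1, n):
    returns[t] *= 1.0 + 100.0 * max(0.0, -returns[t - 1])

def past_future_corr(r, window=5, lag=1):
    """Correlation of the past mean return with future realized volatility."""
    m = len(r)
    past = np.array([r[t - window:t].mean() for t in range(window, m - window - lag)])
    future_vol = np.array([r[t + lag:t + lag + window].std()
                           for t in range(window, m - window - lag)])
    return np.corrcoef(past, future_vol)[0, 1]

for lag in (1, 5, 20):
    print(f"lag {lag:>2d}: corr(past return, future volatility) = "
          f"{past_future_corr(returns, lag=lag):.3f}")
```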

  18. Resilience Analysis of Countries under Disasters Based on Multisource Data.

    Science.gov (United States)

    Zhang, Nan; Huang, Hong

    2018-01-01

    Disasters occur almost daily in the world. Because emergencies frequently have no precedent, are highly uncertain, and can be very destructive, improving a country's resilience is an efficient way to reduce risk. In this article, we collected more than 20,000 historical data points from disasters from 207 countries to enable us to calculate the severity of disasters and the danger they pose to countries. In addition, 6 primary indices (disaster, personal attribute, infrastructure, economics, education, and occupation) including 38 secondary influencing factors are considered in analyzing the resilience of countries. Using these data, we obtained the danger, expected number of deaths, and resilience of all 207 countries. We found that a country covering a large area is more likely to have a low resilience score. Through sensitivity analysis of all secondary indices, we found that population density, frequency of disasters, and GDP are the three most critical factors affecting resilience. Based on broad-spectrum resilience analysis of the different continents, Oceania and South America have the highest resilience, while Asia has the lowest. Over the past 50 years, the resilience of many countries has been improved sharply, especially in developing countries. Based on our results, we analyze the comprehensive resilience and provide some optimal suggestions to efficiently improve resilience. © 2017 Society for Risk Analysis.

  19. Techno-Economic Analysis of Biofuels Production Based on Gasification

    Energy Technology Data Exchange (ETDEWEB)

    Swanson, R. M.; Platon, A.; Satrio, J. A.; Brown, R. C.; Hsu, D. D.

    2010-11-01

    This study compares capital and production costs of two biomass-to-liquid production plants based on gasification. The first biorefinery scenario is an oxygen-fed, low-temperature (870°C), non-slagging, fluidized bed gasifier. The second scenario is an oxygen-fed, high-temperature (1,300°C), slagging, entrained flow gasifier. Both are followed by catalytic Fischer-Tropsch synthesis and hydroprocessing to naphtha-range (gasoline blend stock) and distillate-range (diesel blend stock) liquid fractions. Process modeling software (Aspen Plus) is utilized to organize the mass and energy streams and cost estimation software is used to generate equipment costs. Economic analysis is performed to estimate the capital investment and operating costs. Results show that the total capital investment required for nth plant scenarios is $610 million and $500 million for the high-temperature and low-temperature scenarios, respectively. Product value (PV) for the high-temperature and low-temperature scenarios is estimated to be $4.30 and $4.80 per gallon of gasoline equivalent (GGE), respectively, based on a feedstock cost of $75 per dry short ton. Sensitivity analysis is also performed on process and economic parameters. This analysis shows that total capital investment and feedstock cost are among the most influential parameters affecting the PV.

  20. Medical diagnostics by laser-based analysis of exhaled breath

    Science.gov (United States)

    Giubileo, Gianfranco

    2002-08-01

    Many trace gases can be found in the exhaled breath, some of them giving the possibility of a non-invasive diagnosis of related diseases or allowing the monitoring of the disease in the course of its therapy. In the present lecture the principle of medical diagnosis based on breath analysis will be introduced and the detection of trace gases in exhaled breath by high-resolution molecular spectroscopy in the IR spectral region will be discussed. A number of substrates and the optical systems for their laser detection will be reported. The following laser-based experimental systems have been realised in the Molecular Spectroscopy Laboratory at ENEA in Frascati for the analysis of specific substances in the exhaled breath: a tuneable diode laser absorption spectroscopy (TDLAS) apparatus for the measurement of the 13C/12C isotopic ratio in carbon dioxide, a TDLAS apparatus for the detection of CH4, and a CO2 laser based photoacoustic system to detect trace ethylene at atmospheric pressure. The experimental set-up for each of the above-mentioned optical systems will be shown and the related medical applications will be illustrated. The concluding remarks will focus on the chemical species that are of major interest for medical people today and their diagnostic ability.

  1. Experimental Bifurcation Analysis Using Control-Based Continuation

    DEFF Research Database (Denmark)

    Bureau, Emil; Starke, Jens

    The focus of this thesis is developing and implementing techniques for performing experimental bifurcation analysis on nonlinear mechanical systems. The research centers around the newly developed control-based continuation method, which allows one to systematically track branches of stable and unstable ... The procedure is applied to our test rig, resulting in a reliable, non-invasive, locally stabilizing control. The use of stabilizing control makes it difficult to determine the stability of the underlying uncontrolled equilibrium. Based on the idea of momentarily modifying or disabling the control and studying the resulting behavior, we propose and test three different methods for assessing stability of equilibrium states during experimental continuation. We show that it is possible to determine the stability without allowing unbounded divergence, and that it is under certain circumstances possible to quantify...

  2. [Detecting cardiac arrhythmias based on phase space analysis].

    Science.gov (United States)

    Sun, Rongrong; Wang, Yuanyuan; Yang, Su; Fang, Zuxiang

    2008-08-01

    It is important for cardiac therapy devices such as the automated external defibrillator to discriminate different cardiac disorders based on electrocardiogram (ECG) analysis. A phase space analysis based algorithm is proposed to detect cardiac arrhythmias effectively. Firstly, the phase space of the signal is reconstructed. Then, from the viewpoint of geometry and information theory, the distribution entropy of the point density in the two-dimensional reconstructed phase space is calculated and used as the feature for further classification. Finally, the nearest-neighbour method based on the Mahalanobis distance is used to classify sinus rhythm (SR), supraventricular tachyarrhythmia (SVTA), atrial flutter (AFL) and atrial fibrillation (AF). To evaluate the sensitivity, specificity and accuracy of the proposed method in cardiac arrhythmia classification, the MIT-BIH arrhythmia database and a canine endocardial database are studied respectively. Experimental results demonstrate that the proposed method can detect SR, SVTA, AFL and AF signals rapidly and accurately with simple computation. It promises to find application in automated devices for cardiac arrhythmia therapy.
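    The sketch below walks through the two core steps named above, 2-D time-delay reconstruction and the Shannon entropy of the phase-space point density, on a synthetic signal; the embedding delay, grid size and test signals are illustrative assumptions rather than the paper's settings.

```python
# Hypothetical sketch of phase space reconstruction and distribution entropy,
# computed on synthetic "ECG-like" signals rather than real recordings.
import numpy as np

def phase_space_entropy(signal, delay=8, bins=32):
    # 2-D time-delay embedding: points (x(t), x(t + delay)).
    x, y = signal[:-delay], signal[delay:]
    hist, _, _ = np.histogram2d(x, y, bins=bins)
    p = hist.ravel() / hist.sum()                 # point density in phase space
    p = p[p > 0]
    return -(p * np.log2(p)).sum()                # distribution (Shannon) entropy

t = np.arange(4000) / 360.0                        # ~360 Hz sampling (assumed)
regular = np.sin(2 * np.pi * 1.2 * t)              # regular rhythm surrogate
irregular = regular + 0.8 * np.random.default_rng(0).standard_normal(t.size)

print("entropy, regular rhythm  :", round(phase_space_entropy(regular), 2))
print("entropy, irregular rhythm:", round(phase_space_entropy(irregular), 2))
```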

  3. ALGORITHMS FOR TENNIS RACKET ANALYSIS BASED ON MOTION DATA

    Directory of Open Access Journals (Sweden)

    Maria Skublewska-Paszkowska

    2016-09-01

    Full Text Available Modern technologies, such as motion capture systems (both optical and markerless), are more and more frequently used for athlete performance analysis due to their great precision. Optical systems based on retro-reflective markers allow for tracking the motion of multiple objects of various types. These systems compute human kinetic and kinematic parameters based on biomechanical models. Tracking additional objects like a tennis racket is also a very important aspect for analysing the player’s technique and precision. The motion data gathered by motion capture systems may be used for analysing various aspects that may not be recognised by the human eye or a video camera. This paper presents algorithms for the analysis of tennis racket motion during two of the most important tennis strokes: forehand and backhand. An optical Vicon system was used for obtaining the motion data, which was the input for the algorithms. They indicate the velocity of the tennis racket’s head and of the racket’s handle, based on the trajectories of attached markers, as well as the racket’s orientation. The algorithms were implemented and tested on data obtained from a professional trainer who participated in the research and performed a series of ten strikes, separately for: 1) forehand without a ball, 2) backhand without a ball, 3) forehand with a ball and 4) backhand with a ball. The computed parameters are gathered in tables and visualised in a graph.
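    A minimal sketch of deriving racket-head speed from marker trajectories by finite differences is given below; the sampling rate and the 3-D coordinates are synthetic stand-ins for the Vicon data used in the study.

```python
# Hedged sketch: racket-head speed from a marker trajectory via finite differences.
import numpy as np

fs = 200.0                                   # capture rate in Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic head-marker trajectory of a forehand-like swing (metres).
head = np.stack([0.8 * np.sin(2 * np.pi * 1.0 * t),
                 0.3 * t,
                 1.1 + 0.1 * np.cos(2 * np.pi * 1.0 * t)], axis=1)

velocity = np.gradient(head, 1.0 / fs, axis=0)   # per-frame velocity vectors (m/s)
speed = np.linalg.norm(velocity, axis=1)

print("peak head speed: %.2f m/s at t = %.3f s" % (speed.max(), t[speed.argmax()]))
```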

  4. Finite element analysis of osteoporosis models based on synchrotron radiation

    Science.gov (United States)

    Xu, W.; Xu, J.; Zhao, J.; Sun, J.

    2016-04-01

    With the growing pressure of an aging society, China, like the rest of the world, has to face an increasing population of osteoporosis patients. Recently, synchrotron radiation has become an essential tool for biomedical exploration, with the advantages of high resolution and high stability. In order to study characteristic changes in different stages of primary osteoporosis, this research focused on different periods of osteoporosis in rats based on synchrotron radiation. Both bone histomorphometry analysis and finite element analysis were then carried out on the reconstructed three-dimensional models. Finally, the changes of bone tissue in different periods were compared quantitatively. Histomorphometry analysis showed that the structure of the trabeculae in osteoporosis degraded as the bone volume decreased. For femurs, the bone volume fraction (bone volume/total volume, BV/TV) decreased from 69% to 43%. This led to an increase in trabecular separation (from 45.05 μm to 97.09 μm) and a reduction in trabecular number (from 7.99 mm-1 to 5.97 mm-1). Simulation of various mechanical tests with finite element analysis (FEA) indicated that, with the exacerbation of osteoporosis, the bones' ability to resist compression, bending and torsion gradually became weaker. The compression stiffness of femurs decreased from 1770.96 F μm-1 to 697.41 F μm-1, while the bending and torsion stiffness decreased from 1390.80 F μm-1 to 566.11 F μm-1 and from 2957.28 N·m/° to 691.31 N·m/° respectively, indicating the decrease of bone strength; this matched the histomorphometry analysis. This study suggested that FEA and synchrotron radiation, combined with histomorphometry analysis, are excellent methods for analysing bone strength.

  5. Statistical mechanics of light elements at high pressure. IV - A model free energy for the metallic phase. [for Jovian type planet interiors

    Science.gov (United States)

    Dewitt, H. E.; Hubbard, W. B.

    1976-01-01

    A large quantity of data on the thermodynamic properties of hydrogen-helium metallic liquids has been obtained in extended computer calculations in which a Monte Carlo code essentially identical to that described by Hubbard (1972) was used. A model free energy for metallic hydrogen with a relatively small mass fraction of helium is discussed, taking into account the definition of variables, a procedure for choosing the free energy, values for the fitting parameters, and the evaluation of the entropy constants. Possibilities concerning the use of the obtained data in studies of the interiors of the outer planets are briefly considered.

  6. Data-Driven Control for Interlinked AC/DC Microgrids via Model-Free Adaptive Control and Dual-Droop Control

    DEFF Research Database (Denmark)

    Zhang, Huaguang; Zhou, Jianguo; Sun, Qiuye

    2017-01-01

    This paper investigates the coordinated power sharing issues of interlinked ac/dc microgrids. An appropriate control strategy is developed to control the interlinking converter (IC) to realize proportional power sharing between ac and dc microgrids. The proposed strategy mainly includes two parts......: the primary outer-loop dual-droop control method along with secondary control; the inner-loop data-driven model-free adaptive voltage control. Using the proposed scheme, the interlinking converter, just like the hierarchical controlled DG units, will have the ability to regulate and restore the dc terminal...

  7. ICWorld: An MMOG-Based Approach to Analysis

    Directory of Open Access Journals (Sweden)

    Wyatt Wong

    2008-01-01

    Full Text Available Intelligence analysts routinely work with "wicked" problems—critical, time-sensitive problems where analytical errors can lead to catastrophic consequences for the nation's security. In the analyst's world, important decisions are often made quickly, and are made based on consuming, understanding, and piecing together enormous volumes of data. The data is not only voluminous, but often fragmented, subjective, inaccurate and fluid. Why does multi-player on-line gaming (MMOG) technology matter to the IC? Fundamentally, there are two reasons. The first is technological: stripping away the game-like content, MMOGs are dynamic systems that represent a physical world, where users are presented with (virtual) life-and-death challenges that can only be overcome through planning, collaboration and communication. The second is cultural: the emerging generation of analysts is part of what is sometimes called the "Digital Natives" (Prensky 2001) and is fluent with interactive media. MMOGs enable faster visualization, data manipulation, collaboration and analysis than traditional text and imagery. ICWorld is an MMOG approach to intelligence analysis that fuses ideas from experts in the fields of gaming and data visualization with knowledge of current and future intelligence analysis processes and tools. The concept has evolved over the last year as a result of evaluations by all-source analysts from around the IC. When fully developed, the Forterra team believes that ICWorld will fundamentally address major shortcomings of intelligence analysis, and dramatically improve the effectiveness of intelligence products.

  8. Beyond the GUM: variance-based sensitivity analysis in metrology

    International Nuclear Information System (INIS)

    Lira, I

    2016-01-01

    Variance-based sensitivity analysis is a well established tool for evaluating the contribution of the uncertainties in the inputs to the uncertainty in the output of a general mathematical model. While the literature on this subject is quite extensive, it has not found widespread use in metrological applications. In this article we present a succinct review of the fundamentals of sensitivity analysis, in a form that should be useful to most people familiarized with the Guide to the Expression of Uncertainty in Measurement (GUM). Through two examples, it is shown that in linear measurement models, no new knowledge is gained by using sensitivity analysis that is not already available after the terms in the so-called ‘law of propagation of uncertainties’ have been computed. However, if the model behaves non-linearly in the neighbourhood of the best estimates of the input quantities—and if these quantities are assumed to be statistically independent—sensitivity analysis is definitely advantageous for gaining insight into how they can be ranked according to their importance in establishing the uncertainty of the measurand. (paper)
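
    For readers who want to experiment beyond the GUM-style examples, the sketch below estimates first-order Sobol indices for a toy non-linear measurement model by a simple Monte Carlo pick-freeze estimator; the model, input distributions and sample size are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(1)

      def model(x):
          # Toy non-linear measurement model: Y = X0**2 + X0*X1 + X2
          return x[:, 0] ** 2 + x[:, 0] * x[:, 1] + x[:, 2]

      def first_order_sobol(model, n=100_000, d=3):
          """Pick-freeze estimator of the first-order Sobol indices S_i."""
          a = rng.standard_normal((n, d))
          b = rng.standard_normal((n, d))
          ya = model(a)
          var_y = ya.var()
          indices = []
          for i in range(d):
              c = b.copy()
              c[:, i] = a[:, i]        # freeze input i at the values of sample A
              yc = model(c)
              indices.append((np.mean(ya * yc) - ya.mean() * yc.mean()) / var_y)
          return np.array(indices)

      # Approx. [0.5, 0.0, 0.25]: X1 matters only through its interaction with X0,
      # which is exactly the kind of insight the law of propagation of uncertainties misses.
      print(first_order_sobol(model))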

  9. Psychoacoustic Music Analysis Based on the Discrete Wavelet Packet Transform

    Directory of Open Access Journals (Sweden)

    Xing He

    2008-01-01

    Full Text Available Psychoacoustical computational models are necessary for the perceptual processing of acoustic signals and have contributed significantly to the development of highly efficient audio analysis and coding. In this paper, we present an approach for the psychoacoustic analysis of musical signals based on the discrete wavelet packet transform. The proposed method mimics the multiresolution properties of the human ear more closely than other techniques and it includes simultaneous and temporal auditory masking. Experimental results show that this method provides better masking capabilities and it reduces the signal-to-masking ratio substantially more than other approaches, without introducing audible distortion. This model can lead to greater audio compression by permitting further bit rate reduction and more secure watermarking by providing greater signal space for information hiding.
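
    A minimal sketch of a discrete wavelet packet decomposition of an audio frame using PyWavelets; the wavelet family, tree depth, frame length and the crude per-band energy measure are illustrative assumptions rather than the authors' exact configuration.

      import numpy as np
      import pywt

      fs = 44100
      t = np.arange(0, 0.05, 1.0 / fs)
      frame = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 4000 * t)

      # Depth-5 wavelet packet tree: 32 sub-bands, ordered by frequency.
      wp = pywt.WaveletPacket(data=frame, wavelet='db8', mode='symmetric', maxlevel=5)
      bands = [node.path for node in wp.get_level(5, order='freq')]

      # Per-band energy, a crude stand-in for a masking-threshold computation.
      energy = {path: float(np.sum(np.square(wp[path].data))) for path in bands}
      loudest = max(energy, key=energy.get)
      print(f"{len(bands)} sub-bands, most energetic: {loudest}")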

  10. Numerical analysis of a polysilicon-based resistive memory device

    KAUST Repository

    Berco, Dan

    2018-03-08

    This study investigates a conductive bridge resistive memory device based on a Cu top electrode, a 10-nm polysilicon resistive switching layer (RSL) and a TiN bottom electrode, by numerical analysis for 10^3 programming and erase simulation cycles. The low and high resistive state values in each cycle are calculated, and the analysis shows that the structure has excellent retention reliability properties. The presented Cu species density plot indicates that Cu insertion occurs almost exclusively along grain boundaries, resulting in a confined isomorphic conductive filament that maintains its overall shape and electric properties during cycling. The superior reliability of this structure may thus be attributed to the relatively low amount of Cu migrating into the RSL during initial formation. In addition, the results show a good match with, and help to confirm, experimental measurements done on a previously demonstrated device.

  11. GIS-based poverty and population distribution analysis in China

    Science.gov (United States)

    Cui, Jing; Wang, Yingjie; Yan, Hong

    2009-07-01

    Geographically, poverty status is related not only to social-economic factors but is also strongly affected by the geographical environment. In this paper, a GIS-based poverty and population distribution analysis method is introduced for revealing their regional differences. More than 100,000 poor villages and 592 national key poor counties are chosen for the analysis. The results show that poverty distribution tends to concentrate in most of west China and the mountainous rural areas of mid China. Furthermore, the fifth census data are overlaid onto those poor areas in order to reveal their internal diversity of social-economic characteristics. By overlaying poverty-related social-economic parameters, such as sex ratio, illiteracy, education level, percentage of ethnic minorities and family composition, the findings show that poverty distribution is strongly correlated with a high illiteracy rate, a high percentage of ethnic minorities, and larger family size.

  12. Seismic Base Isolation Analysis for PASCAR Liquid Metal Reactor

    International Nuclear Information System (INIS)

    Lee, Kuk Hee; Yoo, Bong; Kim, Yun Jae

    2008-01-01

    This paper presents a study for developing a seismic isolation system for the PASCAR (Proliferation resistant, Accident-tolerant, Self-supported, Capsular and Assured Reactor) liquid metal reactor design. PASCAR uses lead-bismuth eutectic (LBE) as coolant. Because the density of the LBE coolant (10,000 kg/m3) is much higher than that of sodium coolant and water, this presents a challenge to designers of the seismic isolation systems that will be used with these heavy liquid metal reactors. Finite element analysis is used to determine the characteristics of the isolator device. Results are presented from a study on the application of three-dimensional seismic isolation devices to the full-scale reactor. The seismic analysis responses of the two-dimensional and the three-dimensional isolation systems for the PASCAR are compared with those of the conventional fixed base system

  13. Real Time Engineering Analysis Based on a Generative Component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses...... without jumping from aesthetics to structural digital design tools and back, but to work with both simultaneously and in real time. The engineering level of knowledge is incorporated at a conceptual thinking level, i.e. qualitative information is used instead of quantitative information. An example...... with a static determinate roof structure modelled by beam components is given. The example outlines the idea of the tool for conceptual design in the early phase of a multidisciplinary design process between architecture and structural engineering....

  14. Model-based human reliability analysis: prospects and requirements

    International Nuclear Information System (INIS)

    Mosleh, A.; Chang, Y.H.

    2004-01-01

    Major limitations of the conventional methods for human reliability analysis (HRA), particularly those developed for operator response analysis in probabilistic safety assessments (PSA) of nuclear power plants, are summarized as a motivation for the need, and a basis for developing requirements, for the next generation of HRA methods. It is argued that a model-based approach that provides explicit cognitive causal links between operator behaviors and directly or indirectly measurable causal factors should be at the core of the advanced methods. An example of such a causal model is briefly reviewed; due to its complexity and input requirements, it can currently be implemented only in a dynamic PSA environment. The computer simulation code developed for this purpose is also described briefly, together with current limitations in the models, data, and the computer implementation.

  15. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to automatically measure the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.

  16. Real Time Engineering Analysis Based on a Generative component Implementation

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Klitgaard, Jens

    2007-01-01

    The present paper outlines the idea of a conceptual design tool with real time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses...... the geometry, material properties and fixed point characteristics to calculate the dimensions and subsequent feasibility of any architectural design. The proposed conceptual design tool provides the possibility for the architect to work with both the aesthetic as well as the structural aspects of architecture...... with a static determinate roof structure modelled by beam components is given. The example outlines the idea of the tool for conceptual design in early phase of a multidisciplinary design process between architecture and structural engineering....

  17. Ground extraction from airborne laser data based on wavelet analysis

    Science.gov (United States)

    Xu, Liang; Yang, Yan; Jiang, Bowen; Li, Jia

    2007-11-01

    With the advantages of high resolution and accuracy, airborne laser scanning data are widely used in topographic mapping. In order to generate a DTM, measurements from object features such as buildings, vehicles and vegetation have to be classified and removed. However, the automatic extraction of bare earth from point clouds acquired by airborne laser scanning equipment remains a problem in LIDAR data filtering. In this paper, a filter algorithm based on wavelet analysis is proposed. Relying on the capability of the continuous wavelet transform to detect discontinuities and on the feature of multi-resolution analysis, the object points can be removed while ground data are preserved. In order to evaluate the performance of this approach, we applied it to the data set used in the ISPRS filter test in 2003. 15 samples have been tested with the proposed approach. Results showed that it filtered most of the objects, like vegetation and buildings, and extracted a well-defined ground model.

  18. Analysis of reactor power oscillation based on nonlinear dynamic theory

    International Nuclear Information System (INIS)

    Suzudo, Tomoaki

    1994-07-01

    Reactor power oscillations are discussed based on nonlinear dynamic theory with reference to the stability problem of boiling water reactors (BWRs). The reactor noise from an actual plant is first analyzed by a method originally used for the analysis of chaotic phenomena. The results show that this method gives better dynamic descriptors of oscillatory motion than previous methods, and that it is applicable to a real-time monitoring system of the reactor core. Next, the low-dimensional phenomenological model of BWR power oscillation is analytically studied using bifurcation theory, a branch of nonlinear dynamic theory. From this analysis, explicit expressions for the steady state's linear stability and weak stability, not given by numerical analyses, are derived, and the qualitative properties of the power oscillation can be better understood. (author)

  19. Missile placement analysis based on improved SURF feature matching algorithm

    Science.gov (United States)

    Yang, Kaida; Zhao, Wenjie; Li, Dejun; Gong, Xiran; Sheng, Qian

    2015-03-01

    Precise battle damage assessment by use of video images to analyse missile placement is a new study area. The article proposes an improved speeded up robust features algorithm named restricted speeded up robust features (RSURF), which combines the combat application of TV-command-guided missiles and the characteristics of video images. Its restrictions are mainly reflected in two aspects: one is to restrict the extraction area of feature points; the second is to restrict the number of feature points. The process of missile placement analysis based on video images was designed, and a video splicing process and random sample consensus purification were achieved. The RSURF algorithm is shown to have good real-time performance while guaranteeing accuracy.

  20. Histogram analysis for smartphone-based rapid hematocrit determination

    Science.gov (United States)

    Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.

    2017-01-01

    A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based “Histogram” app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis shows its effectiveness in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for the quantification of blood hematocrit under both equal and varying optical conditions. The rapid determination of blood hematocrits carries enormous information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569

  1. Cloud-based Jupyter Notebooks for Water Data Analysis

    Science.gov (United States)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative

  2. IMMAN: free software for information theory-based chemometric analysis.

    Science.gov (United States)

    Urias, Ricardo W Pino; Barigye, Stephen J; Marrero-Ponce, Yovani; García-Jacas, César R; Valdes-Martiní, José R; Perez-Gimenez, Facundo

    2015-05-01

    The features and theoretical background of a new and free computational program for chemometric analysis denominated IMMAN (acronym for Information theory-based CheMoMetrics ANalysis) are presented. This is multi-platform software developed in the Java programming language, designed with a remarkably user-friendly graphical interface for the computation of a collection of information-theoretic functions adapted for rank-based unsupervised and supervised feature selection tasks. A total of 20 feature selection parameters are presented, with the unsupervised and supervised frameworks represented by 10 approaches in each case. Several information-theoretic parameters traditionally used as molecular descriptors (MDs) are adapted for use as unsupervised rank-based feature selection methods. On the other hand, a generalization scheme for the previously defined differential Shannon's entropy is discussed, as well as the introduction of Jeffreys information measure for supervised feature selection. Moreover, well-known information-theoretic feature selection parameters, such as information gain, gain ratio, and symmetrical uncertainty are incorporated to the IMMAN software ( http://mobiosd-hub.com/imman-soft/ ), following an equal-interval discretization approach. IMMAN offers data pre-processing functionalities, such as missing values processing, dataset partitioning, and browsing. Moreover, single parameter or ensemble (multi-criteria) ranking options are provided. Consequently, this software is suitable for tasks like dimensionality reduction, feature ranking, as well as comparative diversity analysis of data matrices. Simple examples of applications performed with this program are presented. A comparative study between IMMAN and WEKA feature selection tools using the Arcene dataset was performed, demonstrating similar behavior. In addition, it is revealed that the use of IMMAN unsupervised feature selection methods improves the performance of both IMMAN and WEKA
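
    IMMAN itself is Java software, but the kind of rank-based, information-theoretic (information gain) feature selection it implements can be sketched with scikit-learn as below; the dataset, scorer and cut-off are assumptions for illustration only, not IMMAN's own routines.

      import numpy as np
      from sklearn.datasets import load_breast_cancer
      from sklearn.feature_selection import mutual_info_classif

      # Supervised, rank-based feature selection: score each descriptor by its
      # mutual information with the class labels and keep the top k.
      X, y = load_breast_cancer(return_X_y=True)
      scores = mutual_info_classif(X, y, random_state=0)

      k = 10
      top = np.argsort(scores)[::-1][:k]
      print("top feature indices:", top)
      print("their MI scores:", np.round(scores[top], 3))
      X_reduced = X[:, top]   # reduced matrix for downstream modeling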

  3. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages with business graphics and statistical functions is convenient in the result presentation phase.
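
    The event-list mechanism described above is the heart of any discrete event simulator. The sketch below is a generic single-machine queue with random arrival and processing times, not a reconstruction of SIMMEK itself, and all parameters are made-up values.

      import heapq
      import random

      random.seed(42)

      def simulate(horizon=480.0, mean_interarrival=10.0, mean_service=8.0):
          """Single-machine job shop: exponential arrivals and service times (minutes)."""
          events = [(random.expovariate(1 / mean_interarrival), 'arrival')]
          queue, busy_until, completed = 0, 0.0, 0
          while events:
              time, kind = heapq.heappop(events)
              if time > horizon:
                  break
              if kind == 'arrival':
                  queue += 1
                  heapq.heappush(events, (time + random.expovariate(1 / mean_interarrival), 'arrival'))
              else:
                  completed += 1
              # Start the next job if the machine is free and work is waiting.
              if queue and time >= busy_until:
                  queue -= 1
                  busy_until = time + random.expovariate(1 / mean_service)
                  heapq.heappush(events, (busy_until, 'departure'))
          return completed, queue

      done, wip = simulate()
      print(f"jobs completed: {done}, WIP at end of shift: {wip}")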

  4. Chest wall Ewing sarcoma: a population-based analysis.

    Science.gov (United States)

    Jacobs, Andrew J; Fishbein, Joanna; Levy, Carolyn Fein; Glick, Richard D

    2016-08-01

    The globally low incidence of pediatric chest wall Ewing sarcoma (CWES) has limited prior studies of this disease to mostly small, single-institution reviews. Our objective was to assess incidence, demographics, treatment patterns, and long-term survival of this disease through a population-based analysis. The Surveillance, Epidemiology, and End Results database was used to identify patients aged 0-21 y diagnosed with CWES from 1973 to 2011. Patients were grouped by decade to assess changes in treatment patterns and outcomes. The effects of clinical, demographic, and treatment variables on overall survival (OS) were assessed by the computation of Kaplan-Meier curves and the log-rank test, with Cox proportional hazard regression used for multivariable analysis. A total of 193 pediatric patients with histologically confirmed CWES were identified. The disease was more common in men (61%), whites (92%), and 11- to 17-y olds (49%). It was metastatic at presentation in 37% of patients. When grouped approximately by decade, 10-y OS improved progressively from 38% in 1973-1979 to 65% in 2000-2011 (P = 0.033). The use of radiation decreased from 84% in the earliest period to 40% in the most recent, whereas the proportion of patients receiving surgery increased from 75% to 85%. When controlling for covariates in multivariable analysis, male patients were found to have a higher mortality than female patients (hazard ratio: 2.4; confidence interval: 1.4, 4.4; P = 0.0028). This population-based analysis of CWES demonstrated an impressive trend of improving OS, with increasing use of surgery and decreasing use of radiation therapy. Our study demonstrated a gender difference in survival of CWES, with females having a better prognosis. The presence of metastatic disease is a very important prognostic factor for this illness. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Regional Analysis of Remote Sensing Based Evapotranspiration Information

    Science.gov (United States)

    Geli, H. M. E.; Hain, C.; Anderson, M. C.; Senay, G. B.

    2017-12-01

    Recent research findings on modeling actual evapotranspiration (ET) using remote sensing data and methods have proven the ability of these methods to address a wide range of hydrological and water resources issues, including river basin water balance for improved water resources management, drought monitoring, drought impact and socioeconomic responses, agricultural water management, optimization of land-use for water conservation, and water allocation agreements, among others. However, there is still a critical need to identify the appropriate type of ET information that can address each of these issues. The current trend of increasing demand for water due to population growth, coupled with variable and limited water supply due to drought, especially in arid and semiarid regions, has highlighted the need for such information. To properly address these issues, different spatial and temporal resolutions of ET information will need to be used. For example, agricultural water management applications require ET information at field (30-m) and daily time scales, while for river basin hydrologic analysis relatively coarser spatial and temporal scales can be adequate for such regional applications. The objective of this analysis is to evaluate the potential of using integrated ET information that can be used to address some of these issues collectively. This analysis will highlight efforts to address some of the issues that are applicable to New Mexico, including assessment of the statewide water budget as well as drought impact and socioeconomic responses, which all require ET information but at different spatial and temporal scales. This analysis will provide an evaluation of four remote sensing based ET models including ALEXI, DisALEXI, SSEBop, and SEBAL3.0. The models will be compared with ground-based observations from eddy covariance towers and water balance calculations. Remote sensing data from Landsat, MODIS, and VIIRS sensors will be used to provide ET

  6. Descriptor Based Analysis of Digital 3D Shapes

    DEFF Research Database (Denmark)

    Welnicka, Katarzyna

    Analysis and processing of 3D digital shapes is a significant research area with numerous medical, industrial, and entertainment applications which has gained enormously in importance as optical scanning modalities have started to make acquired 3D geometry commonplace. The area holds many...... challenges. One such challenge, which is addressed in this thesis, is to develop computational methods for classifying shapes which are in agreement with the human way of understanding and classifying shapes. In this dissertation we first present a shape descriptor based on the process of diffusion...

  7. Management of Microbiologically Influenced Corrosion in Risk Based Inspection analysis

    DEFF Research Database (Denmark)

    Skovhus, Torben Lund; Hillier, Elizabeth; Andersen, Erlend S.

    in the offshore industry as a means to justify the inspection strategy adopted. The RBI analysis is a decision-making technique that enables asset managers to identify the risk related to failure of their most critical systems and components, with an effect on safety, environmental and business related issues...... and discussed. From a risk perspective, MIC is not satisfactorily assessed by the current models and the models lack a proper view of the MIC threat. Therefore, a review of known parameters that affect MIC is presented. The mapping and identification of parameters is based on the review of past models...

  8. Measurement Uncertainty Analysis of the Strain Gauge Based Stabilographic Platform

    Directory of Open Access Journals (Sweden)

    Walendziuk Wojciech

    2014-08-01

    Full Text Available The present article describes the construction of a stabilographic platform which records a standing patient's deflection from their point of balance. The constructed device is composed of a toughened glass slab supported by 4 force sensors. The transducers are connected to a measurement system based on a 24-bit ADC which acquires the slight body movements of a patient. The data are then transferred to the computer in real time and data analysis is conducted. The article explains the principle of operation as well as the algorithm of measurement uncertainty for the COP (Centre of Pressure) surface (x, y).
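
    A small sketch of the COP computation for a rectangular platform resting on four force sensors; the 0.4 m corner spacing and the example readings are assumptions, not the dimensions or data of the described device.

      import numpy as np

      # Corner coordinates (m) of the four load cells, origin at the platform centre.
      SENSORS = np.array([[-0.2, -0.2],
                          [ 0.2, -0.2],
                          [ 0.2,  0.2],
                          [-0.2,  0.2]])

      def centre_of_pressure(forces):
          """COP (x, y) from four vertical force readings (N), one per corner."""
          forces = np.asarray(forces, dtype=float)
          total = forces.sum()
          if total <= 0:
              raise ValueError("no load on the platform")
          return SENSORS.T @ forces / total   # force-weighted average of sensor positions

      # Example: a patient leaning slightly towards one corner.
      print(centre_of_pressure([180.0, 260.0, 240.0, 160.0]))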

  9. A MATCH METHOD BASED ON LATENT SEMANTIC ANALYSIS FOR EARTHQUAKE HAZARD EMERGENCY PLAN

    Directory of Open Access Journals (Sweden)

    D. Sun

    2017-09-01

    Full Text Available The structure of an earthquake emergency plan is complex, and it is difficult for a decision maker to make a decision in a short time. To solve this problem, this paper presents a match method based on Latent Semantic Analysis (LSA). After word segmentation preprocessing of the emergency plan, we carry out keyword extraction according to the part-of-speech and the frequency of words. Then, through LSA, we map the documents and query information into the semantic space, and calculate the correlation of documents and queries from the relation between vectors. The experimental results indicate that LSA can improve the accuracy of emergency plan retrieval efficiently.
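
    A compact sketch of the LSA matching step using scikit-learn: documents and queries are projected into a low-rank semantic space via truncated SVD of a TF-IDF matrix and ranked by cosine similarity. The toy corpus, English stop-word list and component count are assumptions, not the paper's Chinese-language pipeline.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      plans = [
          "evacuate residents and open emergency shelters after a strong earthquake",
          "inspect dams and reservoirs for cracks following seismic activity",
          "restore power grid and communication lines in the affected area",
      ]
      query = ["shelter and evacuation after earthquake"]

      # Term-document matrix, then projection into a 2-dimensional latent semantic space.
      vectorizer = TfidfVectorizer(stop_words="english")
      X = vectorizer.fit_transform(plans)
      svd = TruncatedSVD(n_components=2, random_state=0)
      plans_lsa = svd.fit_transform(X)
      query_lsa = svd.transform(vectorizer.transform(query))

      scores = cosine_similarity(query_lsa, plans_lsa)[0]
      best = scores.argmax()
      print(f"best matching plan: {best} (score {scores[best]:.2f})")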

  10. [A new calibration transfer method based on target factor analysis].

    Science.gov (United States)

    Wang, Yan-bin; Yuan, Hong-fu; Lu, Wan-zhen

    2005-03-01

    A new calibration transfer method based on target factor analysis is proposed. The performance of the new method is compared with that of the piecewise direct standardization (PDS) method. The method was applied to two data sets, of which one is a simulation data set, and the other is an NIR data set composed of benzene, toluene, xylene and isooctane. The results obtained with this new method are at least as good as those obtained by PDS, with the biggest improvement occurring when the spectra have some non-linear responses.

  11. Electromagnetic fields from mobile phone base station - variability analysis.

    Science.gov (United States)

    Bienkowski, Pawel; Zubrzak, Bartlomiej

    2015-09-01

    The article describes the character of the electromagnetic field (EMF) in mobile phone base station (BS) surroundings and its variability in time, with an emphasis on the measurement difficulties related to its pulsed and multi-frequency nature. The work also presents long-term monitoring measurements performed recently at different locations in Poland: a small city with dispersed building development and a major Polish city with a dense urban area. The authors tried to determine trends in the EMF spectrum by analyzing daily changes of the measured EMF levels at those locations. The research was performed using selective electromagnetic meters and also an EMF meter with spectrum analysis.

  12. Analysis, optimization and implementation of a variable retardance based polarimeter

    Directory of Open Access Journals (Sweden)

    Moreno I.

    2010-06-01

    Full Text Available We present a comprehensive analysis, optimization and implementation of a Stokes polarimeter based on two liquid crystals acting as variable retarders. For the optimization process, the Condition Number or the Equally Weighted Variance indicators are applied and compared as a function of the number of polarization analyzers. Moreover, some of the optimized polarimeter configurations are experimentally implemented, and the influence of small experimental deviations from the optimized configuration values on the amplification of the Stokes component error is also studied. Some experimental results obtained by using the implemented polarimeters, when measuring different incident states of polarization, are provided.

  13. Quantitative analysis of lead in polysulfide-based impression material

    Directory of Open Access Journals (Sweden)

    Aparecida Silva Braga

    2007-06-01

    Full Text Available Permlastic® is a polysulfide-based impression material widely used by dentists in Brazil. It is composed of a base paste and a catalyzer containing lead dioxide. The high toxicity of lead to humans is grounds for much concern, since it can attack various systems and organs. The present study involved a quantitative analysis of the concentration of lead in the material Permlastic®. The lead was determined by plasma-induced optical emission spectrometry (Varian, model Vista). The percentages of lead found in the two analyzed lots were 38.1 and 40.8%. The lead concentrations in the material under study were high, but the product’s packaging contained no information about these concentrations.

  14. Linear feature selection in texture analysis - A PLS based method

    DEFF Research Database (Denmark)

    Marques, Joselene; Igel, Christian; Lillholm, Martin

    2013-01-01

    We present a texture analysis methodology that combines uncommitted machine-learning techniques and partial least squares (PLS) in a fully automatic framework. Our approach introduces a robust PLS-based dimensionality reduction (DR) step to specifically address outliers and high-dimensional feature......, which first applied a PLS regression to rank the features and then defined the best number of features to retain in the model by an iterative learning phase. The outliers in the dataset, that could inflate the number of selected features, were eliminated by a pre-processing step. To cope...... and considering all CV groups, the methods selected 36 % of the original features available. The diagnosis evaluation reached a generalization area-under-the-ROC curve of 0.92, which was higher than established cartilage-based markers known to relate to OA diagnosis....
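
    The core of the described DR step, fitting a PLS regression and ranking features by their contribution, can be sketched with scikit-learn as below; the synthetic data, the three latent components and the cut-off of 10 features are assumptions, not the study's protocol.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)

      # Synthetic high-dimensional texture features: only the first 5 carry signal.
      n, p = 120, 200
      X = rng.standard_normal((n, p))
      y = X[:, :5] @ rng.standard_normal(5) + 0.5 * rng.standard_normal(n)

      pls = PLSRegression(n_components=3)
      pls.fit(X, y)

      # Rank features by the magnitude of their PLS regression coefficients.
      ranking = np.argsort(np.abs(pls.coef_).ravel())[::-1]
      selected = ranking[:10]
      print("top-ranked features:", selected)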

  15. Content and user-based music visual analysis

    Science.gov (United States)

    Guo, Xiaochun; Tang, Lei

    2015-12-01

    In recent years, people's ability to collect music has been greatly enhanced. Many people who prefer listening to music offline store thousands of songs on their local storage or portable devices. However, their ability to deal with music information has not improved accordingly, which results in two problems. One is how to find favourite songs in a large music dataset while satisfying different individuals. The other is how to compose a play list quickly. To solve these problems, the authors propose a content and user-based music visual analysis approach. We first developed a new recommendation algorithm based on the content of music and the user's behaviour, which satisfies individual preferences. Then, we make use of visualization and interaction tools to illustrate the relationships between songs and help people compose a suitable play list. At the end of this paper, a survey is presented to show that our system is usable and effective.

  16. Independent Component Analysis applied to Ground-based observations

    Science.gov (United States)

    Martins-Filho, Walter; Griffith, Caitlin; Pearson, Kyle; Waldmann, Ingo; Alvarez-Candal, Alvaro; Zellem, Robert Thomas

    2018-01-01

    Transit measurements of Jovian-sized exoplanetary atmospheres allow one to study the composition of exoplanets, largely independent of the planet’s temperature profile. However, measurements of hot-Jupiter transits must achieve a level of accuracy in the flux sufficient to determine the spectral modulation of the exoplanetary atmosphere. To accomplish this level of precision, we need to extract systematic errors, and, for ground-based measurements, the effects of Earth’s atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. The effects of the terrestrial atmosphere and some of the time-dependent systematic errors of ground-based transit measurements are treated mainly by dividing the host star by a reference star at each wavelength and time step of the transit. Recently, Independent Component Analysis (ICA) has been used to remove systematic effects from the raw data of space-based observations (Waldmann, 2014, 2012; Morello et al., 2016, 2015). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). This technique requires no additional prior knowledge of the data set. In addition, this technique has the advantage of requiring no reference star. Here we apply ICA to ground-based photometry of the exoplanet XO-2b recorded by the 61” Kuiper Telescope and compare the results of the ICA to those of a previous analysis from Zellem et al. (2015), which does not use ICA. We also simulate the effects of various conditions (concerning the systematic errors, noise and the stability of the object on the detector) to determine the conditions under which ICA can be used with high precision to extract the light curve of exoplanetary photometry measurements.
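
    A minimal sketch of the ICA step applied to a set of simultaneously observed light curves: FastICA separates a shared systematic trend from the transit signal without a reference star. The synthetic transit, airmass-like trend, mixing weights and noise level are assumptions for illustration, not the XO-2b data reduction.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(3)
      t = np.linspace(-0.1, 0.1, 400)

      transit = np.where(np.abs(t) < 0.03, -0.01, 0.0)       # 1% transit dip
      systematic = 0.02 * (t - t.min()) ** 0.5                # airmass-like trend
      noise = lambda: 5e-4 * rng.standard_normal(t.size)

      # Several detector time series that mix the transit and the shared systematics.
      observations = np.column_stack([
          1.0 + transit + 0.8 * systematic + noise(),
          1.0 + 0.2 * transit + 1.1 * systematic + noise(),
          1.0 + 0.9 * systematic + noise(),
      ])

      ica = FastICA(n_components=2, random_state=0)
      sources = ica.fit_transform(observations)   # columns = independent source signals

      # The component most correlated with a box shape is the transit candidate.
      corr = [abs(np.corrcoef(s, transit)[0, 1]) for s in sources.T]
      print("recovered transit component index:", int(np.argmax(corr)))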

  17. Phishing Detection: Analysis of Visual Similarity Based Approaches

    Directory of Open Access Journals (Sweden)

    Ankit Kumar Jain

    2017-01-01

    Full Text Available Phishing is one of the major problems faced by the cyber-world and leads to financial losses for both industries and individuals. Detection of phishing attacks with high accuracy has always been a challenging issue. At present, visual similarity based techniques are very useful for detecting phishing websites efficiently. A phishing website looks very similar in appearance to its corresponding legitimate website in order to deceive users into believing that they are browsing the correct website. Visual similarity based phishing detection techniques utilise features like text content, text format, HTML tags, Cascading Style Sheets (CSS), images, and so forth, to make the decision. These approaches compare the suspicious website with the corresponding legitimate website by using various features, and if the similarity is greater than a predefined threshold value then it is declared phishing. This paper presents a comprehensive analysis of phishing attacks, their exploitation, some of the recent visual similarity based approaches for phishing detection, and a comparative study of them. Our survey provides a better understanding of the problem, the current solution space, and the scope of future research to deal with phishing attacks efficiently using visual similarity based approaches.

  18. Semiconductor product analysis challenges based on the 1999 ITRS

    International Nuclear Information System (INIS)

    Joseph, Thomas W.; Anderson, Richard E.; Gilfeather, Glen; LeClaire, Carole; Yim, Daniel

    2001-01-01

    One of the most significant challenges for technology characterization and failure analysis is to keep instrumentation and techniques in step with the development of technology itself. Not only are dimensions shrinking and new materials being employed, but the rate of change is increasing. According to the 1999 International Technology Roadmap for Semiconductors, 'The number and difficulty of the technical challenges continue to increase as technology moves forward'. It could be argued that technology cannot be developed without appropriate analytical techniques; nevertheless while much effort is being directed at materials and processes, only a small proportion is being directed at analysis. Whereas previous versions of the Semiconductor Industry Association roadmap contained a small number of implicit references to characterization and analysis, the 1999 ITRS contains many explicit references. It is clear that characterization is now woven through the roadmap, and technology developers in all areas appreciate the fact that new instrumentation and techniques will be required to sustain the rate of development the semiconductor industry has seen in recent years. Late in 1999, a subcommittee of the Sematech Product Analysis Forum (PAF) reviewed the ITRS and identified a 'top-ten' list of challenges which the failure analysis community will face as present technologies are extended and future technologies are developed. This paper discusses the PAF top-ten list of challenges, which is based primarily on the Difficult Challenges tables from each ITRS working group. Eight of the top-ten are challenges of significant technical magnitude; only two could be considered non-technical in nature. Most of these challenges cut across several working group areas and could be considered common threads in the roadmap, ranging from fault simulation and modeling to imaging small features, from electrical defect isolation to deprocessing. While evolutionary changes can be anticipated

  19. AntDAS: Automatic Data Analysis Strategy for UPLC-QTOF-Based Nontargeted Metabolic Profiling Analysis.

    Science.gov (United States)

    Fu, Hai-Yan; Guo, Xiao-Ming; Zhang, Yue-Ming; Song, Jing-Jing; Zheng, Qing-Xia; Liu, Ping-Ping; Lu, Peng; Chen, Qian-Si; Yu, Yong-Jie; She, Yuanbin

    2017-10-17

    High-quality data analysis methodology remains a bottleneck for metabolic profiling analysis based on ultraperformance liquid chromatography-quadrupole time-of-flight mass spectrometry. The present work aims to address this problem by proposing a novel data analysis strategy wherein (1) chromatographic peaks in the UPLC-QTOF data set are automatically extracted by using an advanced multiscale Gaussian smoothing-based peak extraction strategy; (2) a peak annotation stage is used to cluster fragment ions that belong to the same compound. With the aid of high-resolution mass spectrometer, (3) a time-shift correction across the samples is efficiently performed by a new peak alignment method; (4) components are registered by using a newly developed adaptive network searching algorithm; (5) statistical methods, such as analysis of variance and hierarchical cluster analysis, are then used to identify the underlying marker compounds; finally, (6) compound identification is performed by matching the extracted peak information, involving high-precision m/z and retention time, against our compound library containing more than 500 plant metabolites. A manually designed mixture of 18 compounds is used to evaluate the performance of the method, and all compounds are detected under various concentration levels. The developed method is comprehensively evaluated by an extremely complex plant data set containing more than 2000 components. Results indicate that the performance of the developed method is comparable with the XCMS. The MATLAB GUI code is available from http://software.tobaccodb.org/software/antdas .
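
    The peak-extraction stage (step 1) can be illustrated with SciPy: smooth the chromatogram with Gaussian kernels at a few scales and keep peaks that persist across scales. This is a generic sketch, not AntDAS's actual multiscale algorithm, and the scales, height threshold and tolerance are assumptions.

      import numpy as np
      from scipy.ndimage import gaussian_filter1d
      from scipy.signal import find_peaks

      rng = np.random.default_rng(7)
      t = np.arange(2000)

      # Synthetic chromatogram: three Gaussian peaks on a noisy baseline.
      signal = sum(h * np.exp(-0.5 * ((t - c) / w) ** 2)
                   for h, c, w in [(5.0, 400, 12), (3.0, 900, 8), (4.0, 1500, 15)])
      signal = signal + 0.15 * rng.standard_normal(t.size)

      def multiscale_peaks(y, scales=(3, 6, 12), min_height=1.0, tol=10):
          """Keep peaks found at the smallest scale that survive heavier smoothing."""
          per_scale = [find_peaks(gaussian_filter1d(y, s), height=min_height)[0] for s in scales]
          if any(len(p) == 0 for p in per_scale):
              return np.array([], dtype=int)
          kept = [p for p in per_scale[0]
                  if all(np.min(np.abs(other - p)) <= tol for other in per_scale[1:])]
          return np.array(kept)

      print(multiscale_peaks(signal))   # approximately [400, 900, 1500]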

  20. Flood Risk Assessment Based On Security Deficit Analysis

    Science.gov (United States)

    Beck, J.; Metzger, R.; Hingray, B.; Musy, A.

    Risk is a human perception: a given risk may be considered as acceptable or unacceptable depending on the group that has to face that risk. Flood risk analysis often estimates economic losses from damages, but neglects the question of acceptable/unacceptable risk. With input from land use managers, politicians and other stakeholders, risk assessment based on security deficit analysis determines objects with unacceptable risk and their degree of security deficit. Such a risk assessment methodology, initially developed by the Swiss federal authorities, is illustrated by its application on a reach of the Alzette River (Luxembourg) in the framework of the IRMA-SPONGE FRHYMAP project. Flood risk assessment always involves a flood hazard analysis, an exposed object vulnerability analysis, and an analysis combining the results of these two previous analyses. The flood hazard analysis was done with the quasi-2D hydraulic model FldPln to produce flood intensity maps. Flood intensity was determined by the water height and velocity. Object data for the vulnerability analysis, provided by the Luxembourg government, were classified according to their potential damage. Potential damage is expressed in terms of direct, human life and secondary losses. A thematic map was produced to show the object classification. Protection goals were then attributed to the object classes. Protection goals are assigned in terms of an acceptable flood intensity for a certain flood frequency. This is where input from land use managers and politicians comes into play. The perception of risk in the region or country influences the protection goal assignment. Protection goals as used in Switzerland were used in this project. Thematic maps showing the protection goals of each object in the case study area for a given flood frequency were produced. Comparison between an object's protection goal and the intensity of the flood that touched the object determines the acceptability of the risk and the

  1. Streaming Support for Data Intensive Cloud-Based Sequence Analysis

    Science.gov (United States)

    Issa, Shadi A.; Kienzler, Romeo; El-Kalioby, Mohamed; Tonellato, Peter J.; Wall, Dennis; Bruggmann, Rémy; Abouelhoda, Mohamed

    2013-01-01

    Cloud computing provides a promising solution to the genomics data deluge problem resulting from the advent of next-generation sequencing (NGS) technology. Based on the concepts of “resources-on-demand” and “pay-as-you-go”, scientists with no or limited infrastructure can have access to scalable and cost-effective computational resources. However, the large size of NGS data causes a significant data transfer latency from the client's site to the cloud, which presents a bottleneck for using cloud computing services. In this paper, we provide a streaming-based scheme to overcome this problem, where the NGS data is processed while being transferred to the cloud. Our scheme targets the wide class of NGS data analysis tasks, where the NGS sequences can be processed independently from one another. We also provide the elastream package that supports the use of this scheme with individual analysis programs or with workflow systems. Experiments presented in this paper show that our solution mitigates the effect of data transfer latency and saves both time and cost of computation. PMID:23710461

  2. Bayesian analysis of physiologically based toxicokinetic and toxicodynamic models.

    Science.gov (United States)

    Hack, C Eric

    2006-04-17

    Physiologically based toxicokinetic (PBTK) and toxicodynamic (TD) models of bromate in animals and humans would improve our ability to accurately estimate the toxic doses in humans based on available animal studies. These mathematical models are often highly parameterized and must be calibrated in order for the model predictions of internal dose to adequately fit the experimentally measured doses. Highly parameterized models are difficult to calibrate and it is difficult to obtain accurate estimates of uncertainty or variability in model parameters with commonly used frequentist calibration methods, such as maximum likelihood estimation (MLE) or least squared error approaches. The Bayesian approach called Markov chain Monte Carlo (MCMC) analysis can be used to successfully calibrate these complex models. Prior knowledge about the biological system and associated model parameters is easily incorporated in this approach in the form of prior parameter distributions, and the distributions are refined or updated using experimental data to generate posterior distributions of parameter estimates. The goal of this paper is to give the non-mathematician a brief description of the Bayesian approach and Markov chain Monte Carlo analysis, how this technique is used in risk assessment, and the issues associated with this approach.
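
    A bare-bones illustration of MCMC calibration on a one-compartment kinetic model: a random-walk Metropolis sampler updates the elimination rate given noisy concentration data. The synthetic data, prior and proposal width are assumptions; real PBTK work would use a full sampler (e.g. in Stan or MCSim) and many parameters.

      import numpy as np

      rng = np.random.default_rng(0)

      # Synthetic "observed" concentrations from C(t) = D * exp(-k t) with k_true = 0.3.
      times = np.array([0.5, 1, 2, 4, 8, 12], dtype=float)
      dose, k_true, sigma = 10.0, 0.3, 0.3
      observed = dose * np.exp(-k_true * times) + sigma * rng.standard_normal(times.size)

      def log_posterior(k):
          if k <= 0:
              return -np.inf
          pred = dose * np.exp(-k * times)
          log_lik = -0.5 * np.sum((observed - pred) ** 2) / sigma ** 2
          log_prior = -0.5 * np.log(k) ** 2      # standard normal prior on log(k)
          return log_lik + log_prior

      def metropolis(n_iter=20000, step=0.05):
          samples, k = [], 1.0
          lp = log_posterior(k)
          for _ in range(n_iter):
              k_new = k + step * rng.standard_normal()
              lp_new = log_posterior(k_new)
              if np.log(rng.uniform()) < lp_new - lp:    # accept/reject
                  k, lp = k_new, lp_new
              samples.append(k)
          return np.array(samples[n_iter // 2:])         # discard burn-in

      posterior = metropolis()
      print(f"posterior mean k = {posterior.mean():.3f} (true {k_true})")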

  3. Analysis of Android Device-Based Solutions for Fall Detection.

    Science.gov (United States)

    Casilari, Eduardo; Luque, Rafael; Morón, María-José

    2015-07-23

    Falls are a major cause of health and psychological problems as well as hospitalization costs among older adults. Thus, the investigation of automatic Fall Detection Systems (FDSs) has received special attention from the research community during the last decade. In this area, the widespread popularity, decreasing price, computing capabilities, built-in sensors and multiplicity of wireless interfaces of Android-based devices (especially smartphones) have fostered the adoption of this technology to deploy wearable and inexpensive architectures for fall detection. This paper presents a critical and thorough analysis of those existing fall detection systems that are based on Android devices. The review systematically classifies and compares the proposals in the literature taking into account different criteria such as the system architecture, the employed sensors, the detection algorithm or the response in case of a fall alarm. The study emphasizes the analysis of the evaluation methods that are employed to assess the effectiveness of the detection process. The review reveals the complete lack of a reference framework to validate and compare the proposals. In addition, the study also shows that most research works do not evaluate the actual applicability of the Android devices (with limited battery and computing resources) to fall detection solutions.
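
    Many of the surveyed detection algorithms reduce to a threshold test on the accelerometer magnitude: a near-free-fall dip followed shortly by a large impact spike. The sketch below illustrates that common pattern on synthetic samples; the thresholds, sampling rate and window are assumptions, not values recommended by the review.

      import numpy as np

      G = 9.81

      def detect_fall(acc_xyz, rate_hz=50, free_fall_g=0.4, impact_g=2.5, window_s=1.0):
          """True if a free-fall dip is followed by an impact spike within the window."""
          magnitude = np.linalg.norm(acc_xyz, axis=1) / G     # in units of g
          window = int(window_s * rate_hz)
          dips = np.flatnonzero(magnitude < free_fall_g)
          for i in dips:
              if np.any(magnitude[i:i + window] > impact_g):
                  return True
          return False

      # Synthetic trace: 1 g standing, a short free-fall phase, then a 3 g impact.
      standing = np.tile([0.0, 0.0, G], (100, 1))
      falling = np.tile([0.0, 0.0, 0.5], (15, 1))
      impact = np.tile([0.0, 0.0, 3.0 * G], (5, 1))
      trace = np.vstack([standing, falling, impact, standing])
      print(detect_fall(trace))   # True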

  4. Research on cloud-based remote measurement and analysis system

    Science.gov (United States)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing and its convergence with technologies such as cloud storage, cloud push and mobile computing allows for the creation and delivery of new types of cloud services. Combining this with the ideas of cloud computing, this paper presents a cloud-based remote measurement and analysis system. The system mainly consists of three parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. The system is a special website developed using ASP.NET and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S and C/S. This platform supplies condition monitoring and data analysis services to customers over the Internet and is deployed on the cloud server. The signal acquisition device is responsible for data collection (sensor data, audio, video, etc.) and pushes the monitoring data to the cloud storage database regularly. Data acquisition equipment in this system only needs data collection and networking functions, such as a smartphone or a smart sensor. The system's scale can adjust dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  5. Principle-based concept analysis: intentionality in holistic nursing theories.

    Science.gov (United States)

    Aghebati, Nahid; Mohammadi, Eesa; Ahmadi, Fazlollah; Noaparast, Khosrow Bagheri

    2015-03-01

    This is a report of a principle-based concept analysis of intentionality in holistic nursing theories. A principle-based concept analysis method was used to analyze seven holistic theories. The data included eight books and 31 articles (1998-2011), which were retrieved through MEDLINE and CINAHL. Erickson, Kriger, Parse, Watson, and Zahourek define intentionality as a capacity, a focused consciousness, and a pattern of human being. Rogers and Newman do not explicitly mention intentionality; however, they do explain pattern and consciousness (epistemology). Intentionality has been operationalized as a core concept of nurse-client relationships (pragmatic). The theories are consistent on intentionality as a noun and as an attribute of the person-intentionality is different from intent and intention (linguistic). There is ambiguity concerning the boundaries between intentionality and consciousness (logic). Theoretically, intentionality is an evolutionary capacity to integrate human awareness and experience. Because intentionality is an individualized concept, we introduced it as "a matrix of continuous known changes" that emerges in two forms: as a capacity of human being and as a capacity of transpersonal caring. This study has produced a theoretical definition of intentionality and provides a foundation for future research to further investigate intentionality to better delineate its boundaries. © The Author(s) 2014.

  6. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2018-03-01

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
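
    A minimal sketch of the serial PCA idea, assuming scikit-learn as tooling: linear principal components are extracted first, and kernel PCA is then fitted on the reconstruction residuals. The component counts, the RBF kernel and its gamma are placeholders rather than the authors' settings, and the monitoring statistics built on these features are omitted.

        import numpy as np
        from sklearn.decomposition import PCA, KernelPCA

        def serial_pca_features(X_train, X_test, n_linear=5, n_kernel=5, gamma=0.01):
            """Linear PCs from PCA, then nonlinear PCs from KPCA fitted on the residual subspace."""
            pca = PCA(n_components=n_linear).fit(X_train)
            t_train, t_test = pca.transform(X_train), pca.transform(X_test)     # linear features
            # Residuals: the part of the data not explained by the linear PCs.
            r_train = X_train - pca.inverse_transform(t_train)
            r_test = X_test - pca.inverse_transform(t_test)
            kpca = KernelPCA(n_components=n_kernel, kernel="rbf", gamma=gamma).fit(r_train)
            return (t_train, kpca.transform(r_train)), (t_test, kpca.transform(r_test))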

  7. Applying model analysis to a resource-based analysis of the Force and Motion Conceptual Evaluation

    Directory of Open Access Journals (Sweden)

    Trevor I. Smith

    2014-07-01

    Full Text Available Previously, we analyzed the Force and Motion Conceptual Evaluation in terms of a resources-based model that allows for clustering of questions so as to provide useful information on how students correctly or incorrectly reason about physics. In this paper, we apply model analysis to show that the associated model plots provide more information regarding the results of investigations using these question clusters than normalized gain graphs. We provide examples from two different institutions to show how the use of model analysis with our redefined clusters can provide previously hidden insight into the effectiveness of instruction.

  8. Heating Analysis in Constant-pressure Hydraulic System based on Energy Analysis

    Science.gov (United States)

    Wu, Chao; Xu, Cong; Mao, Xuyao; Li, Bin; Hu, Junhua; Liu, Yiou

    2017-12-01

    Hydraulic systems are widely used in industrial applications, but heating has become an important factor restricting the wider adoption of hydraulic technology. High temperatures seriously affect the operation of a hydraulic system and may even cause seizure and other serious failures. Based on an analysis of the heat damage in hydraulic systems, this paper identifies the causes of the problem, and an application example shows that energy analysis can accurately locate the main sources of heating in a hydraulic system, which provides strong practical guidance.

  9. A Web-Based Development Environment for Collaborative Data Analysis

    CERN Document Server

    Erdmann, M; Glaser, C; Klingebiel, D; Komm, M; Müller, G; Rieger, M; Steggemann, J; Urban, M; Winchen, T

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually.

  10. Coordinate-based versus structural approaches to brain image analysis.

    Science.gov (United States)

    Mangin, J-F; Rivière, D; Coulon, O; Poupon, C; Cachia, A; Cointepas, Y; Poline, J-B; Le Bihan, D; Régis, J; Papadopoulos-Orfanos, D

    2004-02-01

    A basic issue in neurosciences is to look for possible relationships between brain architecture and cognitive models. The lack of architectural information in magnetic resonance images, however, has led the neuroimaging community to develop brain mapping strategies based on various coordinate systems without accurate architectural content. Therefore, the relationships between architectural and functional brain organizations are difficult to study when analyzing neuroimaging experiments. This paper advocates that the design of new brain image analysis methods inspired by the structural strategies often used in computer vision may provide better ways to address these relationships. The key point underlying this new framework is the conversion of the raw images into structural representations before analysis. These representations are made up of data-driven elementary features like activated clusters, cortical folds or fiber bundles. Two classes of methods are introduced. Inference of structural models via matching across a set of individuals is described first. This inference problem is illustrated by the group analysis of functional statistical parametric maps (SPMs). Then, the matching of new individual data with a priori known structural models is described, using the recognition of the cortical sulci as a prototypical example.

  11. A Web-Based Development Environment for Collaborative Data Analysis

    Science.gov (United States)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  12. A Web-Based Development Environment for Collaborative Data Analysis

    International Nuclear Information System (INIS)

    Erdmann, M; Fischer, R; Glaser, C; Klingebiel, D; Müller, G; Rieger, M; Urban, M; Winchen, T; Komm, M; Steggemann, J

    2014-01-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable

  13. Asymptotic performance of regularized quadratic discriminant analysis based classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-12-13

    This paper carries out a large dimensional analysis of the standard regularized quadratic discriminant analysis (QDA) classifier designed on the assumption that data arise from a Gaussian mixture model. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under some mild assumptions, we show that the asymptotic classification error converges to a deterministic quantity that depends only on the covariances and means associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized QDA and can be used to determine the optimal regularization parameter that minimizes the misclassification error probability. Despite being valid only for Gaussian data, our theoretical findings are shown to yield a high accuracy in predicting the performance achieved with real data sets drawn from popular databases, thereby making an interesting connection between theory and practice.
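
    As an empirical companion to this kind of analysis, scikit-learn's regularized QDA exposes the shrinkage parameter directly; the synthetic Gaussian mixture below (dimensions, means and scales) is invented solely to illustrate sweeping the regularization parameter and does not reproduce the paper's asymptotic results.

        import numpy as np
        from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis

        rng = np.random.default_rng(0)
        p = 100                                                   # number of features

        def sample(n_per_class):
            X0 = rng.normal(0.0, 1.0, size=(n_per_class, p))      # class 0: N(0, I)
            X1 = rng.normal(0.3, 1.2, size=(n_per_class, p))      # class 1: shifted mean, scaled covariance
            return np.vstack([X0, X1]), np.repeat([0, 1], n_per_class)

        Xtr, ytr = sample(200)
        Xte, yte = sample(2000)
        for reg in (0.0, 0.1, 0.5, 0.9):                          # sweep the regularization parameter
            err = 1.0 - QuadraticDiscriminantAnalysis(reg_param=reg).fit(Xtr, ytr).score(Xte, yte)
            print(f"reg_param={reg:.1f}  test error={err:.3f}")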

  14. Glyph-Based Video Visualization for Semen Analysis

    KAUST Repository

    Duffy, Brian

    2015-08-01

    © 2013 IEEE. The existing efforts in computer assisted semen analysis have been focused on high speed imaging and automated image analysis of sperm motility. This results in a large amount of data, and it is extremely challenging for both clinical scientists and researchers to interpret, compare and correlate the multidimensional and time-varying measurements captured from video data. In this work, we use glyphs to encode a collection of numerical measurements taken at a regular interval and to summarize spatio-temporal motion characteristics using static visual representations. The design of the glyphs addresses the needs for (a) encoding some 20 variables using separable visual channels, (b) supporting scientific observation of the interrelationships between different measurements and comparison between different sperm cells and their flagella, and (c) facilitating the learning of the encoding scheme by making use of appropriate visual abstractions and metaphors. As a case study, we focus this work on video visualization for computer-aided semen analysis, which has a broad impact on both biological sciences and medical healthcare. We demonstrate that glyph-based visualization can serve as a means of external memorization of video data as well as an overview of a large set of spatiotemporal measurements. It enables domain scientists to make scientific observation in a cost-effective manner by reducing the burden of viewing videos repeatedly, while providing them with a new visual representation for conveying semen statistics.

  15. SVM-based glioma grading. Optimization by feature reduction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zoellner, Frank G.; Schad, Lothar R. [University Medical Center Mannheim, Heidelberg Univ., Mannheim (Germany). Computer Assisted Clinical Medicine; Emblem, Kyrre E. [Massachusetts General Hospital, Charlestown, A.A. Martinos Center for Biomedical Imaging, Boston MA (United States). Dept. of Radiology; Harvard Medical School, Boston, MA (United States); Oslo Univ. Hospital (Norway). The Intervention Center

    2012-11-01

    We investigated the predictive power of feature reduction analysis approaches in support vector machine (SVM)-based classification of glioma grade. In 101 untreated glioma patients, three analytic approaches were evaluated to derive an optimal reduction in features: (i) Pearson's correlation coefficients (PCC), (ii) principal component analysis (PCA) and (iii) independent component analysis (ICA). Tumor grading was performed using a previously reported SVM approach including whole-tumor cerebral blood volume (CBV) histograms and patient age. Best classification accuracy was found using PCA at 85% (sensitivity = 89%, specificity = 84%) when reducing the feature vector from 101 (100-bins rCBV histogram + age) to 3 principal components. In comparison, classification accuracy by PCC was 82% (89%, 77%, 2 dimensions) and 79% by ICA (87%, 75%, 9 dimensions). For improved speed (up to 30%) and simplicity, feature reduction by all three methods provided similar classification accuracy to literature values (∼87%) while reducing the number of features by up to 98%. (orig.)
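
    The feature-reduction-plus-SVM workflow maps naturally onto a scikit-learn pipeline. The sketch below only mirrors the shape of the data (a 100-bin rCBV histogram plus age for 101 patients) with random numbers and invented labels; it illustrates the workflow, not the authors' trained classifier.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(1)
        X = rng.random((101, 101))                 # 101 patients x (100-bin rCBV histogram + age), synthetic
        y = rng.integers(0, 2, size=101)           # synthetic low-grade / high-grade labels

        clf = make_pipeline(StandardScaler(), PCA(n_components=3), SVC(kernel="rbf"))
        print(cross_val_score(clf, X, y, cv=5).mean())   # accuracy using 3 principal components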

  16. Cost Risk Analysis Based on Perception of the Engineering Process

    Science.gov (United States)

    Dean, Edwin B.; Wood, Darrell A.; Moore, Arlene A.; Bogart, Edward H.

    1986-01-01

    In most cost estimating applications at the NASA Langley Research Center (LaRC), it is desirable to present predicted cost as a range of possible costs rather than a single predicted cost. A cost risk analysis generates a range of cost for a project and assigns a probability level to each cost value in the range. Constructing a cost risk curve requires a good estimate of the expected cost of a project. It must also include a good estimate of expected variance of the cost. Many cost risk analyses are based upon an expert's knowledge of the cost of similar projects in the past. In a common scenario, a manager or engineer, asked to estimate the cost of a project in his area of expertise, will gather historical cost data from a similar completed project. The cost of the completed project is adjusted using the perceived technical and economic differences between the two projects. This allows errors from at least three sources. The historical cost data may be in error by some unknown amount. The managers' evaluation of the new project and its similarity to the old project may be in error. The factors used to adjust the cost of the old project may not correctly reflect the differences. Some risk analyses are based on untested hypotheses about the form of the statistical distribution that underlies the distribution of possible cost. The usual problem is not just to come up with an estimate of the cost of a project, but to predict the range of values into which the cost may fall and with what level of confidence the prediction is made. Risk analysis techniques that assume the shape of the underlying cost distribution and derive the risk curve from a single estimate plus and minus some amount usually fail to take into account the actual magnitude of the uncertainty in cost due to technical factors in the project itself. This paper addresses a cost risk method that is based on parametric estimates of the technical factors involved in the project being costed. The engineering

  17. Technoeconomic analysis of a biomass based district heating system

    International Nuclear Information System (INIS)

    Zhang, H.; Ugursal, V.I.; Fung, A.

    2005-01-01

    This paper discussed a proposed biomass-based district heating system to be built for the Pictou Landing First Nation Community in Nova Scotia. The community centre consists of 6 buildings and a connecting arcade. The methodology used to size and design heating, ventilating and air conditioning (HVAC) systems, as well as biomass district energy systems (DES), was discussed. Annual energy requirements and biomass fuel consumption predictions were presented, along with cost estimates. A comparative assessment of the system with that of a conventional oil fired system was also conducted. It was suggested that the design and analysis methodology could be used for any similar application. The buildings were modelled and simulated using the Hourly Analysis Program (HAP), a detailed 2-in-1 software program which can be used both for HVAC system sizing and building energy consumption estimation. A techno-economic analysis was conducted to justify the viability of the biomass combustion system. Heating load calculations were performed assuming that the thermostat was set constantly at 22 degrees C. Community centre space heating loads due to individual envelope components for 3 different scenarios were summarized, as the design architecture for the buildings was not yet finalized. It was suggested that efforts should be made to ensure air-tightness and insulation levels of the interior arcade glass wall. A hydronic distribution system with baseboard space heating units was selected, comprising a woodchip boiler, hot water distribution system, convective heating units and control systems. The community has its own logging operation which will provide the wood fuel required by the proposed system. An outline of the annual allowable harvest covered by the Pictou Landing Forestry Management Plan was presented, with details of proposed wood-chippers for the creation of biomass. It was concluded that the woodchip combustion system is economically preferable to the

  18. Overview description of the base scenario derived from FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    , subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. This report uses the conceptual models developed from the FEP analysis to present a description of the base scenario, in terms of the processes to be represented in detailed models. This report does not present an assessment of the base scenario, but rather seeks to provide a summary of those features, events and processes that should be represented, at an appropriate level of detail, within numerical models. The requirements for the development of appropriate models for representing the base scenario are described in an underlying report within the model development document suite. (author)

  19. Variance-based sensitivity analysis for wastewater treatment plant modelling.

    Science.gov (United States)

    Cosenza, Alida; Mannina, Giorgio; Vanrolleghem, Peter A; Neumann, Marc B

    2014-02-01

    Global sensitivity analysis (GSA) is a valuable tool to support the use of mathematical models that characterise technical or natural systems. In the field of wastewater modelling, most of the recent applications of GSA use either regression-based methods, which require close to linear relationships between the model outputs and model factors, or screening methods, which only yield qualitative results. However, due to the characteristics of membrane bioreactors (MBR) (non-linear kinetics, complexity, etc.) there is an interest to adequately quantify the effects of non-linearity and interactions. This can be achieved with variance-based sensitivity analysis methods. In this paper, the Extended Fourier Amplitude Sensitivity Testing (Extended-FAST) method is applied to an integrated activated sludge model (ASM2d) for an MBR system including microbial product formation and physical separation processes. Twenty-one model outputs located throughout the different sections of the bioreactor and 79 model factors are considered. Significant interactions among the model factors are found. Contrary to previous GSA studies for ASM models, we find the relationship between variables and factors to be non-linear and non-additive. By analysing the pattern of the variance decomposition along the plant, the model factors having the highest variance contributions were identified. This study demonstrates the usefulness of variance-based methods in membrane bioreactor modelling where, due to the presence of membranes and different operating conditions than those typically found in conventional activated sludge systems, several highly non-linear effects are present. Further, the obtained results highlight the relevant role played by the modelling approach for MBR taking into account simultaneously biological and physical processes. © 2013.
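
    For readers wanting to reproduce the mechanics of such a study, the SALib Python library provides an implementation of (extended) FAST sampling and analysis. The three-factor problem and the algebraic placeholder model below are purely illustrative stand-ins for the 79-factor integrated ASM2d/MBR model; the factor names, bounds and sample size are assumptions.

        import numpy as np
        from SALib.sample import fast_sampler
        from SALib.analyze import fast

        problem = {                                        # toy three-factor problem (illustrative only)
            "num_vars": 3,
            "names": ["mu_max", "K_S", "b"],
            "bounds": [[2.0, 8.0], [5.0, 30.0], [0.1, 0.6]],
        }

        X = fast_sampler.sample(problem, 1000)             # extended-FAST sampling design
        Y = X[:, 0] / (X[:, 1] + X[:, 0]) - 0.5 * X[:, 2]  # placeholder model output
        Si = fast.analyze(problem, Y)                      # first-order (S1) and total (ST) indices
        print(dict(zip(problem["names"], np.round(Si["S1"], 3))))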

  20. Cloud-based data-proximate visualization and analysis

    Science.gov (United States)

    Fisher, Ward

    2017-04-01

    The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much data leaves their cloud service. The solution to this problem is a deceptively simple one: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. The challenge now becomes creating tools which are cloud-ready. The solution to this challenge is provided by Application Streaming. This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has harnessed Application Streaming to provide a cloud-capable version of our visualization software, the Integrated Data Viewer (IDV). This work will examine the challenges associated with adapting the IDV to an application streaming platform, and include a brief discussion of the underlying technologies involved.

  1. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling among other disciplines. They are also characterized by coherent structure or organized motion, i.e. nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information, and hence, fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g. temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields, e.g. temperature, as well as length-scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Density Functions (CDFs), histograms, or time-series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion

  2. Extent of Endoscopic Resection for Anterior Skull Base Tumors: An MRI-Based Volumetric Analysis.

    Science.gov (United States)

    Koszewski, Ian J; Avey, Gregory; Ahmed, Azam; Leonhard, Lucas; Hoffman, Matthew R; McCulloch, Timothy M

    2017-06-01

    Objective  To determine the volume of ventral skull base tumor removed following endoscopic endonasal (EEA) resection using MRI-based volumetric analysis and to evaluate the inter-rater reliability of such analysis. Design  Retrospective case series. Setting  Academic tertiary care hospital. Participants  EEA patients November 2012 to August 2015. Main Outcome Measures  Volumetric analysis of pre- and immediately postoperative MR imaging was performed independently by two investigators. The percentage of total tumor resected was evaluated according to resection goal and tumor type. Results  A total of 39 patients underwent resection. Intraclass correlation coefficients between the raters were 0.9988 for preoperative and 0.9819 for postoperative images. Tumors (and average percentage removed) included 17 nonsecreting pituitary adenomas (95.3%), 8 secreting pituitary adenomas (86.2%), 4 meningiomas (81.6%), 3 olfactory neuroblastomas (100%), 2 craniopharyngiomas (100%), 1 large B-cell lymphoma (90.5%), 1 germ cell neoplasm (48.3%), 1 benign fibrous connective tissue mass (93.4%), 1 epidermoid cyst (68.4%), and 1 chordoma (100%). For tumors treated with intent for gross total resection, 96.9 ± 4.8% was removed. Conclusion  EEAs achieved tumor resection rates of ∼97% when total resection was attempted. The radiographic finding of residual tumor is of uncertain clinical significance. The volumetric analysis employed in this study demonstrated high inter-rater reliability and could facilitate further study.
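
    Volumetric comparisons of this kind reduce to counting segmented voxels and multiplying by the voxel volume. The sketch below assumes binary tumor masks stored as NIfTI files and uses the nibabel library; the file names are hypothetical and the workflow is a generic illustration rather than the authors' protocol.

        import numpy as np
        import nibabel as nib

        def tumor_volume_ml(mask_path):
            """Volume of a binary tumor mask in millilitres (voxel count x voxel volume)."""
            img = nib.load(mask_path)
            voxel_mm3 = np.prod(img.header.get_zooms()[:3])        # voxel size in mm^3
            n_voxels = np.count_nonzero(img.get_fdata() > 0.5)
            return n_voxels * voxel_mm3 / 1000.0                   # mm^3 -> mL

        pre = tumor_volume_ml("pre_op_mask.nii.gz")                # hypothetical file names
        post = tumor_volume_ml("post_op_mask.nii.gz")
        print(f"percent of tumor resected: {100.0 * (pre - post) / pre:.1f}%")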

  3. A Novel Multiobjective Evolutionary Algorithm Based on Regression Analysis

    Directory of Open Access Journals (Sweden)

    Zhiming Song

    2015-01-01

    Full Text Available As is known, the Pareto set of a continuous multiobjective optimization problem with m objective functions is a piecewise continuous (m-1)-dimensional manifold in the decision space under some mild conditions. However, how to utilize the regularity to design multiobjective optimization algorithms has become the research focus. In this paper, based on this regularity, a model-based multiobjective evolutionary algorithm with regression analysis (MMEA-RA) is put forward to solve continuous multiobjective optimization problems with variable linkages. In the algorithm, the optimization problem is modelled as a promising area in the decision space by a probability distribution, and the centroid of the probability distribution is an (m-1)-dimensional piecewise continuous manifold. The least squares method is used to construct such a model. A selection strategy based on the nondominated sorting is used to choose the individuals to the next generation. The new algorithm is tested and compared with NSGA-II and RM-MEDA. The result shows that MMEA-RA outperforms RM-MEDA and NSGA-II on the test instances with variable linkages. At the same time, MMEA-RA has higher efficiency than the other two algorithms. A few shortcomings of MMEA-RA have also been identified and discussed in this paper.

  4. Syndrome identification based on 2D analysis software.

    Science.gov (United States)

    Boehringer, Stefan; Vollmar, Tobias; Tasse, Christiane; Wurtz, Rolf P; Gillessen-Kaesbach, Gabriele; Horsthemke, Bernhard; Wieczorek, Dagmar

    2006-10-01

    Clinical evaluation of children with developmental delay continues to present a challenge to the clinicians. In many cases, the face provides important information to diagnose a condition. However, database support with respect to facial traits is limited at present. Computer-based analyses of 2D and 3D representations of faces have been developed, but it is unclear how well a larger number of conditions can be handled by such systems. We have therefore analysed 2D pictures of patients each being affected with one of 10 syndromes (fragile X syndrome; Cornelia de Lange syndrome; Williams-Beuren syndrome; Prader-Willi syndrome; Mucopolysaccharidosis type III; Cri-du-chat syndrome; Smith-Lemli-Opitz syndrome; Sotos syndrome; Microdeletion 22q11.2; Noonan syndrome). We can show that a classification accuracy of >75% can be achieved for a computer-based diagnosis among the 10 syndromes, which is about the same accuracy achieved for five syndromes in a previous study. Pairwise discrimination of syndromes ranges from 80 to 99%. Furthermore, we can demonstrate that the criteria used by the computer decisions match clinical observations in many cases. These findings indicate that computer-based picture analysis might be a helpful addition to existing database systems, which are meant to assist in syndrome diagnosis, especially as data acquisition is straightforward and involves off-the-shelf digital camera equipment.

  5. Aerodynamic flight evaluation analysis and data base update

    Science.gov (United States)

    Boyle, W. W.; Miller, M. S.; Wilder, G. O.; Reheuser, R. D.; Sharp, R. S.; Bridges, G. I.

    1989-01-01

    Research was conducted to determine the feasibility of replacing the Solid Rocket Boosters on the existing Space Shuttle Launch Vehicle (SSLV) with Liquid Rocket Boosters (LRB). As a part of the LRB selection process, a series of wind tunnel tests were conducted along with aero studies to determine the effects of different LRB configurations on the SSLV. Final results were tabulated into increments and added to the existing SSLV data base. The research conducted in this study was taken from a series of wind tunnel tests conducted at Marshall's 14-inch Trisonic Wind Tunnel. The effects on the axial force (CAF), normal force (CNF), pitching moment (CMF), side force (CY), wing shear force (CSR), wing torque moment (CTR), and wing bending moment (CBR) coefficients were investigated for a number of candidate LRB configurations. The aero effects due to LRB protuberances, ET/LRB separation distance, and aft skirts were also gathered from the tests. Analysis was also conducted to investigate the base pressure and plume effects due to the new booster geometries. The test results found in Phases 1 and 2 of wind tunnel testing are discussed and compared. Preliminary LRB lateral/directional data results and trends are given. The protuberance and gap/skirt effects are discussed. The base pressure/plume effects study is discussed and results are given.

  6. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  7. Reliability Analysis of Aircraft Equipment Based on FMECA Method

    Science.gov (United States)

    Jun, Li; Huibin, Xu

    It is well known that the reliability of aircraft equipment is very important, because the performance of aircraft products directly affects flight safety. In order to make the equipment work normally, FMECA is applied to an item of aircraft equipment to analyze its reliability and improve the operational reliability of the product. Through its reliability mathematical model, the average operational time is predicted by calculating the failure probabilities of all electrical components. Following the FMECA procedure of reliability theory, all failure modes, their causes, effects and criticality can be determined completely. By comparing the resulting criticality data, the paper derives the content, priorities and operating process of maintenance. The FMECA-based method performs well for reliability analysis of the equipment and its maintenance. The results indicate that applying the FMECA method can analyze reliability in detail and improve the operational reliability of the equipment, thereby supplying a theoretical basis and concrete maintenance measures to improve the operational reliability of the products. FMECA is therefore feasible and effective for improving the operational reliability of aircraft equipment.
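
    The underlying reliability arithmetic is that of a series system with constant failure rates: rates add across components and the mean time between failures is their reciprocal. The component list and failure rates below are invented for illustration; they are not data from the paper.

        import math

        failure_rates_per_hour = {        # lambda_i in failures per hour (hypothetical values)
            "power_supply": 12e-6,
            "signal_board": 8e-6,
            "connector":    2e-6,
            "relay":        5e-6,
        }

        lambda_total = sum(failure_rates_per_hour.values())    # series system: failure rates add
        mtbf_hours = 1.0 / lambda_total                         # mean time between failures
        reliability_100h = math.exp(-lambda_total * 100.0)      # P(no failure during a 100-hour flight)
        print(f"MTBF = {mtbf_hours:.0f} h, R(100 h) = {reliability_100h:.4f}")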

  8. GPU-based Integration with Application in Sensitivity Analysis

    Science.gov (United States)

    Atanassov, Emanouil; Ivanovska, Sofiya; Karaivanova, Aneta; Slavov, Dimitar

    2010-05-01

    The presented work is an important part of the grid application MCSAES (Monte Carlo Sensitivity Analysis for Environmental Studies) whose aim is to develop an efficient Grid implementation of a Monte Carlo based approach for sensitivity studies in the domains of Environmental modelling and Environmental security. The goal is to study the damaging effects that can be caused by high pollution levels (especially effects on human health), when the main modeling tool is the Danish Eulerian Model (DEM). Generally speaking, sensitivity analysis (SA) is the study of how the variation in the output of a mathematical model can be apportioned, qualitatively or quantitatively, to different sources of variation in the input of a model. One of the important classes of methods for Sensitivity Analysis is the class of Monte Carlo based methods, first proposed by Sobol, and then developed by Saltelli and his group. In MCSAES the general Saltelli procedure has been adapted for SA of the Danish Eulerian model. In our case we consider as factors the constants determining the speeds of the chemical reactions in the DEM and as output a certain aggregated measure of the pollution. Sensitivity simulations lead to huge computational tasks (systems with up to 4 × 10^9 equations at every time-step, and the number of time-steps can be more than a million) which motivates its grid implementation. The MCSAES grid implementation scheme includes two main tasks: (i) Grid implementation of the DEM, (ii) Grid implementation of the Monte Carlo integration. In this work we present our new developments in the integration part of the application. We have developed an algorithm for GPU-based generation of scrambled quasirandom sequences which can be combined with the CPU-based computations related to the SA. Owen first proposed scrambling of the Sobol sequence through permutation in a manner that improves the convergence rates. Scrambling is necessary not only for error analysis but for parallel implementations. Good scrambling is
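
    As a CPU-side point of reference for the scrambled quasirandom sequences discussed above, SciPy ships an Owen-scrambled Sobol generator; the toy four-dimensional integrand below is an assumption made purely to show the sequence in use and has nothing to do with the DEM.

        import numpy as np
        from scipy.stats import qmc

        dim = 4
        f = lambda x: np.prod(1.0 + 0.2 * (x - 0.5), axis=1)   # toy integrand on [0,1]^4, exact integral = 1

        sobol = qmc.Sobol(d=dim, scramble=True, seed=42)        # Owen-scrambled Sobol sequence
        x = sobol.random_base2(m=14)                            # 2^14 points
        print("QMC estimate:", f(x).mean())                     # should be close to 1.0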

  9. Interpretation of motion analysis of laparoscopic instruments based on principal component analysis in box trainer settings.

    Science.gov (United States)

    Oropesa, Ignacio; Escamirosa, Fernando Pérez; Sánchez-Margallo, Juan A; Enciso, Silvia; Rodríguez-Vila, Borja; Martínez, Arturo Minor; Sánchez-Margallo, Francisco M; Gómez, Enrique J; Sánchez-González, Patricia

    2018-01-18

    Motion analysis parameters (MAPs) have been extensively validated for assessment of minimally invasive surgical skills. However, there are discrepancies on how specific MAPs, tasks, and skills match with each other, reflecting that motion analysis cannot be generalized independently of the learning outcomes of a task. Additionally, there is a lack of knowledge on the meaning of motion analysis in terms of surgical skills, making it difficult to provide meaningful, didactic feedback. In this study, new higher significance MAPs (HSMAPs) are proposed, validated, and discussed for the assessment of technical skills in box trainers, based on principal component analysis (PCA). Motion analysis data were collected from 25 volunteers performing three box trainer tasks (peg grasping/PG, pattern cutting/PC, knot suturing/KS) using the EVA tracking system. PCA was applied on 10 MAPs for each task and hand. Principal components were trimmed to those accounting for an explained variance > 80% to define the HSMAPs. Individual contributions of MAPs to HSMAPs were obtained by loading analysis and varimax rotation. Construct validity of the new HSMAPs was assessed at two levels of experience based on the number of surgeries performed. Three new HSMAPs per hand were defined for the PG and PC tasks, and two per hand for the KS task. PG presented validity for HSMAPs related to insecurity and economy of space. PC showed validity for HSMAPs related to cutting efficacy, peripheral unawareness, and confidence. Finally, KS presented validity for HSMAPs related to economy of space and knotting security. PCA-defined HSMAPs can be used for technical skills' assessment. Construct validation and expert knowledge can be combined to infer how competences are acquired in box trainer tasks. These findings can be exploited to provide residents with meaningful feedback on performance. Future works will compare the new HSMAPs with valid scoring systems such as GOALS.
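
    The dimension-reduction step described above (retain principal components until more than 80% of the variance is explained, then apply a varimax rotation to the loadings) can be prototyped as follows. The random matrix stands in for the 10 motion analysis parameters per hand, and the varimax routine is a generic textbook implementation, not the authors' code.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        def varimax(Phi, gamma=1.0, max_iter=100, tol=1e-6):
            """Generic varimax rotation of a loading matrix (standard SVD formulation)."""
            p, k = Phi.shape
            R = np.eye(k)
            d = 0.0
            for _ in range(max_iter):
                L = Phi @ R
                u, s, vt = np.linalg.svd(Phi.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0))))
                R = u @ vt
                if s.sum() < d * (1 + tol):
                    break
                d = s.sum()
            return Phi @ R

        rng = np.random.default_rng(0)
        maps = rng.normal(size=(25, 10))                        # 25 executions x 10 MAPs per hand (synthetic)
        Z = StandardScaler().fit_transform(maps)
        pca = PCA(n_components=0.80, svd_solver="full").fit(Z)  # smallest set of PCs explaining > 80% variance
        loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
        print(pca.n_components_, varimax(loadings).round(2))    # rotated loadings show which MAPs drive each HSMAP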

  10. Canonical correlation analysis for gene-based pleiotropy discovery.

    Directory of Open Access Journals (Sweden)

    Jose A Seoane

    2014-10-01

    Full Text Available Genome-wide association studies have identified a wealth of genetic variants involved in complex traits and multifactorial diseases. There is now considerable interest in testing variants for association with multiple phenotypes (pleiotropy) and for testing multiple variants for association with a single phenotype (gene-based association tests). Such approaches can increase statistical power by combining evidence for association over multiple phenotypes or genetic variants respectively. Canonical Correlation Analysis (CCA) measures the correlation between two sets of multidimensional variables, and thus offers the potential to combine these two approaches. To apply CCA, we must restrict the number of attributes relative to the number of samples. Hence we consider modules of genetic variation that can comprise a gene, a pathway or another biologically relevant grouping, and/or a set of phenotypes. In order to do this, we use an attribute selection strategy based on a binary genetic algorithm. Applied to a UK-based prospective cohort study of 4286 women (the British Women's Heart and Health Study), we find improved statistical power in the detection of previously reported genetic associations, and identify a number of novel pleiotropic associations between genetic variants and phenotypes. New discoveries include gene-based association of NSF with triglyceride levels and several genes (ACSM3, ERI2, IL18RAP, IL23RAP and NRG1) with left ventricular hypertrophy phenotypes. In multiple-phenotype analyses we find association of NRG1 with left ventricular hypertrophy phenotypes, fibrinogen and urea and pleiotropic relationships of F7 and F10 with Factor VII, Factor IX and cholesterol levels.
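
    The core computation, canonical correlation between a block of genetic variants and a block of phenotypes, is available in scikit-learn; the genotype and phenotype matrices below are random placeholders with one planted association, and the binary-genetic-algorithm attribute selection used in the paper is not reproduced.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(3)
        n = 500
        G = rng.integers(0, 3, size=(n, 12)).astype(float)   # SNP dosages for one gene module (synthetic)
        P = rng.normal(size=(n, 4))                          # e.g. left ventricular phenotypes (synthetic)
        P[:, 0] += 0.4 * G[:, 2]                             # plant a weak association to recover

        cca = CCA(n_components=1).fit(G, P)
        u, v = cca.transform(G, P)
        print("first canonical correlation:", np.corrcoef(u[:, 0], v[:, 0])[0, 1])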

  11. Sensitivity Analysis of a process based erosion model using FAST

    Science.gov (United States)

    Gabelmann, Petra; Wienhöfer, Jan; Zehe, Erwin

    2015-04-01

    deposition are related to overland flow velocity using the equation of Engelund and Hansen and the sinking velocity of grain sizes, respectively. The sensitivity analysis was performed based on virtual hillslopes similar to those in the Weiherbach catchment. We applied the FAST-method (Fourier Amplitude Sensitivity Test), which provides a global sensitivity analysis with comparably few model runs. We varied model parameters in predefined and, for the Weiherbach catchment, physically meaningful parameter ranges. Those parameters included rainfall intensity, surface roughness, hillslope geometry, land use, erosion resistance, and soil hydraulic parameters. The results of this study allow guiding further modelling efforts in the Weiherbach catchment with respect to data collection and model modification.

  12. Institutional Analysis and Ecosystem-Based Management: The Institutional Analysis and Development Framework.

    Science.gov (United States)

    Imperial

    1999-11-01

    Scholars, government practitioners, and environmentalists are increasingly supportive of collaborative, ecosystem-based approaches to natural resource management. However, few researchers have focused their attention on examining the important administrative and institutional challenges surrounding ecosystem-based management. This paper describes how the institutional analysis and development (IAD) framework can be used to better understand the institutional arrangements used to implement ecosystem-based management programs. Some of the observations emanating from previous research on institutional design and performance are also discussed. The paper's central argument is that if this new resource management paradigm is to take hold and flourish, researchers and practitioners must pay closer attention to the questions surrounding institutional design and performance. This should help improve our understanding of the relationship between science and human values in decision making. It should also help researchers avoid making faulty policy recommendations and improve the implementation of ecosystem-based management programs. KEY WORDS: Ecosystem management; Watershed management; Common pool resources; Implementation; Institutional analysis; Evaluation; Policy analysis. http://link.springer-ny.com/link/service/journals/00267/bibs/24n4p449.html

  13. Visual traffic jam analysis based on trajectory data.

    Science.gov (United States)

    Wang, Zuchao; Lu, Min; Yuan, Xiaoru; Zhang, Junping; van de Wetering, Huub

    2013-12-01

    In this work, we present an interactive system for visual analysis of urban traffic congestion based on GPS trajectories. For these trajectories we develop strategies to extract and derive traffic jam information. After cleaning the trajectories, they are matched to a road network. Subsequently, traffic speed on each road segment is computed and traffic jam events are automatically detected. Spatially and temporally related events are concatenated in so-called traffic jam propagation graphs. These graphs form a high-level description of a traffic jam and its propagation in time and space. Our system provides multiple views for visually exploring and analyzing the traffic condition of a large city as a whole, on the level of propagation graphs, and on road segment level. Case studies with 24 days of taxi GPS trajectories collected in Beijing demonstrate the effectiveness of our system.
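
    The propagation-graph construction can be sketched as follows: flag a jam event whenever the computed speed on a segment falls below a threshold, then link events that are adjacent in both space and time. The toy speeds, road topology and 10 km/h threshold below are invented for illustration and are not the system's actual parameters (networkx is assumed as graph tooling).

        import networkx as nx

        # speeds[(segment, t)] = average speed in km/h; adjacency maps each segment to its neighbours.
        speeds = {("s1", 0): 9, ("s1", 1): 8, ("s2", 1): 7, ("s2", 2): 25, ("s3", 2): 6}
        adjacency = {"s1": {"s2"}, "s2": {"s1", "s3"}, "s3": {"s2"}}
        JAM_SPEED = 10                                       # assumed congestion threshold, km/h

        events = {(seg, t) for (seg, t), v in speeds.items() if v < JAM_SPEED}

        G = nx.DiGraph()
        G.add_nodes_from(events)
        for seg, t in events:
            for nbr in adjacency[seg] | {seg}:               # same segment or a neighbouring one...
                if (nbr, t + 1) in events:                   # ...still jammed in the next time slot
                    G.add_edge((seg, t), (nbr, t + 1))       # edge = jam propagation

        print(list(nx.weakly_connected_components(G)))       # each component is one propagation graph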

  14. Analysis of valve failures from the NUCLARR data base

    International Nuclear Information System (INIS)

    Moore, L.M.

    1997-11-01

    The Nuclear Computerized Library for Assessing Reactor Reliability (NUCLARR) contains data on component failures with categorical and qualifying information such as component design, normal operating state, system application and safety grade information which is important to the development of risk-based component surveillance testing requirements. This report presents descriptions and results of analyses of valve component failure data and covariate information available in the document Nuclear Computerized Library for Assessing Reactor Reliability Data Manual, Part 3: Hardware Component Failure Data (NUCLARR Data Manual). Although there are substantial records on valve performance, there are many categories of the corresponding descriptors and qualifying information for which specific values are missing. Consequently, this limits the data available for analysis of covariate effects. This report presents cross tabulations by different covariate categories and limited modeling of covariate effects for data subsets with substantive non-missing covariate information

  15. Semantic analysis based forms information retrieval and classification

    Science.gov (United States)

    Saba, Tanzila; Alqahtani, Fatimah Ayidh

    2013-09-01

    Data entry forms are employed in all types of enterprises to collect information from hundreds of customers on a daily basis. The information is filled in manually by the customers, so it is laborious and time-consuming for human operators to transfer this customer information into computers manually; it is also expensive, and human errors might cause serious flaws. The automatic interpretation of scanned forms has facilitated many real applications, from the point of view of speed and accuracy, such as keyword spotting, sorting of postal addresses, script matching and writer identification. This research deals with different strategies to extract customer information from these scanned forms and to interpret and classify it. Accordingly, the extracted information is segmented into characters for classification and finally stored as records in databases for further processing. This paper presents a detailed discussion of these semantics-based analysis strategies for forms processing. Finally, new directions are also recommended for future research. [Figure not available: see fulltext.]

  16. Chromatin Conformation Capture-Based Analysis of Nuclear Architecture.

    Science.gov (United States)

    Grob, Stefan; Grossniklaus, Ueli

    2017-01-01

    Nuclear organization and higher-order chromosome structure in interphase nuclei are thought to have important effects on fundamental biological processes, including chromosome condensation, replication, and transcription. Until recently, however, nuclear organization could only be analyzed microscopically. The development of chromatin conformation capture (3C)-based techniques now allows a detailed look at chromosomal architecture from the level of individual loci to the entire genome. Here we provide a robust Hi-C protocol, allowing the analysis of nuclear organization in nuclei from different wild-type and mutant plant tissues. This method is quantitative and provides a highly efficient and comprehensive way to study chromatin organization during plant development, in response to different environmental stimuli, and in mutants disrupting a variety of processes, including epigenetic pathways regulating gene expression.

  17. Web Pages Content Analysis Using Browser-Based Volunteer Computing

    Directory of Open Access Journals (Sweden)

    Wojciech Turek

    2013-01-01

    Full Text Available Existing solutions to the problem of finding valuable information on the Web suffer from several limitations, such as simplified query languages, out-of-date information or arbitrary result sorting. In this paper a different approach to this problem is described. It is based on the idea of distributed processing of Web page content. To provide sufficient performance, the idea of browser-based volunteer computing is utilized, which requires the implementation of text processing algorithms in JavaScript. In this paper the architecture of a Web page content analysis system is presented, details concerning the implementation of the system and the text processing algorithms are described, and test results are provided.

  18. SILAC-based comparative analysis of pathogenic Escherichia coli secretomes

    DEFF Research Database (Denmark)

    Boysen, Anders; Borch, Jonas; Krogh, Thøger Jensen

    2015-01-01

    proteome analysis have the potential to discover both classes of proteins and hence form an important tool for discovering therapeutic targets. Adherent-invasive Escherichia coli (AIEC) and Enterotoxigenic E. coli (ETEC) are pathogenic variants of E. coli which cause intestinal disease in humans. AIEC......-term protection are still needed. In order to identify proteins with therapeutic potential, we have used mass spectrometry-based Stable Isotope Labeling with Amino acids in Cell culture (SILAC) quantitative proteomics method which allows us to compare the proteomes of pathogenic strains to commensal E. coli....... In this study, we grew the pathogenic strains ETEC H10407, AIEC LF82 and the non-pathogenic reference strain E. coli K-12 MG1655 in parallel and used SILAC to compare protein levels in OMVs and culture supernatant. We have identified well-known virulence factors from both AIEC and ETEC, thus validating our...

  19. Image Registration Algorithm Based on Parallax Constraint and Clustering Analysis

    Science.gov (United States)

    Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song

    2018-01-01

    To resolve the problem of slow computation speed and low matching accuracy in image registration, a new image registration algorithm based on parallax constraint and clustering analysis is proposed. Firstly, the Harris corner detection algorithm is used to extract the feature points of the two images. Secondly, the Normalized Cross Correlation (NCC) function is used to perform approximate matching of the feature points, yielding the initial feature pairs. Then, according to the parallax constraint condition, the initial feature pairs are preprocessed by the K-means clustering algorithm, which removes feature point pairs with obvious errors from the approximate matching step. Finally, the Random Sample Consensus (RANSAC) algorithm is adopted to optimize the feature points and obtain the final matching result, realizing fast and accurate image registration. The experimental results show that the proposed image registration algorithm can improve the accuracy of the image matching while ensuring the real-time performance of the algorithm.
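
    A compressed sketch of the pipeline using OpenCV and NumPy is given below. Patch sizes, corner counts and the NCC threshold are assumptions, and the K-means parallax-constraint filtering step is omitted for brevity, so this is a simplified illustration of the Harris/NCC/RANSAC chain rather than the proposed algorithm itself.

        import cv2
        import numpy as np

        def harris_points(gray, max_pts=200):
            """Strongest Harris corner locations as (x, y) coordinates."""
            resp = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
            idx = np.argsort(resp.ravel())[::-1][:max_pts]
            ys, xs = np.unravel_index(idx, resp.shape)
            return list(zip(xs.tolist(), ys.tolist()))

        def ncc(a, b):
            """Normalized cross-correlation of two equally sized patches."""
            a = a - a.mean(); b = b - b.mean()
            return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

        def register(img1, img2, half=8, min_ncc=0.9):
            g1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
            g2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)
            pts2 = harris_points(g2)
            pairs = []
            for x1, y1 in harris_points(g1):
                p1 = g1[y1 - half:y1 + half, x1 - half:x1 + half].astype(float)
                if p1.shape != (2 * half, 2 * half):
                    continue                                 # corner too close to the image border
                scored = []
                for x2, y2 in pts2:
                    p2 = g2[y2 - half:y2 + half, x2 - half:x2 + half].astype(float)
                    if p2.shape == (2 * half, 2 * half):
                        scored.append((ncc(p1, p2), (x2, y2)))
                if scored:
                    best, pt = max(scored)
                    if best > min_ncc:                       # keep only confident approximate matches
                        pairs.append(((x1, y1), pt))
            src = np.float32([p for p, _ in pairs])
            dst = np.float32([q for _, q in pairs])
            # RANSAC-refined homography; assumes at least four matches survive the NCC threshold.
            H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
            return H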

  20. Multivariate analysis of eigenvalues and eigenvectors in tensor based morphometry

    Science.gov (United States)

    Rajagopalan, Vidya; Schwartzman, Armin; Hua, Xue; Leow, Alex; Thompson, Paul; Lepore, Natasha

    2015-01-01

    We develop a new algorithm to compute voxel-wise shape differences in tensor-based morphometry (TBM). As in standard TBM, we non-linearly register brain T1-weighted MRI data from a patient and control group to a template, and compute the Jacobian of the deformation fields. In standard TBM, the determinants of the Jacobian matrix at each voxel are statistically compared between the two groups. More recently, a multivariate extension of the statistical analysis involving the deformation tensors derived from the Jacobian matrices has been shown to improve statistical detection power [7]. However, multivariate methods comprising large numbers of variables are computationally intensive and may be subject to noise. In addition, the anatomical interpretation of results is sometimes difficult. Here instead, we analyze the eigenvalues and the eigenvectors of the Jacobian matrices. Our method is validated on brain MRI data from Alzheimer's patients and healthy elderly controls from the Alzheimer's Disease Neuro Imaging Database.
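
    The per-voxel eigen-decomposition itself is a few lines of NumPy: form the deformation tensor from each Jacobian and take its eigenvalues and eigenvectors. The random near-identity Jacobian field below is a stand-in for the registered MRI data, and the derived quantities are generic TBM-style features rather than the exact statistics used in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        J = np.eye(3) + rng.normal(0.0, 0.05, size=(4, 4, 4, 3, 3))   # toy 4x4x4 field of 3x3 Jacobians

        # Deformation tensors J^T J are symmetric, so eigh applies; their square-rooted eigenvalues
        # are the local principal stretches, and the log of their product equals log |det J|.
        S2 = np.einsum("...ji,...jk->...ik", J, J)       # J^T J at every voxel
        eigvals, eigvecs = np.linalg.eigh(S2)            # ascending eigenvalues, orthonormal eigenvectors
        stretches = np.sqrt(eigvals)                     # principal stretches per voxel
        log_abs_det = np.log(np.prod(stretches, axis=-1))
        print(stretches.shape, log_abs_det.shape)        # (4, 4, 4, 3) and (4, 4, 4)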

  1. Architecture Analysis of an FPGA-Based Hopfield Neural Network

    Directory of Open Access Journals (Sweden)

    Miguel Angelo de Abreu de Sousa

    2014-01-01

    Full Text Available Interconnections between electronic circuits and neural computation have been a strongly researched topic in the machine learning field in order to approach several practical requirements, including decreasing training and operation times in high performance applications and reducing cost, size, and energy consumption for autonomous or embedded developments. Field programmable gate array (FPGA) hardware shows some inherent features typically associated with neural networks, such as parallel processing, modular executions, and dynamic adaptation, and works on different types of FPGA-based neural networks have been presented in recent years. This paper aims to address different aspects of architectural characteristics analysis on a Hopfield Neural Network implemented in FPGA, such as maximum operating frequency and chip-area occupancy according to the network capacity. Also, the FPGA implementation methodology, which does not employ multipliers in the architecture developed for the Hopfield neural model, is presented in detail.
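
    For readers unfamiliar with the neural model that the FPGA architecture parallelizes, a plain software reference of the Hopfield network (Hebbian storage plus asynchronous sign updates) is sketched below; the pattern length and noise level are arbitrary, and this is not the multiplier-free hardware formulation used in the paper.

        import numpy as np

        def train_hopfield(patterns):
            """Hebbian weight matrix for bipolar (+1/-1) patterns, zero diagonal."""
            P = np.asarray(patterns, dtype=float)
            W = P.T @ P / P.shape[1]
            np.fill_diagonal(W, 0.0)
            return W

        def recall(W, state, sweeps=5, seed=0):
            """Asynchronous sign updates over a few sweeps of all neurons."""
            rng = np.random.default_rng(seed)
            s = np.array(state, dtype=float)
            for _ in range(sweeps):
                for i in rng.permutation(len(s)):
                    s[i] = 1.0 if W[i] @ s >= 0 else -1.0
            return s

        pattern = np.array([1, -1, 1, 1, -1, -1, 1, -1])
        W = train_hopfield([pattern])
        noisy = pattern.astype(float).copy()
        noisy[:2] *= -1                                  # corrupt two bits
        print(recall(W, noisy) == pattern)               # the stored pattern is recovered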

  2. Component-based analysis of embedded control applications

    DEFF Research Database (Denmark)

    Angelov, Christo K.; Guan, Wei; Marian, Nicolae

    2011-01-01

    The widespread use of embedded systems requires the creation of industrial software technology that will make it possible to engineer systems being correct by construction. That can be achieved through the use of validated (trusted) components, verification of design models, and automatic...... instances of reusable, executable components—function blocks (FBs). System actors operate in accordance with a timed multitasking model of computation, whereby I/O signals are exchanged with the controlled plant at precisely specified time instants, resulting in the elimination of I/O jitter. The paper...... a feasible (light-weight) analysis method based on runtime observers. The latter are conceived as special-purpose actors running in parallel with the application actors, while checking system properties specified in Linear Temporal Logic. Observers are configured from reusable FBs that can be exported...

  3. Maintenance management of railway infrastructures based on reliability analysis

    International Nuclear Information System (INIS)

    Macchi, Marco; Garetti, Marco; Centrone, Domenico; Fumagalli, Luca; Piero Pavirani, Gian

    2012-01-01

    Railway infrastructure maintenance plays a crucial role for rail transport. It aims at guaranteeing safety of operations and availability of railway tracks and related equipment for traffic regulation. Moreover, it is one major cost for rail transport operations. Thus, the increased competition in traffic market is asking for maintenance improvement, aiming at the reduction of maintenance expenditures while keeping the safety of operations. This issue is addressed by the methodology presented in the paper. The first step of the methodology consists of a family-based approach for the equipment reliability analysis; its purpose is the identification of families of railway items which can be given the same reliability targets. The second step builds the reliability model of the railway system for identifying the most critical items, given a required service level for the transportation system. The two methods have been implemented and tested in practical case studies, in the context of Rete Ferroviaria Italiana, the Italian public limited company for railway transportation.

  4. Structural Optimization based on the Concept of First Order Analysis

    International Nuclear Information System (INIS)

    Shinji, Nishiwaki; Hidekazu, Nishigaki; Yasuaki, Tsurumi; Yoshio, Kojima; Noboru, Kikuchi

    2002-01-01

    Computer Aided Engineering (CAE) has been successfully utilized in mechanical industries such as the automotive industry. It is, however, difficult for most mechanical design engineers to directly use CAE due to the sophisticated nature of the operations involved. In order to mitigate this problem, a new type of CAE, First Order Analysis (FOA), has been proposed. This paper presents the outcome of research concerning the development of a structural topology optimization methodology within FOA. This optimization method is constructed based on discrete and function-oriented elements such as beam and panel elements, and sequential convex programming. In addition, examples are provided to show the utility of the methodology presented here for mechanical design engineers.

  5. Road Network Vulnerability Analysis Based on Improved Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yunpeng Wang

    2014-01-01

    Full Text Available We present an improved ant colony algorithm-based approach to assess the vulnerability of a road network and identify the critical infrastructures. This approach improves computational efficiency and allows for its applications in large-scale road networks. This research involves defining the concept of vulnerability, modeling the traffic utility index and the vulnerability of the road network, and identifying the critical infrastructures of the road network. We apply the approach to a simple test road network and a real road network to verify the methodology. The results show that vulnerability is directly related to traffic demand and increases significantly when the demand approaches capacity. The proposed approach reduces the computational burden and may be applied in large-scale road network analysis. It can be used as a decision-supporting tool for identifying critical infrastructures in transportation planning and management.

  6. Viscoelastic Plate Analysis Based on Gâteaux Differential

    Directory of Open Access Journals (Sweden)

    Kadıoğlu Fethi

    2016-01-01

    Full Text Available In this study, it is aimed to analyze the quasi-static response of viscoelastic Kirchhoff plates with a mixed finite element formulation based on the Gâteaux differential. Although the static response of elastic plate, beam and shell structures is a widely studied topic, there are few studies in the literature pertaining to the analysis of viscoelastic structural elements, especially with complex geometries, loading conditions and constitutive relations. The developed mixed finite element model in the transformed Laplace-Carson space has four unknowns, the displacement and the bending and twisting moments, in addition to the dynamic and geometric boundary condition terms. A four-parameter solid model is employed for modelling the viscoelastic behaviour. For transformation of the solutions obtained in the Laplace-Carson domain to the time domain, different numerical inverse transform techniques are employed. The developed solution technique is applied to several quasi-static example problems for the verification of the suggested numerical procedure.

  7. Choosing a Commercial Broiler Strain Based on Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Hosseini SA

    2014-05-01

    Full Text Available With the complexity and amount of information in a wide variety of comparative performance reports in poultry production, making a decision is difficult. This problem can be overcome only when all data are put into a common unit. For this purpose, five different decision-making approaches, including Maximin, Equally likely, Weighted average, Ordered weighted averages, and the Technique for order preference by similarity to ideal solution, were used to choose the best broiler strain among three candidates based on their comparative performance and carcass characteristics. A total of 6000 commercial broilers of the strains designated R, A, and C (2000 birds per strain) were randomly allocated into three treatments of five replicates. In this study, all methods showed similar results except the Maximin approach. Comparing the different methods indicated that strain C, which holds the highest world market share, has the best performance, followed by strains R and A.

  8. Analysis of Environmental Law Enforcement Mechanism Based on Economic Principle

    Science.gov (United States)

    Cao, Hongjun; Shao, Haohao; Cai, Xuesen

    2017-11-01

    Strengthening and improving the environmental law enforcement mechanism is an important way to protect the ecological environment. Based on economic principles, this paper analyzes how the marginal management costs (in the Pigouvian sense) and the marginal transaction costs (in the Coasean sense) vary with the growing number of pollutant-discharging enterprises. From this analysis we draw the following conclusions. In the process of strengthening the environmental law enforcement mechanism, we should first fully mobilize all parties involved in environmental law enforcement, such as legislative bodies and law enforcement agencies, public welfare organizations, television, newspapers, enterprises, and the public, so that they form a reasonable and organic structural system; we should then use various management means, such as government regulation, legal sanctions, fines, persuasion, and public censure, which also need to form an organic structural system.

  9. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  10. Analysis of communication based distributed control of MMC for HVDC

    DEFF Research Database (Denmark)

    Huang, Shaojun; Teodorescu, Remus; Mathe, Laszlo

    2013-01-01

    for high power and high voltage application is a very challenging task. For the reason that distributed control architecture could maintain the modularity of the MMC, this control architecture will be investigated and a distributed control system dedicated for MMC will be proposed in this paper....... The suitable communication technologies, modulation and control techniques for the proposed distributed control system are discussed and compared. Based on the frequency domain modeling and analysis of the distributed control system, the controllers of the different control loops are designed by analytical...... methods and Matlab tools. Finally, sensitiveness of the distributed control system to modulation effect (phase-shifted PWM), communication delay, individual carrier frequency and sampling frequency is studied through simulations that are made in Matlab Simulink and PLECS....

  11. Detection and correction of laser induced breakdown spectroscopy spectral background based on spline interpolation method

    Science.gov (United States)

    Tan, Bing; Huang, Min; Zhu, Qibing; Guo, Ya; Qin, Jianwei

    2017-12-01

    Laser-induced breakdown spectroscopy (LIBS) is an analytical technique that has gained increasing attention because of its many applications. The production of a continuous background in LIBS is inevitable because of factors associated with laser energy, gate width, time delay, and the experimental environment. The continuous background significantly influences the analysis of the spectrum. Researchers have proposed several background correction methods, such as polynomial fitting, Lorentz fitting and model-free methods, but few of these have been applied in LIBS, particularly for qualitative and quantitative analyses. This study proposes a method based on spline interpolation for detecting and estimating the continuous background spectrum according to its smoothness. Simulated background-correction experiments indicated that the spline interpolation method achieved the largest signal-to-background ratio (SBR), ahead of polynomial fitting, Lorentz fitting and the model-free method. All of these background correction methods yield larger SBR values than those obtained before background correction (the SBR value before background correction is 10.0992, whereas the SBR values after background correction by spline interpolation, polynomial fitting, Lorentz fitting, and the model-free method are 26.9576, 24.6828, 18.9770, and 25.6273, respectively). After adding random noise with different signal-to-noise ratios to the spectrum, the spline interpolation method still acquires a large SBR value, whereas polynomial fitting and the model-free method obtain low SBR values. All of the background correction methods exhibit improved quantitative results for Cu compared with those obtained before background correction (the linear correlation coefficient value before background correction is 0.9776; moreover, the linear correlation coefficient values after background correction using spline interpolation, polynomial fitting, Lorentz
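
    The core step described in this record, estimating a smooth continuous background and then computing the signal-to-background ratio (SBR), can be illustrated with a minimal sketch. This is not the authors' code: the anchor-point selection (local minima over fixed windows), the window size, and the synthetic spectrum below are assumptions made for illustration only.

```python
# Hedged sketch of spline-based background estimation for a LIBS-like spectrum,
# assuming the background varies smoothly between emission lines.
import numpy as np
from scipy.interpolate import CubicSpline

def estimate_background(wavelengths, intensities, window=50):
    """Fit a cubic spline through local minima of fixed-size windows,
    treating those minima as background anchor points."""
    anchors_x, anchors_y = [], []
    for start in range(0, len(intensities), window):
        seg = slice(start, min(start + window, len(intensities)))
        i_min = start + int(np.argmin(intensities[seg]))
        anchors_x.append(wavelengths[i_min])
        anchors_y.append(intensities[i_min])
    return CubicSpline(anchors_x, anchors_y)(wavelengths)

def signal_to_background_ratio(intensities, background, peak_idx):
    """SBR of one emission line after background subtraction."""
    return (intensities[peak_idx] - background[peak_idx]) / background[peak_idx]

# synthetic spectrum: sloped background plus one Gaussian emission line
wl = np.linspace(400.0, 800.0, 2000)
spec = 50.0 + 0.05 * (wl - 400.0) + 800.0 * np.exp(-0.5 * ((wl - 521.8) / 0.3) ** 2)
bg = estimate_background(wl, spec)
print(signal_to_background_ratio(spec, bg, int(np.argmax(spec))))
```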

  12. Graph-based iterative Group Analysis enhances microarray interpretation

    Directory of Open Access Journals (Sweden)

    Amtmann Anna

    2004-07-01

    Full Text Available Abstract. Background: One of the most time-consuming tasks after performing a gene expression experiment is the biological interpretation of the results by identifying physiologically important associations between the differentially expressed genes. A large part of the relevant functional evidence can be represented in the form of graphs, e.g. metabolic and signaling pathways, protein interaction maps, shared GeneOntology annotations, or literature co-citation relations. Such graphs are easily constructed from available genome annotation data. The problem of biological interpretation can then be described as identifying the subgraphs showing the most significant patterns of gene expression. We applied a graph-based extension of our iterative Group Analysis (iGA) approach to obtain a statistically rigorous identification of the subgraphs of interest in any evidence graph. Results: We validated the Graph-based iterative Group Analysis (GiGA) by applying it to the classic yeast diauxic shift experiment of DeRisi et al., using GeneOntology and metabolic network information. GiGA reliably identified and summarized all the biological processes discussed in the original publication. Visualization of the detected subgraphs allowed the convenient exploration of the results. The method also identified several processes that were not presented in the original paper but are of obvious relevance to the yeast starvation response. Conclusions: GiGA provides a fast and flexible delimitation of the most interesting areas in a microarray experiment, and leads to a considerable speed-up and improvement of the interpretation process.

  13. Skull base chordomas: analysis of dose-response characteristics

    International Nuclear Information System (INIS)

    Niemierko, Andrzej; Terahara, Atsuro; Goitein, Michael

    1997-01-01

    Objective: To extract dose-response characteristics from dose-volume histograms and corresponding actuarial survival statistics for 115 patients with skull base chordomas. Materials and Methods: We analyzed data for 115 patients with skull base chordoma treated with combined photon and proton conformal radiotherapy to doses in the range 66.6Gy - 79.2Gy. Data set for each patient included gender, histology, age, tumor volume, prescribed dose, overall treatment time, time to recurrence or time to last observation, target dose-volume histogram, and several dosimetric parameters (minimum/mean/median/maximum target dose, percent of the target volume receiving the prescribed dose, dose to 90% of the target volume, and the Equivalent Uniform Dose (EUD). Data were analyzed using the Kaplan-Meier survivor function estimate, the proportional hazards (Cox) model, and parametric modeling of the actuarial probability of recurrence. Parameters of dose-response characteristics were obtained using the maximum likelihood method. Results: Local failure developed in 42 (36%) of patients, with actuarial local control rates at 5 years of 59.2%. The proportional hazards model revealed significant dependence of gender on the probability of recurrence, with female patients having significantly poorer prognosis (hazard ratio of 2.3 with the p value of 0.008). The Wilcoxon and the log-rank tests of the corresponding Kaplan-Meier recurrence-free survival curves confirmed statistical significance of this effect. The Cox model with stratification by gender showed significance of tumor volume (p=0.01), the minimum target dose (p=0.02), and the EUD (p=0.02). Other parameters were not significant at the α level of significance of 0.05, including the prescribed dose (p=0.21). Parametric analysis using a combined model of tumor control probability (to account for non-uniformity of target dose distribution) and the Weibull failure time model (to account for censoring) allowed us to estimate
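
    One of the dosimetric parameters listed in this record, the Equivalent Uniform Dose (EUD), has a compact closed form in its generalized formulation. The sketch below only illustrates that formula on a toy dose-volume histogram; the exponent value and the example DVH are assumptions, not data from the study.

```python
# Hedged sketch: generalized Equivalent Uniform Dose (gEUD) computed from a
# differential dose-volume histogram, gEUD = (sum_i v_i * d_i^a)^(1/a).
import numpy as np

def generalized_eud(doses_gy, volume_fractions, a):
    """doses_gy: bin doses; volume_fractions: fractional volumes (normalized here);
    a: tissue-specific exponent (negative values are typically used for tumors)."""
    d = np.asarray(doses_gy, dtype=float)
    v = np.asarray(volume_fractions, dtype=float)
    v = v / v.sum()
    return float(np.sum(v * d ** a) ** (1.0 / a))

# toy DVH: 70% of the target at 72 Gy, 20% at 68 Gy, 10% at 60 Gy
print(generalized_eud([72.0, 68.0, 60.0], [0.7, 0.2, 0.1], a=-10.0))
```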

  14. A Power System Network Splitting Strategy Based on Contingency Analysis

    Directory of Open Access Journals (Sweden)

    Nur Zawani Saharuddin

    2018-02-01

    Full Text Available This paper proposes a network splitting strategy following critical line outages based on N-1 contingency analysis. Network splitting is the best option for certain critical outages when the tendency of severe cascading failures is very high. Network splitting is executed by splitting the power system network into a feasible set of islands. Thus, it is essential to identify the optimal splitting solution (in terms of minimal power flow disruption) that satisfies certain constraints. This paper determines the optimal splitting solution for each critical line outage using a discrete evolutionary programming (DEP) optimization technique assisted by a heuristic initialization approach. Heuristic initialization provides the best initial cutsets, which guide the optimization technique to find the optimal splitting solution. Generation–load balance and transmission line overloading analyses are carried out in each island to ensure that steady state stability is achieved. A load shedding scheme is initiated if the power balance criterion is violated in any island, to sustain the generation–load balance. The proposed technique is validated on the IEEE 118 bus system. Results show that the proposed approach produces an optimal splitting solution with lower power flow disruption during network splitting execution.

  15. Validation test case generation based on safety analysis ontology

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Wang, Wen-Shing

    2012-01-01

    Highlights: ► Current practice in validation test case generation for nuclear system is mainly ad hoc. ► This study designs a systematic approach to generate validation test cases from a Safety Analysis Report. ► It is based on a domain-specific ontology. ► Test coverage criteria have been defined and satisfied. ► A computerized toolset has been implemented to assist the proposed approach. - Abstract: Validation tests in the current nuclear industry practice are typically performed in an ad hoc fashion. This study presents a systematic and objective method of generating validation test cases from a Safety Analysis Report (SAR). A domain-specific ontology was designed and used to mark up a SAR; relevant information was then extracted from the marked-up document for use in automatically generating validation test cases that satisfy the proposed test coverage criteria; namely, single parameter coverage, use case coverage, abnormal condition coverage, and scenario coverage. The novelty of this technique is its systematic rather than ad hoc test case generation from a SAR to achieve high test coverage.

  16. Cross-covariance based global dynamic sensitivity analysis

    Science.gov (United States)

    Shi, Yan; Lu, Zhenzhou; Li, Zhao; Wu, Mengmeng

    2018-02-01

    To identify the cross-covariance source of the dynamic output at each time instant for structural systems involving both input random variables and stochastic processes, a global dynamic sensitivity (GDS) technique is proposed. The GDS considers the effect of time-history inputs on the dynamic output. In the GDS, the cross-covariance decomposition is first developed to measure the contribution of the inputs to the output at different time instants, and an integration of the cross-covariance change over a specific time interval is employed to measure the whole contribution of the input to the cross-covariance of the output. Then, the GDS main effect indices and the GDS total effect indices can be easily defined after the integration, and they are effective in identifying the important inputs and the non-influential inputs on the cross-covariance of the output at each time instant, respectively. The established GDS analysis model has the same form as the classical ANOVA when it degenerates to the static case. After degeneration, the first-order partial effect reflects the individual effects of the inputs on the output variance, and the second-order partial effect reflects their interaction effects on the output variance, which illustrates the consistency of the proposed GDS indices with the classical variance-based sensitivity indices. An MCS procedure and the Kriging surrogate method are developed to compute the proposed GDS indices. Several examples are introduced to illustrate the significance of the proposed GDS analysis technique and the effectiveness of the proposed solution.

  17. Gis-Based Surface Analysis of Archaeological Finds

    Science.gov (United States)

    Kovács, K.; Hanke, K.; Moser, M.

    2011-09-01

    The international research project HiMAT (History of Mining Activities in the Tyrol and adjacent areas) is dedicated to the study of mining history in the Eastern Alps by various scientific disciplines. The aim of this program is the analysis of the mining activities' impacts on environment and human societies. Unfortunately, there is only a limited number of specific regions (e.g. Mitterberg) to offer possibilities to investigate the former mining expansions. Within this multidisciplinary project, the archaeological sites and finds are analyzed by the Surveying and Geoinformation Unit at the University of Innsbruck. This paper shows data fusion of different surveying and post-processing methods to achieve a photo-realistic digital 3D model of one of these most important finds, the Bronze Age sluice box from the Mitterberg. The applied workflow consists of four steps: 1. Point cloud processing, 2. Meshing of the point clouds and editing of the models, 3. Image orientation, bundle and image adjustment, 4. Model texturing. In addition, a short range laser scanning survey was organized before the conservation process of this wooden find. More accurate research opportunities were offered after this detailed documentation of the sluice box, for example the reconstruction of the broken parts and the surface analysis of this archaeological object were implemented using these high-resolution datasets. In conclusion, various unperceived patterns of the wooden boards were visualized by the GIS-based tool marks investigation.

  18. GIS-BASED SURFACE ANALYSIS OF ARCHAEOLOGICAL FINDS

    Directory of Open Access Journals (Sweden)

    K. Kovács

    2012-09-01

    Full Text Available The international research project HiMAT (History of Mining Activities in the Tyrol and adjacent areas) is dedicated to the study of mining history in the Eastern Alps by various scientific disciplines. The aim of this program is the analysis of the mining activities’ impacts on environment and human societies. Unfortunately, there is only a limited number of specific regions (e.g. Mitterberg) to offer possibilities to investigate the former mining expansions. Within this multidisciplinary project, the archaeological sites and finds are analyzed by the Surveying and Geoinformation Unit at the University of Innsbruck. This paper shows data fusion of different surveying and post-processing methods to achieve a photo-realistic digital 3D model of one of these most important finds, the Bronze Age sluice box from the Mitterberg. The applied workflow consists of four steps: 1. Point cloud processing, 2. Meshing of the point clouds and editing of the models, 3. Image orientation, bundle and image adjustment, 4. Model texturing. In addition, a short range laser scanning survey was organized before the conservation process of this wooden find. More accurate research opportunities were offered after this detailed documentation of the sluice box, for example the reconstruction of the broken parts and the surface analysis of this archaeological object were implemented using these high-resolution datasets. In conclusion, various unperceived patterns of the wooden boards were visualized by the GIS-based tool marks investigation.

  19. Phylogenetic relationships of Malassezia species based on multilocus sequence analysis.

    Science.gov (United States)

    Castellá, Gemma; Coutinho, Selene Dall' Acqua; Cabañes, F Javier

    2014-01-01

    Members of the genus Malassezia are lipophilic basidiomycetous yeasts, which are part of the normal cutaneous microbiota of humans and other warm-blooded animals. Currently, this genus consists of 14 species that have been characterized by phenetic and molecular methods. Although several molecular methods have been used to identify and/or differentiate Malassezia species, the sequencing of the rRNA genes and the chitin synthase-2 gene (CHS2) are the most widely employed. There is little information about the β-tubulin gene in the genus Malassezia, a gene that has been used for the analysis of complex species groups. The aim of the present study was to sequence a fragment of the β-tubulin gene of Malassezia species and analyze their phylogenetic relationship using a multilocus sequence approach based on two rRNA genes (ITS including 5.8S rRNA and D1/D2 region of 26S rRNA) together with two protein encoding genes (CHS2 and β-tubulin). The phylogenetic study of the partial β-tubulin gene sequences indicated that this molecular marker can be used to assess diversity and identify new species. The multilocus sequence analysis of the four loci provides robust support to delineate species at the terminal nodes and could help to estimate divergence times for the origin and diversification of Malassezia species.

  20. Quantifying neurotransmission reliability through metrics-based information analysis.

    Science.gov (United States)

    Brasselet, Romain; Johansson, Roland S; Arleo, Angelo

    2011-04-01

    We set forth an information-theoretical measure to quantify neurotransmission reliability while taking into full account the metrical properties of the spike train space. This parametric information analysis relies on similarity measures induced by the metrical relations between neural responses as spikes flow in. Thus, in order to assess the entropy, the conditional entropy, and the overall information transfer, this method does not require any a priori decoding algorithm to partition the space into equivalence classes. It therefore allows the optimal parameters of a class of distances to be determined with respect to information transmission. To validate the proposed information-theoretical approach, we study precise temporal decoding of human somatosensory signals recorded using microneurography experiments. For this analysis, we employ a similarity measure based on the Victor-Purpura spike train metrics. We show that with appropriate parameters of this distance, the relative spike times of the mechanoreceptors' responses convey enough information to perform optimal discrimination--defined as maximum metrical information and zero conditional entropy--of 81 distinct stimuli within 40 ms of the first afferent spike. The proposed information-theoretical measure proves to be a suitable generalization of Shannon mutual information in order to consider the metrics of temporal codes explicitly. It allows neurotransmission reliability to be assessed in the presence of large spike train spaces (e.g., neural population codes) with high temporal precision.
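
    The similarity measure named in this record, the Victor-Purpura spike train metric, can be computed with a short dynamic program. The sketch below is an illustration under assumed units and parameter values (spike times in seconds, cost parameter q in 1/s); it is not the authors' analysis code.

```python
# Hedged sketch of the Victor-Purpura spike-train distance: transforming one
# train into the other, where inserting or deleting a spike costs 1 and moving
# a spike by dt costs q*|dt|.
import numpy as np

def victor_purpura_distance(train_a, train_b, q):
    na, nb = len(train_a), len(train_b)
    g = np.zeros((na + 1, nb + 1))
    g[:, 0] = np.arange(na + 1)   # delete all remaining spikes of train_a
    g[0, :] = np.arange(nb + 1)   # insert all remaining spikes of train_b
    for i in range(1, na + 1):
        for j in range(1, nb + 1):
            shift = q * abs(train_a[i - 1] - train_b[j - 1])
            g[i, j] = min(g[i - 1, j] + 1, g[i, j - 1] + 1, g[i - 1, j - 1] + shift)
    return g[na, nb]

# two spike trains (seconds); q = 100 /s corresponds to ~10 ms temporal precision
print(victor_purpura_distance([0.010, 0.025, 0.040], [0.012, 0.041], q=100.0))
```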

  1. Spectral decomposition of asteroid Itokawa based on principal component analysis

    Science.gov (United States)

    Koga, Sumire C.; Sugita, Seiji; Kamata, Shunichi; Ishiguro, Masateru; Hiroi, Takahiro; Tatsumi, Eri; Sasaki, Sho

    2018-01-01

    The heliocentric stratification of asteroid spectral types may hold important information on the early evolution of the Solar System. Asteroid spectral taxonomy is based largely on principal component analysis. However, how the surface properties of asteroids, such as the composition and age, are projected in the principal-component (PC) space is not well understood. We decompose multi-band disk-resolved visible spectra of the Itokawa surface with principal component analysis (PCA) in comparison with main-belt asteroids. The obtained distribution of Itokawa spectra projected in the PC space of main-belt asteroids follows a linear trend linking the Q-type and S-type regions and is consistent with the results of space-weathering experiments on ordinary chondrites and olivine, suggesting that this trend may be a space-weathering-induced spectral evolution track for S-type asteroids. Comparison with space-weathering experiments also yields a short average surface age (component of Itokawa surface spectra is consistent with spectral change due to space weathering and that the spatial variation in the degree of space weathering is very large (a factor of three in surface age), which would strongly suggest the presence of strong regional/local resurfacing process(es) on this small asteroid.
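
    The basic operation behind the spectral taxonomy discussed in this record, projecting multi-band reflectance spectra onto principal components, can be sketched in a few lines. The example below uses random toy data and an SVD-based PCA; the band count, sample size, and function names are illustrative assumptions, not the study's pipeline.

```python
# Hedged sketch: principal component decomposition of multi-band spectra.
import numpy as np

def principal_components(spectra, n_components=2):
    """spectra: (n_samples, n_bands) reflectances. Returns (scores, components)."""
    x = spectra - spectra.mean(axis=0)              # mean-center each band
    u, s, vt = np.linalg.svd(x, full_matrices=False)
    scores = u[:, :n_components] * s[:n_components]
    return scores, vt[:n_components]

# toy data: 100 "pixels", 7 spectral bands
rng = np.random.default_rng(0)
spectra = 0.2 + 0.02 * rng.standard_normal((100, 7))
scores, comps = principal_components(spectra)
print(scores.shape, comps.shape)                    # (100, 2) (2, 7)
```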

  2. Structural reliability analysis based on the cokriging technique

    International Nuclear Information System (INIS)

    Zhao Wei; Wang Wei; Dai Hongzhe; Xue Guofeng

    2010-01-01

    Approximation methods are widely used in structural reliability analysis because they are simple to create and provide explicit functional relationships between the responses and variables instead of the implicit limit state function. Recently, the kriging method, which is a semi-parametric interpolation technique that can be used for deterministic optimization and structural reliability, has gained popularity. However, to fully exploit the kriging method, especially in high-dimensional problems, a large number of sample points should be generated to fill the design space, and this can be very expensive and even impractical in practical engineering analysis. Therefore, in this paper, a new method, the cokriging method, which is an extension of kriging, is proposed to calculate the structural reliability. Cokriging approximation incorporates secondary information such as the values of the gradients of the function being approximated. This paper explores the use of the cokriging method for structural reliability problems by comparing it with the kriging method based on some numerical examples. The results indicate that the cokriging procedure described in this work can generate approximation models with improved accuracy and efficiency for structural reliability problems and is a viable alternative to kriging.

  3. Particle-based shape analysis of multi-object complexes.

    Science.gov (United States)

    Cates, Joshua; Fletcher, P Thomas; Styner, Martin; Hazlett, Heather Cody; Whitaker, Ross

    2008-01-01

    This paper presents a new method for optimizing surface point correspondences for shape modeling of multiobject anatomy, or shape complexes. The proposed method is novel in that it optimizes correspondence positions in the full, joint shape space of the object complex. Researchers have previously only considered the correspondence problem separately for each structure, thus ignoring the interstructural shape correlations that are increasingly of interest in many clinical contexts, such as the study of the effects of disease on groups of neuroanatomical structures. The proposed method uses a nonparametric, dynamic particle system to simultaneously sample object surfaces and optimize correspondence point positions. This paper also suggests a principled approach to hypothesis testing using the Hotelling T2 test in the PCA space of the correspondence model, with a simulation-based choice of the number of PCA modes. We also consider statistical analysis of object poses. The modeling and analysis methods are illustrated on brain structure complexes from an ongoing clinical study of pediatric autism.
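
    The hypothesis-testing step mentioned in this record, a Hotelling T2 test applied in a low-dimensional PCA space, follows a standard two-sample formula. The sketch below illustrates that formula on random toy data; group sizes, dimensionality, and function names are assumptions for illustration, not the paper's implementation.

```python
# Hedged sketch: two-sample Hotelling T^2 test with an F-distribution p-value.
import numpy as np
from scipy.stats import f as f_dist

def hotelling_t2_two_sample(x, y):
    """x: (n1, p), y: (n2, p). Returns (T2 statistic, p-value)."""
    n1, p = x.shape
    n2 = y.shape[0]
    diff = x.mean(axis=0) - y.mean(axis=0)
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
                (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * diff @ np.linalg.solve(s_pooled, diff)
    f_stat = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    p_value = 1.0 - f_dist.cdf(f_stat, p, n1 + n2 - p - 1)
    return t2, p_value

rng = np.random.default_rng(1)
group_a = rng.standard_normal((20, 3))          # e.g. PCA scores of group A
group_b = rng.standard_normal((25, 3)) + 0.8    # group B shifted in all modes
print(hotelling_t2_two_sample(group_a, group_b))
```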

  4. Frailty phenotypes in the elderly based on cluster analysis

    DEFF Research Database (Denmark)

    Dato, Serena; Montesanto, Alberto; Lagani, Vincenzo

    2012-01-01

    Frailty is a physiological state characterized by the deregulation of multiple physiologic systems of an aging organism determining the loss of homeostatic capacity, which exposes the elderly to disability, diseases, and finally death. An operative definition of frailty, useful for the classifica...... genetic background on the frailty status is still questioned. We investigated the applicability of a cluster analysis approach based on specific geriatric parameters, previously set up and validated in a southern Italian population, to two large longitudinal Danish samples. In both cohorts, we identified...... groups of subjects homogeneous for their frailty status and characterized by different survival patterns. A subsequent survival analysis availing of Accelerated Failure Time models allowed us to formulate an operative index able to correlate classification variables with survival probability. From...

  5. Atomic force microscopy-based shape analysis of heart mitochondria.

    Science.gov (United States)

    Lee, Gi-Ja; Park, Hun-Kuk

    2015-01-01

    Atomic force microscopy (AFM) has become an important medical and biological tool for the noninvasive imaging of cells and biomaterials in medical, biological, and biophysical research. The major advantages of AFM over conventional optical and electron microscopes for bio-imaging include the facts that no special coating is required and that imaging can be done in all environments: air, vacuum, or aqueous conditions. In addition, it can also precisely determine piconewton to nanonewton force interactions between the probe tip and the sample surface from force-distance curve measurements. It is widely known that mitochondrial swelling is one of the most important indicators of the opening of the mitochondrial permeability transition (MPT) pore. As mitochondrial swelling is an ultrastructural change, quantitative analysis of this change requires high-resolution microscopic methods such as AFM. Here, we describe the use of AFM-based shape analysis for the characterization of nanostructural changes in heart mitochondria resulting from myocardial ischemia-reperfusion injury.

  6. Model Based Safety Analysis with smartIflow †

    Directory of Open Access Journals (Sweden)

    Philipp Hönig

    2017-01-01

    Full Text Available Verification of safety requirements is one important task during the development of safety-critical systems. The increasing complexity of systems makes manual analysis almost impossible. This paper introduces a new methodology for formal verification of technical systems with smartIflow (State Machines for Automation of Reliability-related Tasks using Information FLOWs). smartIflow is a new modeling language that has been especially designed for the purpose of automating the safety analysis process in early product life cycle stages. It builds on experience with existing approaches. As is common practice in current approaches, components are modeled as finite state machines. However, new concepts are introduced to describe component interactions. Events play a major role for internal interactions between components as well as for external (user) interactions. Our approach to the verification of formally specified safety requirements is a two-step method. First, an exhaustive simulation creates knowledge about a great variety of possible behaviors of the system, especially including reactions to suddenly occurring (possibly intermittent) faults. In the second step, safety requirements specified in CTL (Computation Tree Logic) are verified using model checking techniques, and counterexamples are generated if these are not satisfied. The practical applicability of this approach is demonstrated based on a Java implementation using a simple Two-Tank-Pump-Consumer system.

  7. Cnn Based Retinal Image Upscaling Using Zero Component Analysis

    Science.gov (United States)

    Nasonov, A.; Chesnakov, K.; Krylov, A.

    2017-05-01

    The aim of the paper is to obtain high-quality image upscaling for noisy images that are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods like DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures like blood vessels are preserved, the noise level is reduced, and no artifacts or non-existing details are added. These properties are essential in retinal diagnosis establishment, so the proposed algorithm is recommended to be used in real medical applications.
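
    The whitening step named in this record, Zero Component Analysis (ZCA), has a compact closed form based on the eigendecomposition of the data covariance. The sketch below applies it to random toy patches; the patch size, sample count, and epsilon value are assumptions for illustration, not the paper's settings.

```python
# Hedged sketch of ZCA whitening for flattened image patches.
import numpy as np

def zca_whiten(patches, eps=1e-5):
    """patches: (n_samples, n_features). Returns (whitened patches, ZCA matrix)."""
    x = patches - patches.mean(axis=0)
    cov = x.T @ x / x.shape[0]
    eigvals, eigvecs = np.linalg.eigh(cov)
    zca = eigvecs @ np.diag(1.0 / np.sqrt(eigvals + eps)) @ eigvecs.T
    return x @ zca, zca

rng = np.random.default_rng(2)
patches = rng.standard_normal((500, 64))      # e.g. 8x8 patches, flattened
white, zca_matrix = zca_whiten(patches)
print(np.allclose(np.cov(white, rowvar=False), np.eye(64), atol=0.2))  # ~identity
```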

  8. Performance based analysis of hidden beams in reinforced concrete structures

    Directory of Open Access Journals (Sweden)

    Helou Samir H.

    2014-01-01

    Full Text Available Local and perhaps regional vernacular reinforced concrete building construction leans heavily towards designing slabs with embedded hidden beams for flooring systems in most structures, including major edifices. The practice is distinctive in both framed and shear wall structures. Hidden beams are favoured structural elements due to their many inherent features: they save on floor height clearance, and they also save on formwork, labour and material cost. Moreover, hidden beams form an acceptable aesthetic appearance that does not hinder efficient interior space partitioning. Such beams have the added advantage of clearing the way for horizontal electromechanical ductwork. However, seismic considerations, in all likelihood, are seldom seriously addressed. The mentioned structural system of shallow beams is adopted in ribbed slabs, waffle slabs and at times with solid slabs. Ribbed slabs and waffle slabs are more prone to hidden beam inclusion due to the added effective height of the concrete section. Due to the presence of a relatively high reinforcement ratio at the joints, the sections at such locations tend to become less ductile, with unreliable contribution to spandrel force resistance. In the following study the structural influence of hidden beams within slabs is investigated, with the primary focus on a performance-based analysis of such elements within a structure. This is investigated with due attention to the shear wall contribution to the overall behaviour of such structures. Numerical results point in the direction that the function of hidden beams is not as adequate as desired. Therefore it is strongly believed that they are generally superfluous and may be eliminated altogether. Conversely, shallow beams seem to render the overall seismic capacity of the structure unreliable. Since such an argument is rarely manifested within the linear analysis domain, a pushover analysis exercise is thus mandatory for behaviour

  9. Diffraction-analysis-based characterization of very fine gratings

    Science.gov (United States)

    Bischoff, Joerg; Truckenbrodt, Horst; Bauer, Joachim J.

    1997-09-01

    Fine gratings with spatial periods below one micron, either ruled mechanically or patterned holographically, play a key role as encoders in high-precision translational or rotational coordinate or measuring machines. Besides, the fast in-line characterization of submicron patterns is a stringent demand in recent microelectronic technology. Thus, a rapid, destruction-free and highly accurate measuring technique is required to ensure the quality during manufacturing and for final testing. We propose an optical method which was already successfully introduced in the semiconductor industry. Here, the inverse scatter problem inherent in this diffraction-based approach is overcome by sophisticated data analysis such as multivariate regression or neural networks. Shortly sketched, the procedure is as follows: certain diffraction efficiencies are measured with an optical angle-resolved scatterometer and assigned to a number of profile parameters via data analysis (prediction). Beforehand, the specific measuring model has to be calibrated. If the wavelength-to-period ratio is well below unity, it is quite easy to gather enough diffraction orders. However, for gratings with spatial periods smaller than the probing wavelength, merely the specular reflex will propagate for perpendicular incidence (zero order grating). Consequently, it is virtually impossible to perform a regression analysis. A proper means to tackle this bottleneck is to record the zero-order reflex as a function of the incident angle. In this paper, the measurement of submicron gratings is discussed with the examples of 0.8, 1.0 and 1.4 micron period resist gratings on silicon, etched silicon oxide on silicon (same periods) and a 512 nm pitch chromium grating on quartz. Using a He-Ne laser with 633 nm wavelength and measuring the direct reflex in both linear polarizations, it is shown that even submicron patterning processes can be monitored and the resulting profiles with linewidths below a half micron can be

  10. Phylogeny and classification of Dickeya based on multilocus sequence analysis.

    Science.gov (United States)

    Marrero, Glorimar; Schneider, Kevin L; Jenkins, Daniel M; Alvarez, Anne M

    2013-09-01

    Bacterial heart rot of pineapple reported in Hawaii in 2003 and reoccurring in 2006 was caused by an undetermined species of Dickeya. Classification of the bacterial strains isolated from infected pineapple to one of the recognized Dickeya species and their phylogenetic relationships with Dickeya were determined by a multilocus sequence analysis (MLSA), based on the partial gene sequences of dnaA, dnaJ, dnaX, gyrB and recN. Individual and concatenated gene phylogenies revealed that the strains form a clade with reference Dickeya sp. isolated from pineapple in Malaysia and are closely related to D. zeae; however, previous DNA-DNA reassociation values suggest that these strains do not meet the genomic threshold for consideration in D. zeae, and require further taxonomic analysis. An analysis of the markers used in this MLSA determined that recN was the best overall marker for resolution of species within Dickeya. Differential intraspecies resolution was observed with the other markers, suggesting that marker selection is important for defining relationships within a clade. Phylogenies produced with gene sequences from the sequenced genomes of strains D. dadantii Ech586, D. dadantii Ech703 and D. zeae Ech1591 did not place the sequenced strains with members of other well-characterized members of their respective species. The average nucleotide identity (ANI) and tetranucleotide frequencies determined for the sequenced strains corroborated the results of the MLSA that D. dadantii Ech586 and D. dadantii Ech703 should be reclassified as Dickeya zeae Ech586 and Dickeya paradisiaca Ech703, respectively, whereas D. zeae Ech1591 should be reclassified as Dickeya chrysanthemi Ech1591.

  11. Consumer’s market analysis of products based on cassava

    Science.gov (United States)

    Unteawati, Bina; Fitriani; Fatih, Cholid

    2018-03-01

    Cassava products play an important role in enhancing household income in rural areas. Cassava, as a raw food material, is plentiful as a local food in Lampung, and processing it is one of the strategic value-addition activities. Value-addition activities are a key to enriching income sources in rural areas. Households process cassava into snacks or additional foods, but their production is small-scale, traditional, and discontinuous, and they lack technology, capital, and market access. Measuring the sustainability of their businesses is therefore important, since the market drives the business globally. This research aims to (1) describe the demand for locally produced cassava products in rural areas and (2) analyze consumers' perceptions of cassava products. The research took place in Lampung Province, covering Bandar Lampung and Metro City and the Pringsewu, Pesawaran, Central Lampung, and East Lampung districts, and was held from February until April 2017. Data were analyzed by descriptive statistics and multidimensional scaling. Based on the analysis, we conclude that (1) the demand for cassava products from rural areas is massive in volume and regularity, with enormous transactions. This fact is very important for the business cycle: continuous consumer demand will sustain the production of cassava products, producers of cassava products will buy fresh cassava from farmers, and regular consumption of fresh cassava by rural home industries will help balance the fresh cassava price at the farm gate. (2) Consumers' perceptions of cassava products in different markets showed that they prefer to consume cassava chips above other cassava products, followed by crackers, opak, and tiwul rice. Urban consumers prefer the products as snacks (chips, crumbs, and opak), with a consumption frequency of 2-5 times per week and purchase volumes of 1-3 kg, whereas rural consumers consume them more frequently, on a daily basis. Multidimensional scaling

  12. Ca analysis: an Excel based program for the analysis of intracellular calcium transients including multiple, simultaneous regression analysis.

    Science.gov (United States)

    Greensmith, David J

    2014-01-01

    Here I present an Excel based program for the analysis of intracellular Ca transients recorded using fluorescent indicators. The program can perform all the necessary steps which convert recorded raw voltage changes into meaningful physiological information. The program performs two fundamental processes. (1) It can prepare the raw signal by several methods. (2) It can then be used to analyze the prepared data to provide information such as absolute intracellular Ca levels. Also, the rates of change of Ca can be measured using multiple, simultaneous regression analysis. I demonstrate that this program performs equally well as commercially available software, but has numerous advantages, namely creating a simplified, self-contained analysis workflow. Copyright © 2013 The Author. Published by Elsevier Ireland Ltd. All rights reserved.

  13. Object-Based Image Analysis in Wetland Research: A Review

    Directory of Open Access Journals (Sweden)

    Iryna Dronova

    2015-05-01

    Full Text Available The applications of object-based image analysis (OBIA in remote sensing studies of wetlands have been growing over recent decades, addressing tasks from detection and delineation of wetland bodies to comprehensive analyses of within-wetland cover types and their change. Compared to pixel-based approaches, OBIA offers several important benefits to wetland analyses related to smoothing of the local noise, incorporating meaningful non-spectral features for class separation and accounting for landscape hierarchy of wetland ecosystem organization and structure. However, there has been little discussion on whether unique challenges of wetland environments can be uniformly addressed by OBIA across different types of data, spatial scales and research objectives, and to what extent technical and conceptual aspects of this framework may themselves present challenges in a complex wetland setting. This review presents a synthesis of 73 studies that applied OBIA to different types of remote sensing data, spatial scale and research objectives. It summarizes the progress and scope of OBIA uses in wetlands, key benefits of this approach, factors related to accuracy and uncertainty in its applications and the main research needs and directions to expand the OBIA capacity in the future wetland studies. Growing demands for higher-accuracy wetland characterization at both regional and local scales together with advances in very high resolution remote sensing and novel tasks in wetland restoration monitoring will likely continue active exploration of the OBIA potential in these diverse and complex environments.

  14. BWR online monitoring system based on noise analysis

    International Nuclear Information System (INIS)

    Ortiz-Villafuerte, Javier; Castillo-Duran, Rogelio; Alonso, Gustavo; Calleros-Micheland, Gabriel

    2006-01-01

    A monitoring system for the early detection, during operation, of anomalous and/or faulty behavior of equipment and systems related to the dynamics of a boiling water reactor (BWR) has been developed. The monitoring system is based on the analysis of the 'noise' or fluctuations of a signal from a sensor or measurement device. An efficient prime factor algorithm to compute the fast Fourier transform allows the continuous, real-time comparison of the normalized power spectral density function of the signal against previously stored reference patterns in a continuously evolving matrix. The monitoring system has been successfully tested offline. Four examples of the application of the monitoring system to the detection and diagnosis of faulty equipment behavior are presented in this work: the detection of two different events of partial blockage at the jet pump inlet nozzle, miscalibration of a recirculation mass flow sensor, and detection of a faulty data acquisition card. The events occurred at the two BWR units of the Laguna Verde Nuclear Power Plant. The monitoring system and its possible coupling to the data and processing information system of the Laguna Verde Nuclear Power Plant are described. The signal processing methodology is presented along with the introduction of the application of the evolutionary matrix concept for determining the base signature of reactor equipment or components and the detection of off-normal operation conditions.
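
    The signal-processing core described here, estimating a normalized power spectral density (NPSD) and comparing it with a stored reference pattern, can be illustrated with a short sketch. The sampling rate, segment length, deviation metric, and threshold below are illustrative assumptions only, not the plant system's actual parameters or algorithm.

```python
# Hedged sketch: normalized PSD of a sensor signal compared against a reference.
import numpy as np
from scipy.signal import welch

def normalized_psd(signal, fs):
    freqs, psd = welch(signal, fs=fs, nperseg=1024)
    return freqs, psd / psd.sum()                    # unit-sum normalization

def psd_deviation(npsd, reference_npsd):
    """Root-mean-square deviation between current and reference NPSD."""
    return float(np.sqrt(np.mean((npsd - reference_npsd) ** 2)))

fs = 200.0                                           # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)
reference = np.random.default_rng(3).standard_normal(t.size)      # baseline noise
current = reference + 0.5 * np.sin(2 * np.pi * 1.5 * t)           # new 1.5 Hz peak
_, ref_npsd = normalized_psd(reference, fs)
_, cur_npsd = normalized_psd(current, fs)
print(psd_deviation(cur_npsd, ref_npsd) > 1e-4)      # crude anomaly flag
```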

  15. Multicriteria Similarity-Based Anomaly Detection Using Pareto Depth Analysis.

    Science.gov (United States)

    Hsiao, Ko-Jen; Xu, Kevin S; Calder, Jeff; Hero, Alfred O

    2016-06-01

    We consider the problem of identifying patterns in a data set that exhibits anomalous behavior, often referred to as anomaly detection. Similarity-based anomaly detection algorithms detect abnormally large amounts of similarity or dissimilarity, e.g., as measured by the nearest neighbor Euclidean distances between a test sample and the training samples. In many application domains, there may not exist a single dissimilarity measure that captures all possible anomalous patterns. In such cases, multiple dissimilarity measures can be defined, including nonmetric measures, and one can test for anomalies by scalarizing using a nonnegative linear combination of them. If the relative importance of the different dissimilarity measures are not known in advance, as in many anomaly detection applications, the anomaly detection algorithm may need to be executed multiple times with different choices of weights in the linear combination. In this paper, we propose a method for similarity-based anomaly detection using a novel multicriteria dissimilarity measure, the Pareto depth. The proposed Pareto depth analysis (PDA) anomaly detection algorithm uses the concept of Pareto optimality to detect anomalies under multiple criteria without having to run an algorithm multiple times with different choices of weights. The proposed PDA approach is provably better than using linear combinations of the criteria, and shows superior performance on experiments with synthetic and real data sets.
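
    The central primitive behind the Pareto depth analysis described in this record is the peeling of successive Pareto fronts over points in a multi-criteria dissimilarity space. The sketch below computes only that depth assignment on random toy criteria; the full PDA anomaly score is built on such depths over dyads, and the data sizes and function name here are illustrative assumptions.

```python
# Hedged sketch: assign each point a Pareto depth by peeling non-dominated fronts
# (smaller criterion values are considered better).
import numpy as np

def pareto_depths(points):
    """points: (n, k) array of k dissimilarity criteria. Returns depth 1, 2, ..."""
    pts = np.asarray(points, dtype=float)
    depths = np.zeros(len(pts), dtype=int)
    remaining = np.arange(len(pts))
    depth = 0
    while remaining.size:
        depth += 1
        sub = pts[remaining]
        # a point is dominated if some other point is <= in all criteria
        # and strictly < in at least one
        dominated = np.array([
            bool(np.any(np.all(sub <= p, axis=1) & np.any(sub < p, axis=1)))
            for p in sub
        ])
        depths[remaining[~dominated]] = depth
        remaining = remaining[dominated]
    return depths

rng = np.random.default_rng(4)
criteria = rng.random((200, 2))                   # two toy dissimilarity measures
print(np.bincount(pareto_depths(criteria))[1:5])  # sizes of the first few fronts
```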

  16. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.; Kim, I.S.; Lofgren, E.V.

    1989-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations using a risk perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequency can be caused by configurations, and there are a number of such configurations that are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and also practical considerations such as adequate repair times and/or options to transfer to low-risk configurations. Alternative types of criteria are discussed that are not so overly restrictive as to result in unnecessary plant shutdown, but rather motivate effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety. 3 refs., 7 figs., 2 tabs

  17. Risk-based configuration control system: Analysis and approaches

    International Nuclear Information System (INIS)

    Samanta, P.K.; Kim, I.S.; Lofgren, E.V.; Vesely, W.E.

    1990-01-01

    This paper presents an analysis of risks associated with component outage configurations during power operation of a nuclear power plant and discusses approaches and strategies for developing a risk-based configuration control system. A configuration, as used here, is a set of component states. The objective of risk-based configuration control is to detect and control plant configurations using a risk perspective. The configuration contributions to core-melt frequency and core-melt probability are studied for two plants. Large core-melt frequency can be caused by configurations, and there are a number of such configurations that are not currently controlled by technical specifications. However, the expected frequency of occurrence of the impacting configurations is small and the actual core-melt probability contributions are also generally small. Effective strategies and criteria for controlling configuration risks are presented. Such control strategies take into consideration the risks associated with configurations, the nature and characteristics of the configuration risks, and also practical considerations such as adequate repair times and/or options to transfer to low-risk configurations. Alternative types of criteria are discussed that are not so overly restrictive as to result in unnecessary plant shutdown, but rather motivate effective test and maintenance practices that control risk-significant configurations to allow continued operation with an adequate margin to meet challenges to safety.

  18. Advanced microgrid design and analysis for forward operating bases

    Science.gov (United States)

    Reasoner, Jonathan

    This thesis takes a holistic approach in creating an improved electric power generation system for a forward operating base (FOB) in the future through the design of an isolated microgrid. After an extensive literature search, this thesis found a need for drastic improvement of the FOB power system. A thorough design process analyzed FOB demand, researched demand side management improvements, evaluated various generation sources and energy storage options, and performed a HOMER discrete optimization to determine the best microgrid design. Further sensitivity analysis was performed to see how changing parameters would affect the outcome. Lastly, this research also looks at some of the challenges which are associated with incorporating a design which relies heavily on inverter-based generation sources, and gives possible solutions to help make a renewable energy powered microgrid a reality. While this thesis uses a FOB as the case study, the process and discussion can be adapted to aid in the design of an off-grid small-scale power grid which utilizes high penetration levels of renewable energy.

  19. Poka Yoke system based on image analysis and object recognition

    Science.gov (United States)

    Belu, N.; Ionescu, L. M.; Misztal, A.; Mazăre, A.

    2015-11-01

    Poka Yoke is a method of quality management which is related to prevent faults from arising during production processes. It deals with “fail-sating” or “mistake-proofing”. The Poka-yoke concept was generated and developed by Shigeo Shingo for the Toyota Production System. Poka Yoke is used in many fields, especially in monitoring production processes. In many cases, identifying faults in a production process involves a higher cost than necessary cost of disposal. Usually, poke yoke solutions are based on multiple sensors that identify some nonconformities. This means the presence of different equipment (mechanical, electronic) on production line. As a consequence, coupled with the fact that the method itself is an invasive, affecting the production process, would increase its price diagnostics. The bulky machines are the means by which a Poka Yoke system can be implemented become more sophisticated. In this paper we propose a solution for the Poka Yoke system based on image analysis and identification of faults. The solution consists of a module for image acquisition, mid-level processing and an object recognition module using associative memory (Hopfield network type). All are integrated into an embedded system with AD (Analog to Digital) converter and Zync 7000 (22 nm technology).

  20. Mouse-based genetic modeling and analysis of Down syndrome

    Science.gov (United States)

    Xing, Zhuo; Li, Yichen; Pao, Annie; Bennett, Abigail S.; Tycko, Benjamin; Mobley, William C.; Yu, Y. Eugene

    2016-01-01

    Introduction: Down syndrome (DS), caused by human trisomy 21 (Ts21), can be considered as a prototypical model for understanding the effects of chromosomal aneuploidies in other diseases. Human chromosome 21 (Hsa21) is syntenically conserved with three regions in the mouse genome. Sources of data: A review of recent advances in genetic modeling and analysis of DS. Using Cre/loxP-mediated chromosome engineering, a substantial number of new mouse models of DS have recently been generated, which facilitates better understanding of disease mechanisms in DS. Areas of agreement: Based on evolutionary conservation, Ts21 can be modeled by engineered triplication of Hsa21 syntenic regions in mice. The validity of the models is supported by the exhibition of DS-related phenotypes. Areas of controversy: Although substantial progress has been made, it remains a challenge to unravel the relative importance of specific candidate genes and molecular mechanisms underlying the various clinical phenotypes. Growing points: Further understanding of mechanisms based on data from mouse models, in parallel with human studies, may lead to novel therapies for clinical manifestations of Ts21 and insights to the roles of aneuploidies in other developmental disorders and cancers. PMID:27789459

  1. Nonlinear Resonance Analysis of Slender Portal Frames under Base Excitation

    Directory of Open Access Journals (Sweden)

    Luis Fernando Paullo Muñoz

    2017-01-01

    The dynamic nonlinear response and stability of slender structures in the main resonance regions are a topic of importance in structural analysis. In complex problems, the determination of the response in the frequency domain indirectly obtained through analyses in the time domain can lead to huge computational effort in large systems. In nonlinear cases, the response in the frequency domain becomes even more cumbersome because of the possibility of multiple solutions for certain forcing frequencies. Those solutions can be stable or unstable, with saddle-node bifurcations in particular occurring at the turning points along the resonance curves. In this work, an incremental technique for direct calculation of the nonlinear response in the frequency domain of plane frames subjected to base excitation is proposed. The transformation of the equations of motion to the frequency domain is made through the harmonic balance method in conjunction with the Galerkin method. The resulting system of nonlinear equations in terms of the modal amplitudes and forcing frequency is solved by the Newton-Raphson method together with an arc-length procedure to obtain the nonlinear resonance curves. Suitable examples are presented, and the influence of the frame geometric parameters and base motion on the nonlinear resonance curves is investigated.
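
    As a one-degree-of-freedom illustration of the harmonic balance step (a deliberate simplification of the plane-frame formulation, with our own notation), consider the relative displacement x of a Duffing-type oscillator under harmonic base acceleration,

        \[ \ddot{x} + 2\zeta\omega_n\dot{x} + \omega_n^2 x + \gamma x^3 = a\,\Omega^2\cos\Omega t . \]

    Substituting the single-harmonic ansatz x(t) = A cos(Ωt − φ) and balancing the cosine and sine terms gives the implicit frequency-response equation

        \[ \Big[(\omega_n^2-\Omega^2)A + \tfrac{3}{4}\gamma A^3\Big]^2 + \big(2\zeta\omega_n\Omega A\big)^2 = \big(a\,\Omega^2\big)^2 , \]

    a nonlinear algebraic relation between the amplitude A and the forcing frequency Ω of exactly the kind traced by the Newton-Raphson/arc-length continuation described above; several roots A for the same Ω correspond to the coexisting stable and unstable branches of the resonance curve.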

  2. Performance of Water-Based Liquid Scintillator: An Independent Analysis

    Directory of Open Access Journals (Sweden)

    D. Beznosko

    2014-01-01

    The water-based liquid scintillator (WbLS) is a new material currently under development. It is based on the idea of dissolving the organic scintillator in water using special surfactants. This material strives to enable novel detection techniques by combining Cerenkov rings and scintillation light, as well as to reduce total cost compared to pure liquid scintillator (LS). An independent analysis of the light yield measurements using three different proton beam energies (210 MeV, 475 MeV, and 2000 MeV) for water, two different WbLS formulations (0.4% and 0.99%), and pure LS, conducted at Brookhaven National Laboratory, USA, is presented. The results show that a goal of ~100 optical photons/MeV, indicated by the simulation to be an optimal light yield for observing both the Cerenkov ring and the scintillation light from the proton decay in a large water detector, has been achieved.

  3. IMU-Based Joint Angle Measurement for Gait Analysis

    Directory of Open Access Journals (Sweden)

    Thomas Seel

    2014-04-01

    This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.

  4. Shape modeling and analysis with entropy-based particle systems.

    Science.gov (United States)

    Cates, Joshua; Fletcher, P Thomas; Styner, Martin; Shenton, Martha; Whitaker, Ross

    2007-01-01

    This paper presents a new method for constructing compact statistical point-based models of ensembles of similar shapes that does not rely on any specific surface parameterization. The method requires very little preprocessing or parameter tuning, and is applicable to a wider range of problems than existing methods, including nonmanifold surfaces and objects of arbitrary topology. The proposed method is to construct a point-based sampling of the shape ensemble that simultaneously maximizes both the geometric accuracy and the statistical simplicity of the model. Surface point samples, which also define the shape-to-shape correspondences, are modeled as sets of dynamic particles that are constrained to lie on a set of implicit surfaces. Sample positions are optimized by gradient descent on an energy function that balances the negative entropy of the distribution on each shape with the positive entropy of the ensemble of shapes. We also extend the method with a curvature-adaptive sampling strategy in order to better approximate the geometry of the objects. This paper presents the formulation; several synthetic examples in two and three dimensions; and an application to the statistical shape analysis of the caudate and hippocampus brain structures from two clinical studies.
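
    In schematic form (notation ours, inferred from the description above rather than quoted from the paper), the trade-off can be written as the minimization of

        \[ Q \;=\; H(Z) \;-\; \sum_{k=1}^{N} H(P_k), \]

    where H(P_k) is the entropy of the particle distribution on the k-th shape, whose maximization spreads the samples evenly over that surface (geometric accuracy), and H(Z) is the entropy of the ensemble of corresponding particle configurations in shape space, whose minimization favors a compact statistical model; the particles are moved by gradient descent on Q while constrained to the implicit surfaces.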

  5. A Flocking Based algorithm for Document Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL]; Gao, Jinzhu [ORNL]; Potok, Thomas E [ORNL]

    2006-01-01

    Social animals or insects in nature often exhibit a form of emergent collective behavior known as flocking. In this paper, we present a novel Flocking-based approach for document clustering analysis. Our Flocking clustering algorithm uses stochastic and heuristic principles discovered from observing bird flocks or fish schools. Unlike other partition clustering algorithms such as K-means, the Flocking-based algorithm does not require initial partitional seeds. The algorithm generates a clustering of a given set of data through the embedding of the high-dimensional data items on a two-dimensional grid for easy clustering result retrieval and visualization. Inspired by the self-organized behavior of bird flocks, we represent each document object with a flock boid. The simple local rules followed by each flock boid result in the entire document flock generating complex global behaviors, which eventually result in a clustering of the documents. We evaluate the efficiency of our algorithm with both a synthetic dataset and a real document collection that includes 100 news articles collected from the Internet. Our results show that the Flocking clustering algorithm achieves better performance compared to the K-means and the Ant clustering algorithms for real document clustering.
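
    A minimal sketch of the kind of boid update used in flocking-style clustering is given below (our own simplified illustration, not the ORNL implementation; the neighbourhood radius, rule weights, toroidal grid and the use of cosine similarity between document vectors are all assumptions). Similar documents follow the alignment/cohesion rules and therefore gather, while dissimilar neighbours only repel, so spatial clusters emerge.

        import numpy as np

        def cosine_sim(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

        def flock_step(pos, vel, docs, grid=50.0, radius=5.0, sim_thresh=0.5,
                       w_sep=1.0, w_ali=0.5, w_coh=0.5, max_speed=1.0):
            """One synchronous update of all document boids on a toroidal 2D grid."""
            new_vel = vel.copy()
            for i in range(len(pos)):
                sep, ali, coh = np.zeros(2), np.zeros(2), np.zeros(2)
                n_similar = 0
                for j in range(len(pos)):
                    if i == j:
                        continue
                    d = pos[j] - pos[i]
                    dist = np.linalg.norm(d)
                    if dist > radius:
                        continue
                    if cosine_sim(docs[i], docs[j]) >= sim_thresh:
                        ali = ali + vel[j]                    # align with similar neighbours
                        coh = coh + d                         # and move towards them
                        n_similar += 1
                    else:
                        sep = sep - d / (dist + 1e-12)        # push away from dissimilar ones
                if n_similar:
                    ali, coh = ali / n_similar, coh / n_similar
                v = vel[i] + w_sep * sep + w_ali * ali + w_coh * coh
                speed = np.linalg.norm(v)
                if speed > max_speed:
                    v = v / speed * max_speed
                new_vel[i] = v
            return (pos + new_vel) % grid, new_vel

        # Toy run: 60 documents drawn from 3 hidden topics, represented by noisy topic vectors.
        rng = np.random.default_rng(1)
        topics = rng.standard_normal((3, 20))
        docs = np.vstack([topics[k] + 0.1 * rng.standard_normal(20)
                          for k in rng.integers(0, 3, 60)])
        pos = rng.random((60, 2)) * 50.0
        vel = rng.standard_normal((60, 2)) * 0.1
        for _ in range(200):
            pos, vel = flock_step(pos, vel, docs)
        # After the run, boids of documents from the same topic end up spatially grouped.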

  6. IMU-based joint angle measurement for gait analysis.

    Science.gov (United States)

    Seel, Thomas; Raisch, Jörg; Schauer, Thomas

    2014-04-16

    This contribution is concerned with joint angle calculation based on inertial measurement data in the context of human motion analysis. Unlike most robotic devices, the human body lacks even surfaces and right angles. Therefore, we focus on methods that avoid assuming certain orientations in which the sensors are mounted with respect to the body segments. After a review of available methods that may cope with this challenge, we present a set of new methods for: (1) joint axis and position identification; and (2) flexion/extension joint angle measurement. In particular, we propose methods that use only gyroscopes and accelerometers and, therefore, do not rely on a homogeneous magnetic field. We provide results from gait trials of a transfemoral amputee in which we compare the inertial measurement unit (IMU)-based methods to an optical 3D motion capture system. Unlike most authors, we place the optical markers on anatomical landmarks instead of attaching them to the IMUs. Root mean square errors of the knee flexion/extension angles are found to be less than 1° on the prosthesis and about 3° on the human leg. For the plantar/dorsiflexion of the ankle, both deviations are about 1°.
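
    One common way to obtain a drift-reduced flexion/extension angle from gyroscopes and accelerometers alone is a complementary filter; the sketch below is our own illustration of that general idea (not the paper's algorithm), and it assumes the joint axis has already been identified so that a relative angular rate about that axis and an accelerometer-based angle estimate are available.

        import numpy as np

        def flexion_angle(gyro_rel, acc_angle, dt, alpha=0.01):
            """Fuse the integrated relative gyro rate (smooth but drifting) with an
            accelerometer-derived angle (noisy but drift-free).

            gyro_rel  : relative angular rate about the joint axis [rad/s]
            acc_angle : joint angle estimate from the two accelerometers [rad]
            dt        : sample period [s]
            alpha     : weight of the accelerometer correction per step
            """
            angle = np.zeros(len(gyro_rel))
            angle[0] = acc_angle[0]
            for k in range(1, len(gyro_rel)):
                predicted = angle[k - 1] + gyro_rel[k] * dt                    # gyro integration
                angle[k] = (1.0 - alpha) * predicted + alpha * acc_angle[k]    # slow correction
            return angle

        # Synthetic 10 s knee bend sampled at 100 Hz, with gyro bias and accelerometer noise.
        fs = 100.0
        t = np.arange(0.0, 10.0, 1.0 / fs)
        true = 0.5 * (1 - np.cos(2 * np.pi * 0.5 * t))                         # 0..1 rad flexion
        rng = np.random.default_rng(0)
        gyro = np.gradient(true, 1.0 / fs) + 0.02 + 0.01 * rng.standard_normal(t.size)
        acc = true + 0.05 * rng.standard_normal(t.size)
        est = flexion_angle(gyro, acc, 1.0 / fs)
        print("RMS error [deg]:", np.degrees(np.sqrt(np.mean((est - true) ** 2))))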

  7. Skin injury model classification based on shape vector analysis.

    Science.gov (United States)

    Röhrich, Emil; Thali, Michael; Schweitzer, Wolf

    2012-11-06

    Skin injuries can be crucial in judicial decision making. Forensic experts base their classification on subjective opinions. This study investigates whether known classes of simulated skin injuries are correctly classified statistically based on 3D surface models and derived numerical shape descriptors. Skin injury surface characteristics are simulated with plasticine. Six injury classes - abrasions, incised wounds, gunshot entry wounds, smooth and textured strangulation marks as well as patterned injuries - with 18 instances each are used for a k-fold cross validation with six partitions. Deformed plasticine models are captured with a 3D surface scanner. Mean curvature is estimated for each polygon surface vertex. Subsequently, distance distributions and derived aspect ratios, convex hulls, concentric spheres, hyperbolic points and Fourier transforms are used to generate 1284-dimensional shape vectors. Subsequent descriptor reduction maximizing SNR (signal-to-noise ratio) results in an average of 41 descriptors (varying across k-folds). With a non-normal multivariate distribution of heteroskedastic data, the requirements for LDA (linear discriminant analysis) are not met. Thus, the shrinkage parameters of RDA (regularized discriminant analysis) are optimized, yielding the best performance with λ = 0.99 and γ = 0.001. The Receiver Operating Characteristic of a descriptive RDA yields an ideal Area Under the Curve of 1.0 for all six categories. Predictive RDA results in an average CRR (correct recognition rate) of 97.22% under the six-partition k-fold. Adding uniform noise within the range of one standard deviation degrades the average CRR to 71.3%. Digitized 3D surface shape data can be used to automatically classify idealized shape models of simulated skin injuries. Deriving some well established descriptors such as histograms, saddle shape of hyperbolic points or convex hulls with subsequent reduction of dimensionality while maximizing SNR seem to work well for the data at hand, as

  8. Molecular-based recursive partitioning analysis model for glioblastoma in the temozolomide era: a correlative analysis based on NRG Oncology RTOG 0525

    NARCIS (Netherlands)

    Bell, Erica Hlavin; Pugh, Stephanie L.; McElroy, Joseph P.; Gilbert, Mark R.; Mehta, Minesh; Klimowicz, Alexander C.; Magliocco, Anthony; Bredel, Markus; Robe, Pierre; Grosu, Anca L.; Stupp, Roger; Curran, Walter; Becker, Aline P.; Salavaggione, Andrea L.; Barnholtz-Sloan, Jill S.; Aldape, Kenneth; Blumenthal, Deborah T.; Brown, Paul D.; Glass, Jon; Souhami, Luis; Lee, R. Jeffrey; Brachman, David; Flickinger, John; Won, Minhee; Chakravarti, Arnab

    2017-01-01

    IMPORTANCE: There is a need for a more refined, molecularly based classification model for glioblastoma (GBM) in the temozolomide era. OBJECTIVE: To refine the existing clinically based recursive partitioning analysis (RPA) model by incorporating molecular variables. DESIGN, SETTING, AND

  9. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    Science.gov (United States)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  10. Active Desiccant-Based Preconditioning Market Analysis and Product Development

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, J.

    2001-01-11

    The Phase 1 report (ORNL/Sub/94-SVO44/1), completed earlier in this program, involved a comprehensive field survey and market analysis comparing various specialized outdoor air handling units. This initial investigation included conventional cooling and reheat, conventional cooling with sensible recovery, total energy recovery systems (passive desiccant technology) and various active desiccant systems. The report concluded that several markets do promise a significant sales opportunity for a Climate Changer-based active desiccant system offering. (Climate Changer is a registered trademark of Trane Company.) This initial market analysis defined the wants and needs of the end customers (design engineers and building owners), which, along with subsequent information included in this report, have been used to guide the determination of the most promising active desiccant system configurations. This Phase 2 report begins with a summary of a more thorough investigation of those specific markets identified as most promising for active desiccant systems. Table 1 estimates the annual sales potential for a cost-effective product line of active desiccant systems, such as that built from Climate Changer modules. The Product Development Strategy section describes the active desiccant system configurations chosen to best fit the needs of the marketplace while minimizing system options. Key design objectives based on market research are listed in this report for these active desiccant systems. Corresponding performance goals for the dehumidification wheel required to meet the overall system design objectives are also defined. The Performance Modeling section describes the strategy used by SEMCO to design the dehumidification wheels integrated into the prototype systems currently being tested as part of the U.S. Department of Energy's Advanced Desiccant Technology Program. Actual performance data from wheel testing was used to revise the system performance and energy analysis

  11. TRANSPARENCY IN ELECTRONIC BUSINESS NEGOTIATIONS – EVIDENCE BASED ANALYSIS

    Directory of Open Access Journals (Sweden)

    Radoslav Delina

    2014-12-01

    Purpose: In the current economy, where ICT plays a crucial role in being competitive and effective, businesses are facing higher pressures of flexibility and efficiency than ever before. Transparency is often considered a suitable mechanism for better market prices and a more efficient market environment. An electronic business environment provides the possibility to set up a more transparent environment and bring higher competitiveness and efficiency to the market. The paper analyses the impact of transparency on prices in e-procurement. Methodology: Reverse auctions are considered a transparent tool that partially simulates real competition. They also allow several levels of transparency to be set up in the auction negotiation process. The impact of transparency on final prices was analysed on real data using relation-based analysis, where different transparency settings are compared against the achieved final price. Findings: Research results based on real data show that, in general, transparency in electronic reverse auctions can lead to more negative prices agreed by purchasers than current scientific and commercial promotion suggests. Research limitation: The significance of the research results is limited due to the still low readiness and skills of e-procurers. The validation of the results needs to be carried out over a longer period of time and in environments with different levels of e-readiness. The findings also reveal that transparency is a more complex issue, whose significance emerges only in specific market and negotiation situations. Value of paper: This evidence-based research reveals some controversial results which support new scientific efforts in microeconomics and in the socio-economic impact of ICT. It also affects practitioners in how they use and perceive the claimed impact of reverse auction solutions.

  12. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    Science.gov (United States)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  13. Comparison of stress-based and strain-based creep failure criteria for severe accident analysis

    International Nuclear Information System (INIS)

    Chavez, S.A.; Kelly, D.L.; Witt, R.J.; Stirn, D.P.

    1995-01-01

    We conducted a parametric analysis of stress-based and strain-based creep failure criteria to determine if there is a significant difference between the two criteria for SA533B vessel steel under severe accident conditions. Parametric variables include debris composition, system pressure, and creep strain histories derived from different testing programs and mathematically fit, with and without tertiary creep. Results indicate significant differences between the two criteria. The stress gradient plays an important role in determining which criterion will predict failure first. Creep failure was not very sensitive to different creep strain histories, except near the transition temperature of the vessel steel (900 K to 1000 K). Statistical analyses of creep failure data from four independent sources indicate that these data may be pooled, with a spline point at 1000 K. We found the Manson-Haferd parameter to have better failure predictive capability than the Larson-Miller parameter for the data studied. (orig.)
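
    For reference, the two time-temperature parameters compared above have the standard textbook forms (our restatement, not taken from the report itself): for absolute temperature T and time to rupture t_r,

        \[ P_{\mathrm{LM}} = T\,\big(C + \log_{10} t_r\big), \qquad
           P_{\mathrm{MH}} = \frac{\log_{10} t_r - \log_{10} t_a}{T - T_a}, \]

    where the Larson-Miller constant C and the Manson-Haferd convergence point (T_a, log10 t_a) are fitted to isothermal creep-rupture data; failure under a severe-accident transient is then predicted when the stress-dependent parameter value reaches the rupture value, for example through a time-fraction (damage) summation over the temperature history.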

  14. Distance Based Root Cause Analysis and Change Impact Analysis of Performance Regressions

    Directory of Open Access Journals (Sweden)

    Junzan Zhou

    2015-01-01

    Performance regression testing is applied to uncover both performance and functional problems of software releases. A performance problem revealed by performance testing can be high response time, low throughput, or even being out of service. A mature performance testing process helps systematically detect software performance problems. However, it is difficult to identify the root cause and evaluate the potential change impact. In this paper, we present an approach leveraging server-side logs for identifying root causes of performance problems. First, server-side logs are used to recover the call tree of each business transaction. We define a novel distance-based metric computed from call trees for root cause analysis and apply an inverted index from methods to business transactions for change impact analysis. Empirical studies show that our approach can effectively and efficiently help developers diagnose the root cause of performance problems.

  15. Feature-Based Analysis of Plasma-Based Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Geddes, Cameron G. R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Chen, Min [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]; Cormier-Michel, Estelle [Tech-X Corp., Boulder, CO (United States)]; Bethel, E. Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)]

    2014-02-01

    Plasma-based particle accelerators can produce and sustain thousands of times stronger acceleration fields than conventional particle accelerators, providing a potential solution to the problem of the growing size and cost of conventional particle accelerators. To facilitate scientific knowledge discovery from the ever growing collections of accelerator simulation data generated by accelerator physicists to investigate next-generation plasma-based particle accelerator designs, we describe a novel approach for automatic detection and classification of particle beams and beam substructures due to temporal differences in the acceleration process, here called acceleration features. The automatic feature detection in combination with a novel visualization tool for fast, intuitive, query-based exploration of acceleration features enables an effective top-down data exploration process, starting from a high-level, feature-based view down to the level of individual particles. We describe the application of our analysis in practice to analyze simulations of single pulse and dual and triple colliding pulse accelerator designs, and to study the formation and evolution of particle beams, to compare substructures of a beam and to investigate transverse particle loss.

  16. Geography-based structural analysis of the Internet

    Energy Technology Data Exchange (ETDEWEB)

    Kasiviswanathan, Shiva [Los Alamos National Laboratory]; Eidenbenz, Stephan [Los Alamos National Laboratory]; Yan, Guanhua [Los Alamos National Laboratory]

    2010-01-01

    In this paper, we study some geographic aspects of the Internet. We base our analysis on a large set of geolocated IP hop-level session data (including about 300,000 backbone routers, 150 million end hosts, and 1 billion sessions) that we synthesized from a variety of different input sources such as US census data, computer usage statistics, Internet market share data, IP geolocation data sets, CAIDA's Skitter data set for backbone connectivity, and BGP routing tables. We use this model to perform a nationwide and statewide geographic analysis of the Internet. Our main observations are: (1) There is a dominant coast-to-coast pattern in the US Internet traffic. In fact, in many instances even if the end-devices are not near either coast, still the traffic between them takes a long detour through the coasts. (2) More than half of the Internet paths are inflated by 100% or more compared to their corresponding geometric straight-line distance. This circuitousness makes the average ratio between the routing distance and geometric distance big (around 10). (3) The weighted mean hop count is around 5, but the hop counts are very loosely correlated with the distances. The weighted mean AS count (number of ASes traversed) is around 3. (4) The AS size and the AS location number distributions are heavy-tailed and strongly correlated. Most of the ASes are medium sized and there is a wide variability in the geographic dispersion size (measured in terms of the convex hull area) of these ASes.

  17. Design of process displays based on risk analysis techniques

    International Nuclear Information System (INIS)

    Lundtang Paulsen, J.

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: How are supervision systems designed today and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next we outline different methods for eliciting knowledge of a plant, particularly its risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from eight years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information for the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all the information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and the visualization of their information. The purpose was to develop a software tool for maintenance supervision of components in a nuclear power plant. (au)

  18. Evaluation of nuclear reactor based activation analysis techniques

    International Nuclear Information System (INIS)

    Obrusnik, I.; Kucera, J.

    1977-09-01

    A survey is presented of the basic types of activation analysis applied in environmental control. Reactor neutron activation analysis is described (including the reactor as a neutron source, sample activation in the reactor, methodology of neutron activation analysis, sample transport into the reactor and sample packaging after irradiation, instrumental activation analysis with radiochemical separation, data measurement and evaluation, sampling and sample preparation). Sources of environmental contamination with trace elements, sampling and sample analysis by neutron activation are described. The analysis is described of soils, waters and biological materials. Methods are shown of evaluating neutron activation analysis results and of their interpretation for purposes of environmental control. (J.B.)

  19. Structural Decomposition Analysis of China’s Industrial Energy Consumption Based on Input-Output Analysis

    Science.gov (United States)

    Huang, X. Y.; Zhou, J. Q.; Wang, Z.; Deng, L. C.; Hong, S.

    2017-05-01

    China is now at a stage of accelerated industrialization and urbanization, with energy-intensive industries contributing a large proportion of economic growth. In this study, we examined industrial energy consumption by decomposition analysis to describe the driving factors of energy consumption in China. Based on input-output (I-O) tables from the World Input-Output Database (WIOD) website and China’s energy use data from 1995 to 2011, we studied the sectoral changes in energy efficiency during the examined period. The results showed that all industries increased their energy efficiency. Energy consumption was decomposed into three factors by the logarithmic mean Divisia index (LMDI) method. The increase in production output was the leading factor driving up China’s energy consumption. World Trade Organization accession and financial crises had a great impact on energy consumption. Based on these results, a series of energy policy suggestions for decision-makers has been proposed.
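
    In the standard additive LMDI form (a general textbook formulation with our own symbols, not necessarily the exact factorization used in the study), industrial energy use is written as E = Σ_i Q s_i e_i, with total output Q, sector share s_i and sector energy intensity e_i, and the change between years 0 and T decomposes exactly into activity, structure and intensity effects:

        \[ \Delta E = \Delta E_{\mathrm{act}} + \Delta E_{\mathrm{str}} + \Delta E_{\mathrm{int}},
           \qquad \Delta E_{\mathrm{act}} = \sum_i L\!\left(E_i^{T},E_i^{0}\right)\ln\frac{Q^{T}}{Q^{0}}, \]

    with analogous expressions for the structure (s_i) and intensity (e_i) terms, where L(a,b) = (a-b)/(ln a - ln b) is the logarithmic mean. The increase in production output reported above as the leading driver corresponds to the activity term.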

  20. BMP analysis system for watershed-based stormwater management.

    Science.gov (United States)

    Zhen, Jenny; Shoemaker, Leslie; Riverson, John; Alvi, Khalid; Cheng, Mow-Soung

    2006-01-01

    Best Management Practices (BMPs) are measures for mitigating nonpoint source (NPS) pollution caused mainly by stormwater runoff. Established urban and newly developing areas must develop cost effective means for restoring or minimizing impacts, and planning future growth. Prince George's County in Maryland, USA, a fast-growing region in the Washington, DC metropolitan area, has developed a number of tools to support analysis and decision making for stormwater management planning and design at the watershed level. These tools support watershed analysis, innovative BMPs, and optimization. Application of these tools can help achieve environmental goals and lead to significant cost savings. This project includes software development that utilizes GIS information and technology, integrates BMP processes simulation models, and applies system optimization techniques for BMP planning and selection. The system employs the ESRI ArcGIS as the platform, and provides GIS-based visualization and support for developing networks including sequences of land uses, BMPs, and stream reaches. The system also provides interfaces for BMP placement, BMP attribute data input, and decision optimization management. The system includes a stand-alone BMP simulation and evaluation module, which complements both research and regulatory nonpoint source control assessment efforts, and allows flexibility in the examining various BMP design alternatives. Process based simulation of BMPs provides a technique that is sensitive to local climate and rainfall patterns. The system incorporates a meta-heuristic optimization technique to find the most cost-effective BMP placement and implementation plan given a control target, or a fixed cost. A case study is presented to demonstrate the application of the Prince George's County system. The case study involves a highly urbanized area in the Anacostia River (a tributary to Potomac River) watershed southeast of Washington, DC. An innovative system of

  1. Best hits of 11110110111: model-free selection and parameter-free sensitivity calculation of spaced seeds.

    Science.gov (United States)

    Noé, Laurent

    2017-01-01

    Spaced seeds, also named gapped q-grams, gapped k-mers, or spaced q-grams, have been proven to be more sensitive than contiguous seeds (contiguous q-grams, contiguous k-mers) in nucleic and amino-acid sequence analysis. Initially proposed to detect sequence similarities and to anchor sequence alignments, spaced seeds have more recently been applied in several alignment-free related methods. Unfortunately, spaced seeds need to be initially designed. This task is known to be time-consuming due to the number of spaced seed candidates. Moreover, it can be altered by a set of arbitrarily chosen parameters from the probabilistic alignment models used. In this general context, Dominant seeds have been introduced by Mak and Benson (Bioinformatics 25:302-308, 2009) on the Bernoulli model, in order to reduce the number of spaced seed candidates that are further processed in a parameter-free calculation of the sensitivity. We expand the scope of the work of Mak and Benson on single and multiple seeds by considering the Hit Integration model of Chung and Park (BMC Bioinform 11:31, 2010), demonstrate that the same dominance definition can be applied, and that a parameter-free study can be performed without any significant additional cost. We also consider two new discrete models, namely the Heaviside and the Dirac models, where lossless seeds can be integrated. From a theoretical standpoint, we establish a generic framework on all the proposed models, by applying a counting semi-ring to quickly compute the large polynomial coefficients needed by the dominance filter. From a practical standpoint, we confirm that dominant seeds reduce the set of either single seeds to thoroughly analyse or multiple seeds to store. Moreover, at http://bioinfo.cristal.univ-lille.fr/yass/iedera_dominance, we provide a full list of spaced seeds computed on the four aforementioned models, with one (continuous) parameter left free for each model, and with several (discrete) alignment lengths.
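
    To make the seed notation concrete, a minimal Python sketch of hit detection with the spaced seed 11110110111 follows (illustrative only; the cited tools use hashing and indexing rather than this quadratic scan):

        def seed_positions(seed):
            """Offsets of the '1' (match-required) positions of a spaced seed."""
            return [i for i, c in enumerate(seed) if c == "1"]

        def hits(query, target, seed="11110110111"):
            """All (i, j) offset pairs where the two sequences agree on every '1'
            position of the seed; '0' positions are wildcards (mismatches allowed)."""
            idx = seed_positions(seed)
            span = len(seed)
            found = []
            for i in range(len(query) - span + 1):
                for j in range(len(target) - span + 1):
                    if all(query[i + k] == target[j + k] for k in idx):
                        found.append((i, j))
            return found

        q = "ACGTACGTTGC"
        t = "ACGTTCGTTGC"      # single mismatch at offset 4, a wildcard position of the seed
        print(hits(q, t))      # [(0, 0)]: the spaced seed still reports the shared region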

  2. Forecasting the density of oil futures returns using model-free implied volatility and high-frequency data

    International Nuclear Information System (INIS)

    Ielpo, Florian; Sevi, Benoit

    2013-09-01

    Forecasting the density of returns is useful for many purposes in finance, such as risk management activities, portfolio choice or derivative security pricing. Existing methods to forecast the density of returns either use prices of the asset of interest or option prices on this same asset. The latter method needs to convert the risk-neutral estimate of the density into a physical measure, which is computationally cumbersome. In this paper, we take the view of a practitioner who observes the implied volatility in the form of an index, namely the recent OVX, to forecast the density of oil futures returns for horizons going from 1 to 60 days. Using the recent methodology in Maheu and McCurdy (2011) to compute density predictions, we compare the performance of time series models using implied volatility and either daily or intra-daily futures prices. Our results indicate that models based on implied volatility deliver significantly better density forecasts at all horizons, which is in line with numerous studies delivering the same evidence for volatility point forecasts. (authors)

  3. SAFETY-BASED CAPACITY ANALYSIS FOR CHINESE HIGHWAYS

    Directory of Open Access Journals (Sweden)

    Ping YI, Ph.D.

    2004-01-01

    Many years of research have led to the development of theories and methodologies in roadway capacity analysis in the developed countries. However, those resources are tied to the roadway design and traffic control practices of their home countries and cannot simply be transferred to China for application. For example, the Highway Capacity Manual in the United States describes roadway capacity under ideal conditions and estimates practical capacities under prevailing conditions in the field. This capacity and the conditions for change are expected to be different on Chinese roadways, as the local roadway design (lane width, curves and grades), vehicle size, and traffic mix are different. This research looks into an approach to the capacity issue different from that of the Highway Capacity Manual. According to the car-following principle, this paper first describes the safety criteria that affect traffic operations. Several speed schemes are subsequently discussed as they are affected by the maximum speed achievable under the local conditions. The study has shown that the effect of geometric and traffic conditions can be effectively reflected in the maximum speed adopted by the drivers. For most Chinese highways without a posted speed limit, the choice of speed by the drivers from the safety perspective is believed to have incorporated considerations of the practical driving conditions. Based on this, a condition for capacity calculation is obtained by comparing the desired vs. safety-based distance headways. The formulations of the model are mathematically sound and physically meaningful, and preliminary testing of the model is encouraging. Future research includes field data acquisition for calibration and adjustment, and model testing on Chinese highways.
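
    As a hedged illustration of the car-following logic (a generic safe-stopping-distance formulation, not the paper's exact model), suppose drivers keep at least the safety-based headway

        \[ d_s(v) = v\,t_r + \frac{v^{2}}{2a_b} + L , \]

    i.e. reaction distance plus braking distance for deceleration a_b plus an effective vehicle length L. The per-lane flow sustainable at speed v (in m/s) is then

        \[ C(v) = \frac{3600\,v}{d_s(v)} \ \text{veh/h} , \]

    and a capacity estimate follows by evaluating C at the maximum speed drivers actually adopt under the local geometric and traffic conditions, which is exactly where the China-specific design and traffic-mix information enters in the approach above.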

  4. Survey of sampling-based methods for uncertainty and sensitivity analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jay Dean; Helton, Jon Craig; Sallaberry, Cedric J.; Storlie, Curt B. (Colorado State University, Fort Collins, CO)

    2006-06-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (1) Definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (2) Generation of samples from uncertain analysis inputs, (3) Propagation of sampled inputs through an analysis, (4) Presentation of uncertainty analysis results, and (5) Determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.
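
    A compact sketch of the basic workflow (steps 1-5) for a toy two-input model, using Latin hypercube sampling and rank (Spearman) correlations as the sensitivity measure; the model and the input distributions are invented for illustration:

        import numpy as np
        from scipy.stats import qmc, spearmanr

        # (1) Characterize epistemic uncertainty in the inputs: x1 ~ U(0, 1), x2 ~ U(10, 20).
        sampler = qmc.LatinHypercube(d=2, seed=0)
        u = sampler.random(n=500)                       # (2) generate a Latin hypercube sample
        x = qmc.scale(u, l_bounds=[0.0, 10.0], u_bounds=[1.0, 20.0])

        # (3) Propagate the sampled inputs through the (toy) analysis model.
        def model(x1, x2):
            return x1 ** 2 + 0.1 * x2 + 0.05 * x1 * x2

        y = model(x[:, 0], x[:, 1])

        # (4) Present the output uncertainty.
        print("mean = %.3f, 5%%/95%% percentiles = %.3f / %.3f"
              % (y.mean(), np.percentile(y, 5), np.percentile(y, 95)))

        # (5) Sensitivity via rank transformation: Spearman correlation of each input with y.
        for k in range(2):
            rho, p = spearmanr(x[:, k], y)
            print(f"x{k + 1}: Spearman rho = {rho:.2f} (p = {p:.1e})")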

  5. Survey of sampling-based methods for uncertainty and sensitivity analysis

    International Nuclear Information System (INIS)

    Helton, J.C.; Johnson, J.D.; Sallaberry, C.J.; Storlie, C.B.

    2006-01-01

    Sampling-based methods for uncertainty and sensitivity analysis are reviewed. The following topics are considered: (i) definition of probability distributions to characterize epistemic uncertainty in analysis inputs, (ii) generation of samples from uncertain analysis inputs, (iii) propagation of sampled inputs through an analysis, (iv) presentation of uncertainty analysis results, and (v) determination of sensitivity analysis results. Special attention is given to the determination of sensitivity analysis results, with brief descriptions and illustrations given for the following procedures/techniques: examination of scatterplots, correlation analysis, regression analysis, partial correlation analysis, rank transformations, statistical tests for patterns based on gridding, entropy tests for patterns based on gridding, nonparametric regression analysis, squared rank differences/rank correlation coefficient test, two-dimensional Kolmogorov-Smirnov test, tests for patterns based on distance measures, top down coefficient of concordance, and variance decomposition.

  6. Voxel-based morphometry and voxel-based diffusion tensor analysis in amyotrophic lateral sclerosis

    International Nuclear Information System (INIS)

    Chen Zhiye; Ma Lin; Lou Xin; Wang Yan

    2010-01-01

    Objective: To evaluate gray matter volume, white matter volume and FA value changes in amyotrophic lateral sclerosis (ALS) patients by voxel-based morphometry (VBM) and voxel-based diffusion tensor analysis (VBDTA). Methods: Thirty-nine definite or probable ALS patients diagnosed by the El Escorial standard and 39 healthy controls were recruited and underwent conventional MR scans and neuropsychological evaluation. The 3D FSPGR T1WI and DTI data were collected on a GE Medical 3.0 T MRI system. The 3D T1 structural images were normalized, segmented and smoothed, and then VBM analysis was performed. DTI data were acquired from 76 healthy controls, and an FA map template was made. FA maps generated from the DTI data of ALS patients and healthy controls were normalized to the FA map template for voxel-based analysis. ANCOVA was applied, controlling for age and total intracranial volume for VBM and for age for VBDTA. A statistical threshold of P<0.01 (uncorrected) and a cluster level of more than 20 contiguous voxels determined significance. Results: Statistical results showed no significant difference in the global volumes of gray matter and white matter, total intracranial volumes and gray matter fraction between ALS patients and healthy controls, but the white matter fraction of ALS patients (0.29 ± 0.02) was statistically significantly less than that of healthy controls (0.30 ± 0.02) (P=0.003). There was significant reduction of gray matter volumes in bilateral superior frontal gyri and precentral gyri, right middle frontal gyrus, right middle and inferior temporal gyrus, left superior occipital gyrus and cuneus and left insula in ALS patients when compared with healthy controls; and the regional reduction of white matter volumes in ALS patients mainly located in the genu of corpus callosum, bilateral medial frontal gyri, paracentral lobule and insula, right superior and middle frontal gyrus and left postcentral gyrus. VBDTA showed a decrease in FA values in bilateral

  7. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  8. Analysis of Human Mobility Based on Cellular Data

    Science.gov (United States)

    Arifiansyah, F.; Saptawati, G. A. P.

    2017-01-01

    Nowadays not only adults but even teenagers and children have their own mobile phones. This phenomenon indicates that the mobile phone has become an important part of everyday life. As a result, the amount of cellular data has also increased rapidly. Cellular data is defined as the data that records communication among mobile phone users. Cellular data is easy to obtain because telecommunications companies already record it for their billing systems. Billing data keeps a log of each user's cellular data usage over time, from which we can obtain information about communication between users. Through a data visualization process, interesting patterns can be seen in the raw cellular data, so that users can obtain prior knowledge to perform data analysis. Cellular data processing can be done using data mining to find human mobility patterns in the existing data. In this paper, we use frequent pattern mining and association rule mining to observe the relations between attributes in cellular data and then visualize them. We used the Weka tool for finding the rules in the data mining stage. In general, the utilization of cellular data can provide supporting information for the decision-making process and serve as data support to provide solutions and information needed by decision makers.
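
    As a small illustration of the frequent-pattern/association-rule step (here with the Python mlxtend library instead of Weka, and with invented call-record attributes), each call record is one-hot encoded and mined with Apriori:

        import pandas as pd
        from mlxtend.preprocessing import TransactionEncoder
        from mlxtend.frequent_patterns import apriori, association_rules

        # Each transaction: discretized attributes of one call record (values are invented).
        records = [
            ["weekday", "morning", "cell_A", "short_call"],
            ["weekday", "morning", "cell_A", "long_call"],
            ["weekend", "evening", "cell_B", "short_call"],
            ["weekday", "morning", "cell_A", "short_call"],
            ["weekend", "evening", "cell_B", "long_call"],
        ]

        te = TransactionEncoder()
        onehot = pd.DataFrame(te.fit_transform(records), columns=te.columns_)

        # Frequent itemsets with support >= 40%, then rules with confidence >= 0.8.
        itemsets = apriori(onehot, min_support=0.4, use_colnames=True)
        rules = association_rules(itemsets, metric="confidence", min_threshold=0.8)
        print(rules[["antecedents", "consequents", "support", "confidence"]])

    Rules such as {cell_A, morning} -> {weekday} would then be visualized alongside the mobility patterns discussed above.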

  9. Crater ejecta scaling laws: fundamental forms based on dimensional analysis

    International Nuclear Information System (INIS)

    Housen, K.R.; Schmidt, R.M.; Holsapple, K.A.

    1983-01-01

    A model of crater ejecta is constructed using dimensional analysis and a recently developed theory of energy and momentum coupling in cratering events. General relations are derived that provide a rationale for scaling laboratory measurements of ejecta to larger events. Specific expressions are presented for ejection velocities and ejecta blanket profiles in two limiting regimes of crater formation: the so-called gravity and strength regimes. In the gravity regime, ejecta velocities at geometrically similar launch points within craters vary as the square root of the product of crater radius and gravity. This relation implies geometric similarity of ejecta blankets. That is, the thickness of an ejecta blanket as a function of distance from the crater center is the same for all sizes of craters if the thickness and range are expressed in terms of crater radii. In the strength regime, ejecta velocities are independent of crater size. Consequently, ejecta blankets are not geometrically similar in this regime. For points away from the crater rim the expressions for ejecta velocities and thickness take the form of power laws. The exponents in these power laws are functions of an exponent, α, that appears in crater radius scaling relations. Thus experimental studies of the dependence of crater radius on impact conditions determine scaling relations for ejecta. Predicted ejection velocities and ejecta-blanket profiles, based on measured values of α, are compared to existing measurements of velocities and debris profiles
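
    Written out (with generic dimensionless functions for the launch-position dependence, in our notation), the two limiting regimes described above are

        \[ \text{gravity regime:}\quad v_e(x) \propto \sqrt{gR}\; f\!\left(\frac{x}{R}\right),
           \qquad \text{strength regime:}\quad v_e(x) \propto \sqrt{\frac{Y}{\rho}}\; h\!\left(\frac{x}{R}\right), \]

    where R is the crater radius, g the gravitational acceleration, Y the target strength and ρ its density. The gravity-regime form makes ejecta blankets geometrically similar (thickness and range scale with R), whereas the strength-regime velocities are independent of crater size; away from the rim, both f and h reduce to power laws whose exponents are fixed by the crater-radius scaling exponent α.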

  10. Wavelet packet-based insufficiency murmurs analysis method

    Science.gov (United States)

    Choi, Samjin; Jiang, Zhongwei

    2007-12-01

    In this paper, an analysis method for aortic and mitral insufficiency murmurs using the wavelet packet technique is proposed for classifying valvular heart defects. Considering the different frequency distributions between normal sounds and insufficiency murmurs in the frequency domain, we used two properties, the relative wavelet energy and the Shannon wavelet entropy, which describe the energy information and the entropy information at the selected frequency band, respectively. Then, signal-to-murmur ratio (SMR) measures, defined as the ratio between the frequency bands for normal heart sounds and for aortic and mitral insufficiency murmurs (allocated to 15.62-187.50 Hz and 187.50-703.12 Hz, respectively), were employed as a classification measure to identify insufficiency murmurs. The proposed measures were validated by some case studies. The 194 heart sound signals with 48 normal and 146 abnormal sound cases acquired from 6 healthy volunteers and 30 patients were tested. The normal sound signals, recorded by applying a self-produced wireless electric stethoscope system to subjects with no history of other heart complications, were used. Insufficiency murmurs were grouped into two valvular heart defects, aortic insufficiency and mitral insufficiency. These murmur subjects had no other coexistent valvular defects. As a result, the proposed insufficiency murmur detection method showed very high classification efficiency. Therefore, the proposed heart sound classification method based on the wavelet packet was validated for the classification of valvular heart defects, especially insufficiency murmurs.
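
    A minimal sketch of the band-energy computation with PyWavelets follows (illustrative only; the sampling rate, wavelet, decomposition depth and the mapping of packet nodes to the two bands are assumptions that would have to be matched to the 15.62-187.50 Hz and 187.50-703.12 Hz bands of the paper):

        import numpy as np
        import pywt

        def band_features(x, fs=4000.0, wavelet="db4", level=7):
            """Relative wavelet-packet energy, Shannon wavelet entropy and an SMR-style
            ratio for two frequency bands. With fs = 4000 Hz, level-7 packets are
            ~15.625 Hz wide, approximating the bands used in the paper."""
            wp = pywt.WaveletPacket(data=x, wavelet=wavelet, mode="symmetric", maxlevel=level)
            nodes = wp.get_level(level, order="freq")     # packets in frequency order
            width = (fs / 2.0) / len(nodes)               # ~15.625 Hz per packet
            energies = np.array([np.sum(np.asarray(n.data) ** 2) for n in nodes])

            def band_energy(lo, hi):
                return sum(energies[k] for k in range(len(nodes)) if lo <= k * width < hi)

            total = energies.sum()
            rel_low = band_energy(15.62, 187.50) / total      # "heart sound" band
            rel_high = band_energy(187.50, 703.12) / total    # "murmur" band
            p = energies / total
            entropy = -np.sum(p[p > 0] * np.log(p[p > 0]))    # Shannon wavelet entropy
            smr = rel_low / rel_high                          # one plausible reading of SMR
            return rel_low, rel_high, entropy, smr

        # Toy heart-sound-like signal: a low-frequency "sound" plus a weaker high-frequency "murmur".
        fs = 4000.0
        t = np.arange(0.0, 2.0, 1.0 / fs)
        x = np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 400 * t)
        print(band_features(x, fs))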

  11. Analysis of trait-based models in marine ecosystems

    DEFF Research Database (Denmark)

    Heilmann, Irene Louise Torpe

    …spatio-temporal pattern formation in a predator–prey system where animals move towards higher fitness. Reaction-diffusion systems have been used extensively to describe spatio-temporal patterns in a variety of systems. However, animals rarely move completely at random, as expressed by diffusion. This has led to models with taxis terms, describing individuals moving in the direction of an attractant. An example is chemotaxis models, where bacteria are attracted to a chemical substance. From an evolutionary perspective, it is expected that animals act so as to optimize their fitness. Based on this principle, a predator–prey system with fitness taxis and diffusion is proposed. Here, fitness taxis refers to animals moving towards higher values of fitness, and the specific growth rates of the populations are used as a measure of the fitness values. To determine the conditions for pattern formation, a linear stability analysis…

  12. Residual Stress Analysis Based on Acoustic and Optical Methods

    Directory of Open Access Journals (Sweden)

    Sanichiro Yoshida

    2016-02-01

    Co-application of acoustoelasticity and optical interferometry to residual stress analysis is discussed. The underlying idea is to combine the advantages of both methods. Acoustoelasticity is capable of evaluating a residual stress absolutely, but it is a single-point measurement. Optical interferometry is able to measure deformation, yielding two-dimensional, full-field data, but it is not suitable for absolute evaluation of residual stresses. By theoretically relating the deformation data to residual stresses, and calibrating them with the absolute residual stress evaluated at a reference point, it is possible to measure residual stresses quantitatively, nondestructively and two-dimensionally. The feasibility of the idea has been tested with a butt-jointed dissimilar plate specimen. A steel plate 18.5 mm wide, 50 mm long and 3.37 mm thick is braze-jointed to a cemented carbide plate of the same dimensions along the 18.5 mm side. Acoustoelasticity evaluates the elastic modulus at reference points via acoustic velocity measurement. A tensile load is applied to the specimen at a constant pulling rate in a stress range substantially lower than the yield stress. Optical interferometry measures the resulting acceleration field. Based on the theory of harmonic oscillation, the acceleration field is correlated to compressive and tensile residual stresses qualitatively. The acoustic and optical results show reasonable agreement in the compressive and tensile residual stresses, indicating the feasibility of the idea.

  13. Behavior Analysis Based on Coordinates of Body Tags

    Science.gov (United States)

    Luštrek, Mitja; Kaluža, Boštjan; Dovgan, Erik; Pogorelc, Bogdan; Gams, Matjaž

    This paper describes fall detection, activity recognition and the detection of anomalous gait in the Confidence project. The project aims to prolong the independence of the elderly by detecting falls and other types of behavior indicating a health problem. The behavior will be analyzed based on the coordinates of tags worn on the body. The coordinates will be detected with radio sensors. We describe two Confidence modules. The first one classifies the user's activity into one of six classes, including falling. The second one detects walking anomalies, such as limping, dizziness and hemiplegia. The walking analysis can automatically adapt to each person by using only the examples of normal walking of that person. Both modules employ machine learning: the paper focuses on the features they use and the effect of tag placement and sensor noise on the classification accuracy. Four tags were enough for an activity recognition accuracy of over 93% at moderate sensor noise, while six were needed to detect walking anomalies with an accuracy of over 90%.

  14. Defining Smart City. A Conceptual Framework Based on Keyword Analysis

    Directory of Open Access Journals (Sweden)

    Farnaz Mosannenzadeh

    2014-05-01

    “Smart city” is a concept that has been the subject of increasing attention in urban planning and governance during recent years. The first step to creating Smart Cities is to understand the concept. However, a brief review of the literature shows that the concept of the Smart City is the subject of controversy. Thus, the main purpose of this paper is to provide a conceptual framework to define the Smart City. To this aim, an extensive literature review was done. Then, a keyword analysis of the literature was carried out against the main research questions (why, what, who, when, where, how) and based on the three main domains involved in the policy decision-making process and Smart City plan development: academic, industrial and governmental. This resulted in a conceptual framework for the Smart City. The result clarifies the definition of the Smart City, while providing a framework to define each of the Smart City’s sub-systems. Moreover, urban authorities can apply this framework in Smart City initiatives in order to recognize their main goals, main components, and key stakeholders.

  15. Meta-analysis and Evidence-Based Psychosocial Intervention

    Directory of Open Access Journals (Sweden)

    Julio Sánchez-Meca

    2011-04-01

    Psychosocial interventions that are applied in practice should be those that have received the best scientific evidence about their effectiveness. Evidence-Based Psychosocial Intervention is a methodological tool that aims to raise awareness among professionals and policy makers of the need for professional practice to be guided by the best evidence. For this purpose, systematic reviews and meta-analyses of empirical evaluation studies play an important role, as they allow us to synthesize the results of numerous studies on the same issue to determine which are the best treatments and interventions for solving the problem. This article presents an overview of meta-analyses and the information they can provide for professional practice. The phases in which a meta-analysis is carried out are outlined as follows: (a) formulating the problem, (b) searching for the studies, (c) coding the studies, (d) calculating the effect size, (e) statistical techniques of integration and (f) publishing the study. The scope of meta-analyses and their results are illustrated with an example, and their implications for professional practice are discussed.
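
    For phase (d), the effect size typically computed for a two-group comparison is the standardized mean difference with Hedges' small-sample correction; a minimal sketch with standard formulas and invented example numbers:

        import math

        def hedges_g(mean1, sd1, n1, mean2, sd2, n2):
            """Standardized mean difference (Cohen's d) with Hedges' small-sample correction."""
            # Pooled standard deviation
            sp = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
            d = (mean1 - mean2) / sp
            j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)   # small-sample correction factor
            g = j * d
            # One common approximation of the sampling variance of g
            var_g = j ** 2 * ((n1 + n2) / (n1 * n2) + d ** 2 / (2.0 * (n1 + n2)))
            return g, var_g

        # Hypothetical treatment vs. control group summaries on some outcome scale.
        g, var_g = hedges_g(24.1, 6.0, 40, 20.3, 6.5, 38)
        se = math.sqrt(var_g)
        print(f"g = {g:.2f}, 95% CI = [{g - 1.96 * se:.2f}, {g + 1.96 * se:.2f}]")

    The per-study g values and variances are what the integration phase (e) then combines in a fixed- or random-effects model.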

  16. Life-cycle analysis of bio-based aviation fuels.

    Science.gov (United States)

    Han, Jeongwoo; Elgowainy, Amgad; Cai, Hao; Wang, Michael Q

    2013-12-01

    Well-to-wake (WTWa) analysis of bio-based aviation fuels, including hydroprocessed renewable jet (HRJ) from various oil seeds, Fischer-Tropsch jet (FTJ) from corn-stover and co-feeding of coal and corn-stover, and pyrolysis jet from corn stover, is conducted and compared with petroleum jet. WTWa GHG emission reductions relative to petroleum jet can be 41-63% for HRJ, 68-76% for pyrolysis jet and 89% for FTJ from corn stover. The HRJ production stage dominates WTWa GHG emissions from HRJ pathways. The differences in GHG emissions from HRJ production stage among considered feedstocks are much smaller than those from fertilizer use and N2O emissions related to feedstock collection stage. Sensitivity analyses on FTJ production from coal and corn-stover are also conducted, showing the importance of biomass share in the feedstock, carbon capture and sequestration options, and overall efficiency. For both HRJ and FTJ, co-product handling methods have significant impacts on WTWa results. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Content-Based Analysis of Bumper Stickers in Jordan

    Directory of Open Access Journals (Sweden)

    Abdullah A. Jaradat

    2016-12-01

    This study set out to investigate bumper stickers in Jordan, focusing mainly on the themes of the stickers. The study hypothesized that bumper stickers in Jordan reflect a wide range of topics, including social, economic, and political ones. Being the first study of this phenomenon, it adopted content-based analysis to determine the basic topics. The study found that the purpose of most bumper stickers is fun and humor; most of them are not serious and do not carry any biting messages. They do not present any criticism of the most dominant problems at the level of society, including racism, nepotism, anti-feminism, inflation, high taxes, and refugees. Another finding is that politics is still a taboo; no political bumper stickers were found in Jordan. Finally, the themes the stickers targeted are: lessons of life 28.85%; challenging or warning other drivers 16%; funny notes about social issues 12%; religious sayings 8%; treating the car as a female 7%; the low economic status of the driver 6%; love and treachery 5.5%; the prestigious status of the car 5%; envy 4%; nicknames for the car or the driver 4%; irony 3%; and English sayings 1.5%. Keywords: bumper stickers, themes, politics

  18. An Analysis of Excavation Support Safety Based on Experimental Studies

    Directory of Open Access Journals (Sweden)

    Gorska Karolina

    2015-09-01

    The article presents the results of inclinometric measurements and numerical analyses of soldier-pile wall displacements. The excavation under investigation was made in cohesive soils. The measurements were conducted at points located at the edge of the cantilever excavation support system. The displacements of the excavation support observed over the period of three years demonstrated a pattern of steady growth over the first two months, followed by a gradual levelling out to a final plateau. The numerical analyses were conducted based on 3D FEM models. The numerical analysis of the problem comprises calculations of the global structural safety factor depending on the displacement of chosen points in the lagging, conducted by means of the φ/c reduction procedure. The adopted graphical method of safety estimation is very conservative in the sense that it recognizes stability loss quite early, when one could further load the medium or weaken it by further strength reduction. The values of the Msf factor are relatively high. This is caused by the fact that the structure was designed for an excavation twice as deep. Nevertheless, the structure is treated as a temporary one.

  19. An Analysis of Excavation Support Safety Based on Experimental Studies

    Science.gov (United States)

    Gorska, Karolina; Wyjadłowski, Marek

    2015-09-01

    The article presents the results of inclinometric measurements and numerical analyses of soldier-pile wall displacements. The excavation under investigation was made in cohesive soils. The measurements were conducted at points located at the edge of the cantilever excavation support system. The displacements of the excavation support observed over the period of three years demonstrated a pattern of steady growth over the first two months, followed by a gradual levelling out to a final plateau. The numerical analyses were conducted based on 3D FEM models. The numerical analysis of the problem comprises calculations of the global structural safety factor depending on the displacement of chosen points in the lagging, conducted by means of the φ/c reduction procedure. The adopted graphical method of safety estimation is very conservative in the sense that it recognizes stability loss quite early, when one could further load the medium or weaken it by further strength reduction. The values of the Msf factor are relatively high. This is caused by the fact that the structure was designed for an excavation twice as deep. Nevertheless, the structure is treated as a temporary one.

  20. Satellite Based Analysis of Surface Urban Heat Island Intensity

    Directory of Open Access Journals (Sweden)

    Gémes Orsolya

    2016-06-01

    Full Text Available The most obvious characteristics of urban climate are higher air and surface temperatures compared to rural areas and a large spatial variation of meteorological parameters within the city. This research examines the long-term and seasonal development of urban surface temperature using satellite data over a period of 30 years and within a single year. The medium-resolution Landsat data were pre-processed using open source tools. Besides the analysis of long-term and seasonal changes in land surface temperature within the city, its relationship with changes in the vegetation cover was also investigated. Different urban districts and local climate zones showed varying strengths of correlation. The temperature difference between urban surfaces and their surroundings is defined as the surface urban heat island (SUHI). Its development shows remarkable seasonal and spatial anomalies. The satellite images can be used to visualize and analyze the SUHI, although they were not collected at midday or in the early afternoon, when the phenomenon is normally at its maximum. The applied methodology is based on free data and software and requires minimal user interaction. Using the results, new urban developments (new built-up and green areas) can be planned that help mitigate the negative effects of urban climate.
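
    As a rough illustration of the SUHI computation described above (not the authors' code), the sketch below derives the SUHI intensity and an LST-vegetation correlation from a land surface temperature array and an urban mask; all arrays and values are synthetic stand-ins for Landsat-derived products.

      # Minimal sketch: surface urban heat island (SUHI) intensity from a
      # land-surface-temperature (LST) raster and an urban mask (toy data).
      import numpy as np

      rng = np.random.default_rng(0)
      lst = 20.0 + 8.0 * rng.random((100, 100))      # LST in deg C, stand-in for a Landsat product
      urban_mask = np.zeros((100, 100), dtype=bool)  # True where the pixel is urban
      urban_mask[30:70, 30:70] = True
      ndvi = 0.8 - 0.5 * urban_mask + 0.1 * rng.random((100, 100))  # toy vegetation index

      # SUHI intensity: mean urban LST minus mean rural (non-urban) LST
      suhi = lst[urban_mask].mean() - lst[~urban_mask].mean()

      # Relationship between LST and vegetation cover (Pearson correlation)
      r = np.corrcoef(lst.ravel(), ndvi.ravel())[0, 1]
      print(f"SUHI intensity: {suhi:.2f} K, LST-NDVI correlation: {r:.2f}")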

  1. Mindfulness-based therapy: a comprehensive meta-analysis.

    Science.gov (United States)

    Khoury, Bassam; Lecomte, Tania; Fortin, Guillaume; Masse, Marjolaine; Therien, Phillip; Bouchard, Vanessa; Chapleau, Marie-Andrée; Paquin, Karine; Hofmann, Stefan G

    2013-08-01

    Mindfulness-based therapy (MBT) has become a popular form of intervention. However, the existing reviews report inconsistent findings. To clarify these inconsistencies in the literature, we conducted a comprehensive effect-size analysis to evaluate the efficacy of MBT. We conducted a systematic review of studies published in journals or dissertations indexed in PubMed or PsycINFO from the first available date until May 10, 2013. A total of 209 studies (n=12,145) were included. Effect-size estimates suggested that MBT is moderately effective in pre-post comparisons (n=72; Hedges' g=.55), in comparisons with waitlist controls (n=67; Hedges' g=.53), and when compared with other active treatments (n=68; Hedges' g=.33), including other psychological treatments (n=35; Hedges' g=.22). MBT did not differ from traditional CBT or behavioral therapies (n=9; Hedges' g=-.07) or pharmacological treatments (n=3; Hedges' g=.13). MBT is an effective treatment for a variety of psychological problems, and is especially effective for reducing anxiety, depression, and stress. Copyright © 2013 Elsevier Ltd. All rights reserved.
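
    For readers unfamiliar with the effect size used throughout this record, the following minimal Python sketch computes a bias-corrected standardized mean difference (Hedges' g) from summary statistics; the numbers are made up and are not taken from the meta-analysis.

      # Minimal sketch of the Hedges' g effect size (illustrative values only).
      import math

      def hedges_g(m1, s1, n1, m2, s2, n2):
          """Bias-corrected standardized mean difference (Hedges' g)."""
          s_pooled = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
          d = (m1 - m2) / s_pooled                 # Cohen's d
          j = 1.0 - 3.0 / (4.0 * (n1 + n2) - 9.0)  # small-sample correction factor
          return j * d

      # Example: treatment vs. waitlist symptom means/SDs (made-up numbers)
      print(round(hedges_g(12.0, 5.0, 40, 15.0, 6.0, 42), 2))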

  2. Analysis of Verbal Interactions in Problem-based Learning.

    Science.gov (United States)

    Oh, Sun A; Chung, Eun Kyung; Woo, Young Jong; Han, Eui Ryoung; Kim, Young Ok

    2010-06-01

    Problem-based learning (PBL) is a constructive learning environment in which ill-structured problems are solved through collaborative learning. The purpose of this study was to analyze the interaction of students and a tutor in a small-group PBL discussion, and to examine how the different types of interaction were distributed over the course of the meeting. Fourteen third-year subjects from Chonnam National University Medical School, Korea, formed two tutorial groups. Two tutorial sessions were videotaped and analyzed. All videotapes were transcribed to analyze the interaction types. The criteria of the interaction analysis were learning-oriented interactions (exploratory questioning, cumulative reasoning, handling conflicts about knowledge), procedural interactions, and irrelevant/off-task interactions. Nearly all discourse between tutors and students consisted of learning-oriented interactions. The results showed that students spent more time on cumulative reasoning. In contrast, tutors implemented more exploratory questioning. Little time was spent on handling conflicts about knowledge or on procedural and irrelevant/off-task interactions. To improve critical thinking and problem-solving competence in PBL, we should consider various efforts to encourage discussion about conflicting knowledge. A PBL tutor training program should be provided to facilitate PBL group discussions.

  3. Quantitative video-based gait pattern analysis for hemiparkinsonian rats.

    Science.gov (United States)

    Lee, Hsiao-Yu; Hsieh, Tsung-Hsun; Liang, Jen-I; Yeh, Ming-Long; Chen, Jia-Jin J

    2012-09-01

    Gait disturbances are common in the rat model of Parkinson's disease (PD) induced by administering 6-hydroxydopamine. However, few studies have simultaneously assessed spatiotemporal gait indices and the kinematic information of PD rats during overground locomotion. This study utilized a simple, accurate, and reproducible method for quantifying the spatiotemporal and kinematic changes of gait patterns in hemiparkinsonian rats. A transparent walkway with a tilted mirror was set up to capture underview footprints and lateral ankle joint images using a high-speed, high-resolution digital camera. The footprint images were semi-automatically processed with a threshold setting to identify the boundaries of the soles and the critical points of each hindlimb for deriving the spatiotemporal and kinematic indices of gait. Following the PD lesion, asymmetrical gait patterns were found, including a significant decrease in step/stride length and increases in the base of support and ankle joint angle. Increased footprint length, toe spread, and intermediary toe spread were also found, indicating a gait pattern that compensates for impaired locomotor function. The temporal indices showed a significant decrease in walking speed with increased durations of the stance/swing phases and double support time, which was more evident in the affected hindlimb. Furthermore, the ankle kinematic data showed that the joint angle decreased at the toe contact stage. We conclude that the proposed gait analysis method can be used to precisely detect locomotor function changes in PD rats, which is useful for the objective assessment of novel treatments in this PD animal model.
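
    A minimal sketch of how spatiotemporal gait indices of the kind listed above (stride length, base of support, stance/swing durations) can be derived once paw-contact positions and times have been digitised; the data layout and values are hypothetical, not the authors' processing pipeline.

      # Minimal sketch (hypothetical data layout): spatiotemporal gait indices
      # from digitised hind-paw contact points and contact/lift-off times.
      import numpy as np

      # (x, y) centres of successive left/right hind-paw prints in cm (toy values)
      left  = np.array([[0.0, 1.5], [10.2, 1.6], [20.1, 1.4]])
      right = np.array([[5.1, -1.5], [15.0, -1.4], [25.2, -1.6]])

      stride_left = np.linalg.norm(np.diff(left, axis=0), axis=1)    # same-paw distance
      step_lengths = np.abs(right[:, 0] - left[:, 0])                # left/right offset along travel
      base_of_support = np.abs(left[:, 1].mean() - right[:, 1].mean())

      # Temporal indices from contact/lift-off times (s) of one limb (toy values)
      contact = np.array([0.00, 0.62, 1.27])
      liftoff = np.array([0.41, 1.05, 1.70])
      stance = liftoff - contact
      swing = contact[1:] - liftoff[:-1]
      duty_factor = stance[:-1] / (stance[:-1] + swing)

      print(stride_left.mean(), step_lengths.mean(), base_of_support, duty_factor.mean())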

  4. Modeling Chinese ionospheric layer parameters based on EOF analysis

    Science.gov (United States)

    Yu, You; Wan, Weixing

    2016-04-01

    Using observations from 24 ionosondes in and around China during the 20th solar cycle, an assimilative model is constructed to map the ionospheric layer parameters (foF2, hmF2, M(3000)F2, and foE) over China based on empirical orthogonal function (EOF) analysis. First, we decompose the background maps from the International Reference Ionosphere model 2007 (IRI-07) into different EOF modes. The obtained EOF modes consist of two factors: the EOF patterns and the corresponding EOF amplitudes. These two factors respectively reflect the spatial distributions (e.g., the latitudinal dependence, such as the equatorial ionization anomaly structure, and the longitudinal structure with east-west differences) and the temporal variations on different time scales (e.g., solar cycle, annual, semiannual, and diurnal variations) of the layer parameters. Then, the EOF patterns and long-term ionosonde observations are assimilated to obtain the observed EOF amplitudes, which are further used to construct the Chinese Ionospheric Maps (CIMs) of the layer parameters. In contrast with the IRI-07 model, the mapped CIMs successfully capture the inherent temporal and spatial variations of the ionospheric layer parameters. Finally, a comparison of the modeled (EOF and IRI-07) and observed values reveals that the EOF model reproduces the observations with smaller root-mean-square errors and higher linear correlation coefficients. In addition, the IRI discrepancy at low latitudes, especially for foF2, is effectively removed by the EOF model.
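
    A minimal sketch of the EOF idea underlying the model, using numpy's SVD to split gridded maps into spatial patterns and time-varying amplitudes; the data array is a random stand-in, and the assimilation of ionosonde observations is only indicated in a comment.

      # Minimal sketch of an EOF decomposition of gridded maps (e.g. foF2) with SVD.
      import numpy as np

      rng = np.random.default_rng(1)
      n_time, n_grid = 240, 500          # e.g. hourly maps x flattened lat-lon grid
      maps = rng.random((n_time, n_grid))

      mean_map = maps.mean(axis=0)
      anom = maps - mean_map             # remove the time-mean background

      # SVD: rows of vt are the EOF spatial patterns, u*s are the EOF amplitudes
      u, s, vt = np.linalg.svd(anom, full_matrices=False)
      eof_patterns = vt                  # (mode, grid)
      amplitudes = u * s                 # (time, mode) time series of each mode

      explained = s**2 / np.sum(s**2)
      print("variance explained by first 3 modes:", explained[:3].round(3))

      # Reconstruction with the leading k modes (the assimilation step would instead
      # fit observed amplitudes to ionosonde data, which is beyond this sketch)
      k = 3
      recon = mean_map + amplitudes[:, :k] @ eof_patterns[:k, :]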

  5. Automatic comic page image understanding based on edge segment analysis

    Science.gov (United States)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of comic page images by detecting the storyboards and identifying the reading order automatically. It is a key technique for producing digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and to further identify the reading order of these storyboards. The proposed method was evaluated on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
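
    The sketch below illustrates the general flavour of edge-based layout analysis (Canny edges followed by line-segment detection on a synthetic page); it is a simplified stand-in, not the authors' edge-point chaining and top-down line detection scheme.

      # Minimal sketch: Canny edges plus probabilistic Hough line detection as a
      # rough stand-in for border-line extraction on a (synthetic) comic page.
      import cv2
      import numpy as np

      page = np.full((400, 300), 255, np.uint8)
      cv2.rectangle(page, (20, 20), (280, 180), 0, 2)     # a toy "storyboard" border
      cv2.rectangle(page, (20, 220), (280, 380), 0, 2)

      edges = cv2.Canny(page, 50, 150)
      lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                              minLineLength=80, maxLineGap=5)
      lines = lines if lines is not None else np.empty((0, 1, 4), int)

      # Keep near-horizontal/vertical border-line candidates; panels could then be
      # assembled from them and ordered for reading.
      borders = [(x1, y1, x2, y2) for x1, y1, x2, y2 in lines[:, 0]
                 if abs(x1 - x2) < 3 or abs(y1 - y2) < 3]
      print(len(borders), "axis-aligned border segments")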

  6. MESA: Message-Based System Analysis Using Runtime Verification

    Science.gov (United States)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site, and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to the large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and to interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
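
    As a toy illustration of the kind of trace property mentioned above (duplicate and out-of-order messages), the following plain-Python monitor checks a message trace; it only hints at what the TraceContract DSL expresses and is not the MESA implementation.

      # Minimal sketch of a trace monitor for duplicate and out-of-order messages.
      from dataclasses import dataclass

      @dataclass(frozen=True)
      class Msg:
          msg_id: str
          seq: int        # per-stream sequence number

      def check_trace(trace):
          """Yield violation strings for duplicate or out-of-order messages."""
          seen_ids = set()
          last_seq = None
          for m in trace:
              if m.msg_id in seen_ids:
                  yield f"duplicate message {m.msg_id}"
              seen_ids.add(m.msg_id)
              if last_seq is not None and m.seq < last_seq:
                  yield f"out-of-order message {m.msg_id} (seq {m.seq} after {last_seq})"
              last_seq = m.seq if last_seq is None else max(last_seq, m.seq)

      trace = [Msg("a", 1), Msg("b", 2), Msg("b", 2), Msg("c", 1)]
      for violation in check_trace(trace):
          print(violation)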

  7. Robust Discriminant Analysis Based on Nonparametric Maximum Entropy

    Science.gov (United States)

    He, Ran; Hu, Bao-Gang; Yuan, Xiao-Tong

    In this paper, we propose a Robust Discriminant Analysis based on the maximum entropy (MaxEnt) criterion (MaxEnt-RDA), which is derived from a nonparametric estimate of Renyi's quadratic entropy. MaxEnt-RDA uses entropy as both objective and constraints; thus the structural information of classes is preserved while information loss is minimized. It is a natural extension of LDA from the Gaussian assumption to any distribution assumption. Like LDA, the optimal solution of MaxEnt-RDA can also be obtained by an eigen-decomposition method, where feature extraction is achieved by designing two Parzen probability matrices that characterize the within-class variation and the between-class variation, respectively. Furthermore, MaxEnt-RDA makes use of high-order statistics (entropy) to estimate the probability matrix so that it is robust to outliers. Experiments on toy problems, UCI datasets, and face datasets demonstrate the effectiveness of the proposed method in comparison with other state-of-the-art methods.
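
    A minimal sketch of the nonparametric (Parzen-window) estimator of Renyi's quadratic entropy on which the MaxEnt criterion rests; the kernel width and the data are illustrative assumptions.

      # Minimal sketch: Parzen-window estimate of Renyi's quadratic entropy,
      # H2(X) = -log( (1/N^2) * sum_ij G(x_i - x_j; sqrt(2)*sigma) ).
      import numpy as np

      def renyi_quadratic_entropy(x, sigma=1.0):
          n, d = x.shape
          diff = x[:, None, :] - x[None, :, :]            # pairwise differences
          sq = np.sum(diff**2, axis=-1)
          s2 = 2.0 * sigma**2                             # variance of the convolved kernel
          g = np.exp(-sq / (2.0 * s2)) / (2.0 * np.pi * s2) ** (d / 2.0)
          information_potential = g.mean()                # (1/N^2) * sum_ij
          return -np.log(information_potential)

      rng = np.random.default_rng(2)
      x = rng.normal(size=(200, 3))
      print(renyi_quadratic_entropy(x))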

  8. LSD-based analysis of high-resolution stellar spectra

    Science.gov (United States)

    Tsymbal, V.; Tkachenko, A.; Van, Reeth T.

    2014-11-01

    We present a generalization of the method of least-squares deconvolution (LSD), a powerful tool for extracting high S/N average line profiles from stellar spectra. The generalization of the method is effected by extending it towards multiprofile LSD and by introducing the possibility to correct the line strengths from the initial mask. We illustrate the new approach with two examples: (a) the detection of asteroseismic signatures in low S/N spectra of single stars, and (b) the disentangling of spectra of multiple stellar objects. The analysis is applied to spectra obtained with 2-m class telescopes in the course of spectroscopic ground-based support for space missions such as CoRoT and Kepler. Usually, rather high S/N is required, so smaller telescopes can only compete successfully with more advanced ones when a technique is applied that enables a remarkable increase in the S/N of the spectra they observe. Since the LSD profile has the potential to reconstruct what is common to all the spectral line profiles, it should be of particular practical use for faint stars observed with 2-m class telescopes and whose spectra show remarkable LPVs.
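
    A minimal sketch of the least-squares deconvolution idea: the observed spectrum is modelled as a line mask convolved with a common profile, and the profile is recovered by linear least squares; the synthetic spectrum, mask and velocity grid are assumptions for illustration only.

      # Minimal sketch of LSD: solve for a common line profile Z such that
      # (line mask convolved with Z) fits the observed spectrum in least squares.
      import numpy as np

      n_pix, n_prof = 1200, 41
      v_grid = np.arange(n_prof) - n_prof // 2            # profile velocity bins
      line_pos = np.array([200, 450, 700, 900])           # pixel positions of mask lines
      line_wts = np.array([0.9, 0.5, 0.7, 0.3])           # mask line strengths

      # Design matrix M: each spectral pixel is a weighted sum of profile bins
      M = np.zeros((n_pix, n_prof))
      for p, w in zip(line_pos, line_wts):
          for j, dv in enumerate(v_grid):
              if 0 <= p + dv < n_pix:
                  M[p + dv, j] += w

      true_profile = np.exp(-0.5 * (v_grid / 6.0) ** 2)   # toy average line profile
      rng = np.random.default_rng(3)
      spectrum = M @ true_profile + 0.05 * rng.normal(size=n_pix)   # noisy spectrum

      # Least squares: Z = argmin ||M Z - spectrum||^2
      lsd_profile, *_ = np.linalg.lstsq(M, spectrum, rcond=None)
      print("rms profile error:", round(float(np.sqrt(np.mean((lsd_profile - true_profile) ** 2))), 3))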

  9. A REGION-BASED MULTI-SCALE APPROACH FOR OBJECT-BASED IMAGE ANALYSIS

    Directory of Open Access Journals (Sweden)

    T. Kavzoglu

    2016-06-01

    Full Text Available Within the last two decades, object-based image analysis (OBIA), which considers objects (i.e. groups of pixels) instead of pixels, has gained popularity and attracted increasing interest. The most important stage of OBIA is image segmentation, which groups spectrally similar adjacent pixels considering not only spectral features but also spatial and textural features. Although there are several parameters (scale, shape, compactness and band weights) to be set by the analyst, the scale parameter stands out as the most important parameter in the segmentation process. Estimating the optimal scale parameter is crucially important for increasing the classification accuracy; it depends on image resolution, image object size and the characteristics of the study area. In this study, two scale-selection strategies were implemented in the image segmentation process using a pan-sharpened QuickBird-2 image. The first strategy estimates optimal scale parameters for eight sub-regions. For this purpose, the local variance/rate of change (LV-RoC) graphs produced by the ESP-2 tool were analysed to determine fine, moderate and coarse scales for each region. In the second strategy, the image was segmented using the three candidate scale values (fine, moderate, coarse) determined from the LV-RoC graph calculated for the whole image. The nearest neighbour classifier was applied in all segmentation experiments, and equal numbers of pixels were randomly selected to calculate accuracy metrics (overall accuracy and kappa coefficient). A comparison of region-based and image-based segmentation was carried out on the classified images, and it was found that region-based multi-scale OBIA produced significantly more accurate results than image-based single-scale OBIA. The difference in classification accuracy reached 10% in terms of overall accuracy.
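
    A minimal sketch of the local-variance/rate-of-change (LV-RoC) scale-selection idea used by the ESP-2 tool: candidate scale parameters are read off local peaks of the RoC curve; the per-scale segment statistics below are made up.

      # Minimal sketch: candidate segmentation scales from an LV-RoC curve
      # (LV = mean object-level standard deviation at each scale, toy values).
      import numpy as np

      scales = np.arange(10, 210, 10)
      rng = np.random.default_rng(4)
      lv = np.cumsum(rng.random(len(scales))) + 5.0        # monotonically rising LV (toy)

      roc = 100.0 * (lv[1:] - lv[:-1]) / lv[:-1]           # rate of change in percent

      # Candidate scales: local peaks of the RoC curve (fine/moderate/coarse picks
      # would be taken from the first few pronounced peaks)
      peaks = [int(scales[i + 1]) for i in range(1, len(roc) - 1)
               if roc[i] > roc[i - 1] and roc[i] > roc[i + 1]]
      print("candidate scale parameters:", peaks[:3])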

  10. Complete chromogen separation and analysis in double immunohistochemical stains using Photoshop-based image analysis.

    Science.gov (United States)

    Lehr, H A; van der Loos, C M; Teeling, P; Gown, A M

    1999-01-01

    Simultaneous detection of two different antigens on paraffin-embedded and frozen tissues can be accomplished by double immunohistochemistry. However, many double chromogen systems suffer from signal overlap, precluding definite signal quantification. To separate and quantitatively analyze the different chromogens, we imported images into a Macintosh computer using a CCD camera attached to a diagnostic microscope and used Photoshop software for the recognition, selection, and separation of colors. We show here that Photoshop-based image analysis allows complete separation of chromogens not only on the basis of their RGB spectral characteristics, but also on the basis of information concerning saturation, hue, and luminosity intrinsic to the digitized images. We demonstrate that Photoshop-based image analysis provides superior results compared to color separation using bandpass filters. Quantification of the individual chromogens is then provided by Photoshop using the Histogram command, which supplies information on the luminosity (corresponding to gray levels of black-and-white images) and on the number of pixels as a measure of spatial distribution. (J Histochem Cytochem 47:119-125, 1999)
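
    As a rough modern analogue of the colour-separation step (not the Photoshop workflow described in the record), the sketch below separates two chromogens by hue and saturation and reports pixel counts and mean luminosity; the image and threshold ranges are illustrative assumptions.

      # Minimal sketch: hue/saturation-based separation of two chromogens and
      # simple histogram-style quantification; thresholds and image are toy values.
      import numpy as np
      from matplotlib.colors import rgb_to_hsv

      rng = np.random.default_rng(5)
      rgb = rng.random((256, 256, 3))                 # stand-in for a digitised field
      hsv = rgb_to_hsv(rgb)
      hue, sat, val = hsv[..., 0], hsv[..., 1], hsv[..., 2]

      # e.g. a "brown" DAB-like hue band vs. a "red" AEC-like hue band (assumed ranges)
      brown = (hue > 0.04) & (hue < 0.12) & (sat > 0.3)
      red = ((hue < 0.03) | (hue > 0.95)) & (sat > 0.3)

      for name, mask in [("chromogen 1", brown), ("chromogen 2", red)]:
          print(name, "pixels:", int(mask.sum()),
                "mean luminosity:", round(float(val[mask].mean()), 3) if mask.any() else "n/a")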

  11. Traditional Mold Analysis Compared to a DNA-based Method of Mold Analysis with Applications in Asthmatics' Homes

    Science.gov (United States)

    Traditional environmental mold analysis is based on microscopic observations and counting of mold structures collected from the air on a sticky surface, or on culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...

  12. Reliability analysis based on the losses from failures.

    Science.gov (United States)

    Todinov, M T

    2006-04-01

    The conventional reliability analysis is based on the premise that increasing the reliability of a system will decrease the losses from failures. On the basis of counterexamples, it is demonstrated that this is valid only if all failures are associated with the same losses. In the case of failures associated with different losses, a system with larger reliability is not necessarily characterized by smaller losses from failures. Consequently, a theoretical framework and models are proposed for a reliability analysis linking reliability and the losses from failures. Equations related to the distributions of the potential losses from failure have been derived. It is argued that the classical risk equation only estimates the average value of the potential losses from failure and does not provide insight into the variability associated with the potential losses. Equations have also been derived for determining the potential and the expected losses from failures for nonrepairable and repairable systems with components arranged in series, with arbitrary life distributions. The equations are also valid for systems/components with multiple mutually exclusive failure modes. The expected loss given failure is a linear combination of the expected losses from failure associated with the separate failure modes, scaled by the conditional probabilities with which the failure modes initiate failure. On this basis, an efficient method for simplifying complex reliability block diagrams has been developed. Branches of components arranged in series whose failures are mutually exclusive can be reduced to single components with equivalent hazard rate, downtime, and expected costs associated with intervention and repair. A model for estimating the expected losses from early-life failures has also been developed. For a specified time interval, the expected losses from early-life failures are a sum of the products of the expected number of failures in the specified time intervals covering the
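
    A minimal numeric sketch of the central relation stated above: the expected loss given failure as a probability-weighted combination over mutually exclusive failure modes, and the expected losses over an interval; all numbers are invented for illustration.

      # Minimal sketch: expected loss given failure over mutually exclusive failure modes.
      failure_modes = [
          # (conditional probability of initiating failure, expected loss if it does)
          (0.50, 10_000.0),
          (0.30, 40_000.0),
          (0.20, 150_000.0),
      ]

      expected_loss_given_failure = sum(p * loss for p, loss in failure_modes)

      # Expected losses over an interval: expected number of failures x loss per failure
      expected_failures = 0.8          # e.g. from the hazard rate over the interval (assumed)
      expected_losses = expected_failures * expected_loss_given_failure
      print(expected_loss_given_failure, expected_losses)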

  13. Computational Auditory Scene Analysis Based Perceptual and Neural Principles

    National Research Council Canada - National Science Library

    Wang, DeLiang

    2004-01-01

    .... This fundamental process of auditory perception is called auditory scene analysis. Of particular importance in auditory scene analysis is the separation of speech from interfering sounds, or speech segregation...

  14. Electrochemical analysis on poly (ethyl methacrylate)-based ...

    Indian Academy of Sciences (India)

    ... analysis and thermogravimetric/differential thermal analysis, respectively. The membrane that contains EC + GBL exhibits a maximum ionic conductivity of the order of 1.208×10⁻³ S cm⁻¹ at 303 K. The temperature-dependent ionic conductivity of the polymer membranes has been estimated using AC impedance analysis.

  15. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    The human motion contains valuable information in many situations and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has...... bring the solution of fully automatic analysis and understanding of human motion closer....

  16. Phylogenetic diversity analysis of Trichoderma species based on ...

    African Journals Online (AJOL)

    The phylogeny of Trichoderma and the phylogenetic relationships of its species were investigated by maximum parsimony analysis and distance analysis of DNA sequences from multiple genetic loci. 18S rDNA sequence analysis suggests that the genus Trichoderma evolved at the same time as Hypomyces and Fusarium ...

  17. Management of chronic pressure ulcers: an evidence-based analysis.

    Science.gov (United States)

    2009-01-01

    In April 2008, the Medical Advisory Secretariat began an evidence-based review of the literature concerning pressure ulcers. Please visit the Medical Advisory Secretariat Web site, http://www.health.gov.on.ca/english/providers/program/mas/tech/tech_mn.html, to review the titles currently available within the Pressure Ulcers series: PRESSURE ULCER PREVENTION: an evidence-based analysis; The cost-effectiveness of prevention strategies for pressure ulcers in long-term care homes in Ontario: projections of the Ontario Pressure Ulcer Model (field evaluation); MANAGEMENT OF CHRONIC PRESSURE ULCERS: an evidence-based analysis. The Medical Advisory Secretariat (MAS) conducted a systematic review on interventions used to treat pressure ulcers in order to answer the following questions: Do currently available interventions for the treatment of pressure ulcers increase the healing rate of pressure ulcers compared with standard care, a placebo, or other similar interventions? Within each category of intervention, which one is most effective in promoting the healing of existing pressure ulcers? A pressure ulcer is a localized injury to the skin and/or underlying tissue, usually over a bony prominence, as a result of pressure, or pressure in conjunction with shear and/or friction. Many areas of the body, especially the sacrum and the heel, are prone to the development of pressure ulcers. People with impaired mobility (e.g., stroke or spinal cord injury patients) are most vulnerable to pressure ulcers. Other factors that predispose people to pressure ulcer formation are poor nutrition, poor sensation, urinary and fecal incontinence, and poor overall physical and mental health. The prevalence of pressure ulcers in Ontario has been estimated to range from a median of 22.1% in community settings to a median of 29.9% in nonacute care facilities. Pressure ulcers have been shown to increase the risk of mortality among geriatric patients by as much as 400%, to increase the frequency

  18. Sensory analysis and postharvest of potted gerbera based on fertilization

    Directory of Open Access Journals (Sweden)

    Francielly Torres Santos

    2017-01-01

    Full Text Available The cultivation of gerberas as cut flowers has been broadly studied. With the purpose of assessing the production and visual quality of potted gerberas grown with different fertilizations, the experiment was performed in a greenhouse located at UNIOESTE – Campus of Cascavel – Parana - Brazil. The experimental design was randomized blocks with four repetitions and five treatments. The treatments were defined based on the fertilization source: mineral (NPK) or organic. The organic fertilizations were obtained by diluting in water four organic composts derived from the composting of agro-industrial wastes. The agro-industrial wastes, used in different percentages in each treatment, were: grain pre-cleaning residues (corn meal and wheat husk); hatchery waste; flotation sludge; cellulose casing (sausage covering); solid fraction of pig slurry (piglet producer unit and truck washing); coal and ash remaining from the boiler; poultry litter; and sugarcane bagasse. The growth parameters were evaluated at the commercialization phase (plant height and diameter, stem height, capitulum number, and capitulum and stem diameter), and sensory analysis was performed through trait and preference assessment tests. The use of liquid organic fertilizers is a feasible alternative to conventional mineral fertilization. According to the results obtained in T5, farmers should observe EC values and the K, Ca, Mg ratio for better quality assurance. T5 (boiler ash + hatchery residue) has a larger amount of boiler ash, in which the chemical elements are already in mineral form and readily available to the plants; this may have contributed to the better visual development of the plants grown in this treatment.

  19. Particulate air pollution and preeclampsia: a source-based analysis.

    Science.gov (United States)

    Dadvand, Payam; Ostro, Bart; Amato, Fulvio; Figueras, Francesc; Minguillón, María-Cruz; Martinez, David; Basagaña, Xavier; Querol, Xavier; Nieuwenhuijsen, Mark

    2014-08-01

    The aim was to investigate the association between preeclampsia and maternal exposure to ambient particulate matter (PM) with aerodynamic diameter less than 10 μm (PM10) and 2.5 μm (PM2.5), both mass and sources. Our analysis was based on a hospital cohort of pregnant women (N=3182) residing in Barcelona, Spain, during 2003-2005. Positive matrix factorisation source apportionment (PMF2) was used to identify sources of PM10 and PM2.5 samples obtained by an urban background monitor, resulting in the detection of eight sources. We further combined traffic-related sources (brake dust, vehicle exhaust and secondary nitrate/organics) to generate an indicator of combined traffic sources. Logistic regression models were developed to estimate the association between preeclampsia and exposure to each PM source and mass separately during the entire pregnancy and trimester one, adjusted for relevant covariates. For exposure during the entire pregnancy, we found a 44% (95% CI 7% to 94%) and an 80% (95% CI 4% to 211%) increase in the risk of preeclampsia associated with one IQR increase in exposure to PM10 brake dust and combined traffic-related sources, respectively. These findings remained consistent after an alternative source apportionment method (Multilinear Engine (ME2)) was used. The results for PM2.5 mass and sources and also for exposure during trimester one were inconclusive. Risk of preeclampsia was associated with exposure to PM10 brake dust and combined traffic-related sources. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
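
    A minimal sketch of the type of model described (logistic regression reporting an odds ratio per IQR increase in exposure); the data are simulated, and the single covariate is only a placeholder for the adjustments used in the study.

      # Minimal sketch: logistic regression with exposure scaled per IQR (toy data).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(6)
      n = 3000
      exposure = rng.gamma(2.0, 2.0, n)                 # stand-in for a PM source contribution
      age = rng.normal(31, 5, n)                        # placeholder covariate
      logit = -4.0 + 0.08 * exposure + 0.01 * (age - 31)
      y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

      iqr = np.subtract(*np.percentile(exposure, [75, 25]))
      X = sm.add_constant(np.column_stack([exposure / iqr, age]))
      fit = sm.Logit(y, X).fit(disp=0)

      or_per_iqr = np.exp(fit.params[1])
      ci = np.exp(fit.conf_int()[1])
      print(f"OR per IQR increase: {or_per_iqr:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")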

  20. The Evidence-Based Practice of Applied Behavior Analysis.

    Science.gov (United States)

    Slocum, Timothy A; Detrich, Ronnie; Wilczynski, Susan M; Spencer, Trina D; Lewis, Teri; Wolfe, Katie

    2014-05-01

    Evidence-based practice (EBP) is a model of professional decision-making in which practitioners integrate the best available evidence with client values/context and clinical expertise in order to provide services for their clients. This framework provides behavior analysts with a structure for pervasive use of the best available evidence in the complex settings in which they work. This structure recognizes the need for clear and explicit understanding of the strength of evidence supporting intervention options, the important contextual factors including client values that contribute to decision making, and the key role of clinical expertise in the conceptualization, intervention, and evaluation of cases. Opening the discussion of EBP in this journal, Smith (The Behavior Analyst, 36, 7-33, 2013) raised several key issues related to EBP and applied behavior analysis (ABA). The purpose of this paper is to respond to Smith's arguments and extend the discussion of the relevant issues. Although we support many of Smith's (The Behavior Analyst, 36, 7-33, 2013) points, we contend that Smith's definition of EBP is significantly narrower than definitions that are used in professions with long histories of EBP and that this narrowness conflicts with the principles that drive applied behavior analytic practice. We offer a definition and framework for EBP that aligns with the foundations of ABA and is consistent with well-established definitions of EBP in medicine, psychology, and other professions. In addition to supporting the systematic use of research evidence in behavior analytic decision making, this definition can promote clear communication about treatment decisions across disciplines and with important outside institutions such as insurance companies and granting agencies.

  1. Population-based meta-analysis of hydrochlorothiazide pharmacokinetics.

    Science.gov (United States)

    Van Wart, Scott A; Shoaf, Susan E; Mallikaarjun, Suresh; Mager, Donald E

    2013-12-01

    Hydrochlorothiazide (HCTZ) is a thiazide diuretic used for the treatment of hypertension and edema associated with fluid overload conditions such as congestive heart failure (CHF). A population-based meta-analysis approach in NONMEM® was used to develop a PK model to characterize the time-course of HCTZ concentrations in plasma and excretion into the urine for healthy subjects and CHF patients. Data from healthy subjects receiving 100 mg of oral HCTZ were supplemented with additional plasma concentration and urinary excretion versus time data published in the literature following administration of oral HCTZ doses ranging from 10 to 500 mg to healthy subjects or patients with renal failure, CHF or hypertension. A two-compartment model with first-order oral absorption, using a Weibull function, and first-order elimination best described HCTZ PK. Creatinine clearance (CLCR) was a statistically significant predictor of renal clearance (CLR). Non-renal clearance was estimated to be 2.44 l/h, CLR was 18.3 l/h, T1/2,α was 1.6 h, and T1/2,β was 14.8 h for a typical individual with normal renal function (CLCR = 120 ml/min). However, CLR was reduced to 10.5, 5.47 and 2.70 l/h in mild (CLCR = 80 ml/min), moderate (CLCR = 50 ml/min) and severe (CLCR = 30 ml/min) renal impairment, respectively. Model diagnostics helped to demonstrate that the population PK model reasonably predicts the rate of urinary HCTZ excretion over time using dosing history and estimated CLCR, allowing for the convenient assessment of PK-PD relationships for HCTZ when given alone or in combination with other agents used to treat fluid overload conditions. Copyright © 2013 John Wiley & Sons, Ltd.
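
    A minimal sketch of a two-compartment disposition model with a Weibull-shaped oral input and first-order elimination, in the spirit of the model described above; every parameter value is an illustrative assumption, not a published estimate.

      # Minimal sketch: two-compartment PK model with Weibull-shaped oral absorption.
      import numpy as np
      from scipy.integrate import solve_ivp

      dose, f_bio = 25_000.0, 0.7            # dose (ug) and bioavailability (assumed)
      td, shape = 1.0, 1.5                   # Weibull scale (h) and shape (assumed)
      V1, V2 = 60.0, 200.0                   # central / peripheral volumes (L, assumed)
      CL, Q = 20.0, 5.0                      # total and intercompartmental clearance (L/h, assumed)

      def weibull_input(t):
          """Absorption rate (ug/h): dose*F times the Weibull density."""
          if t <= 0:
              return 0.0
          return dose * f_bio * (shape / td) * (t / td) ** (shape - 1) * np.exp(-(t / td) ** shape)

      def rhs(t, y):
          a1, a2 = y                          # amounts in central and peripheral compartments
          da1 = weibull_input(t) - (CL / V1) * a1 - (Q / V1) * a1 + (Q / V2) * a2
          da2 = (Q / V1) * a1 - (Q / V2) * a2
          return [da1, da2]

      t_eval = np.linspace(0, 48, 200)
      sol = solve_ivp(rhs, (0, 48), [0.0, 0.0], t_eval=t_eval, max_step=0.1)
      conc = sol.y[0] / V1                    # plasma concentration (ug/L)
      print(f"Cmax ~ {conc.max():.1f} ug/L at t = {t_eval[conc.argmax()]:.1f} h")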

  2. Methodology of comparative statistical analysis of Russian industry based on cluster analysis

    Directory of Open Access Journals (Sweden)

    Sergey S. Shishulin

    2017-01-01

    Full Text Available The article is devoted to exploring the possibilities of applying multidimensional statistical analysis to the study of industrial production, comparing its growth rates and structure with those of other developed and developing countries of the world. The purpose of this article is to determine the optimal set of statistical methods and to present the results of their application to industrial production data. The data include such indicators as output, gross value added, the number of employed, and other indicators of the system of national accounts and operational business statistics. The objects of observation are the industries of the countries of the Customs Union, the United States, Japan, and Europe in 2005-2015. The research tools range from the simplest methods of transformation, graphical and tabular visualization of data, to methods of statistical analysis. In particular, based on a specialized software package (SPSS), the principal components method, discriminant analysis, hierarchical cluster analysis, Ward's method, and k-means were applied. The application of the principal components method to the initial data makes it possible to substantially and effectively reduce the initial space of industrial production data. Thus, for example, in analyzing the structure of industrial production, the reduction was from fifteen industries to three basic, well-interpreted factors: relatively extractive industries (with a low degree of processing), high-tech industries, and consumer goods (medium-technology sectors). At the same time, a comparison of the results of applying cluster analysis to the initial data and to the data obtained with the principal components method established that clustering industrial production data on the basis of the new factors significantly improves the clustering results. As a result of analyzing the parameters of
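
    A minimal sketch of the analysis chain described above, standardising indicators, reducing them with principal components, and clustering the component scores with k-means and Ward's method; the data matrix is a random stand-in for the industry indicators.

      # Minimal sketch: standardise -> PCA -> k-means and Ward clustering (toy data).
      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans, AgglomerativeClustering

      rng = np.random.default_rng(7)
      X = rng.random((30, 15))                     # 30 country-years x 15 industry indicators

      Z = StandardScaler().fit_transform(X)
      pca = PCA(n_components=3)                    # e.g. three interpretable factors
      scores = pca.fit_transform(Z)
      print("explained variance:", pca.explained_variance_ratio_.round(2))

      kmeans_labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
      ward_labels = AgglomerativeClustering(n_clusters=4, linkage="ward").fit_predict(scores)
      print(kmeans_labels[:10], ward_labels[:10])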

  3. An open stylometric system based on multilevel text analysis

    Directory of Open Access Journals (Sweden)

    Maciej Eder

    2017-12-01

    Full Text Available An open stylometric system based on multilevel text analysis Stylometric techniques are usually applied to a limited number of typical tasks, such as authorship attribution, genre analysis, or gender studies. However, they could be applied to several tasks beyond this canonical set, if only stylometric tools were more accessible to users from different areas of the humanities and social sciences. This paper presents a general idea, followed by a fully functional prototype, of an open stylometric system that facilitates its wide use through two aspects: technical and research flexibility. The system relies on a server installation combined with a web-based user interface. This frees the user from the necessity of installing any additional software. At the same time, the system offers a variety of ways in which the input texts can be analysed: they include not only the usual lexical level, but also deep-level linguistic features. This enables a range of possible applications, from typical stylometric tasks to the semantic analysis of text documents. The internal architecture of the system relies on several well-known software packages: a collection of language tools (for text pre-processing), Stylo (for stylometric analysis) and Cluto (for text clustering). The paper presents: (1) The idea behind the system from the user's perspective. (2) The architecture of the system, with a focus on data processing. (3) Features for text description. (4) The use of analytical systems such as Stylo and Cluto. The presentation is illustrated with example applications. An open stylometric system based on multilevel language analysis: Applications of stylometric methods are generally limited to a few typical research problems, such as authorship attribution, the style of literary genres, or studies of the stylistic differences between women and men. They could certainly also be successfully applied to many other problem

  4. Joint multifractal analysis based on the partition function approach: analytical analysis, numerical simulation and empirical application

    Science.gov (United States)

    Xie, Wen-Jie; Jiang, Zhi-Qiang; Gu, Gao-Feng; Xiong, Xiong; Zhou, Wei-Xing

    2015-10-01

    Many complex systems generate multifractal time series which are long-range cross-correlated. Numerous methods have been proposed to characterize the multifractal nature of these long-range cross correlations. However, several important issues about these methods are not well understood and most methods consider only one moment order. We study the joint multifractal analysis based on partition function with two moment orders, which was initially invented to investigate fluid fields, and derive analytically several important properties. We apply the method numerically to binomial measures with multifractal cross correlations and bivariate fractional Brownian motions without multifractal cross correlations. For binomial multifractal measures, the explicit expressions of mass function, singularity strength and multifractal spectrum of the cross correlations are derived, which agree excellently with the numerical results. We also apply the method to stock market indexes and unveil intriguing multifractality in the cross correlations of index volatilities.
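
    A minimal sketch of the joint partition-function approach for two binomial measures and two moment orders (q1, q2); the cascade parameters are illustrative, and the analytical value quoted in the comment applies only to this toy construction.

      # Minimal sketch: joint partition function chi(q1, q2; s) = sum_boxes mu1^q1 * mu2^q2
      # for two binomial measures built on the same dyadic grid.
      import numpy as np

      def binomial_measure(p, levels):
          """Deterministic multiplicative binomial cascade on 2**levels boxes."""
          mu = np.array([1.0])
          for _ in range(levels):
              mu = np.column_stack([mu * p, mu * (1.0 - p)]).ravel()
          return mu

      levels = 12
      mu1 = binomial_measure(0.3, levels)   # two measures on the same grid,
      mu2 = binomial_measure(0.4, levels)   # hence cross-correlated by construction

      def joint_tau(mu1, mu2, q1, q2, levels):
          """Slope of log chi(q1, q2; s) versus log s over dyadic box sizes."""
          n = mu1.size
          sizes, chis = [], []
          for k in range(2, levels - 1):
              b = 2 ** k                                   # box length in points
              m1 = mu1.reshape(n // b, b).sum(axis=1)
              m2 = mu2.reshape(n // b, b).sum(axis=1)
              chis.append(np.sum(m1 ** q1 * m2 ** q2))
              sizes.append(b / n)
          slope, _ = np.polyfit(np.log(sizes), np.log(chis), 1)
          return slope

      # Analytical check for this toy case:
      # tau(q1, q2) = -log2(p1^q1 * p2^q2 + (1-p1)^q1 * (1-p2)^q2)
      print(round(joint_tau(mu1, mu2, 2.0, 2.0, levels), 3))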

  5. ANALYSIS OF THE EFFECTIVENESS AND EFFICIENCY OF MANAGEMENT SYSTEMS BASED ON SYSTEM ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Yurij Vasilkov

    2014-09-01

    Full Text Available In this paper we consider the problem of analyzing the effectiveness and efficiency of management systems, which is especially relevant when implementing the requirements of ISO 9001, ISO 14001, and other standards at an enterprise. Studying a management system on the basis of a systematic approach focuses on revealing its integrative (i.e., systemic) qualities and on identifying the variety of relationships and mechanisms behind these qualities. This makes it possible to identify the causes of the real state of affairs and to explain successes and failures. An important aspect of a systematic approach to analyzing the effectiveness and efficiency of production control management is the multiplicity of "stakeholder" interests involved in the production process in forming operational goals and the ways to achieve them.

  6. Analysis of Emergency Information Management Research Hotspots Based on Bibliometric and Co-occurrence Analysis

    Directory of Open Access Journals (Sweden)

    Zou Qingyun

    2017-04-01

    Full Text Available [Purpose/significance] Emergency information management is an interdisciplinary field of emergency management and information management. Summarizing the major research output is helpful to strengthen the effective utilization of information resources in emergency management research, and to provide references for the follow-up development and practical exploration of emergency information management research. [Method/process] By retrieving concerned literature from CNKI, this paper used the bibliometric and co-word clustering analysis methods to analyze the domestic emergency management research output. [Result/conclusion] Domestic emergency information management research mainly focuses on five hot topics: disaster emergency information management, crisis information disclosure, emergency information management system, emergency response, wisdom emergency management. China should strengthen the emergency management information base for future theoretical research, and build the emergency information management theoretical framework.

  7. Sentence-based sentiment analysis with domain adaptation capability

    OpenAIRE

    Gezici, Gizem

    2013-01-01

    Sentiment analysis aims to automatically estimate the sentiment in a given text as positive, objective or negative, possibly together with the strength of the sentiment. Polarity lexicons that indicate how positive or negative each term is, are often used as the basis of many sentiment analysis approaches. Domain-specific polarity lexicons are expensive and time-consuming to build; hence, researchers often use a general purpose or domainindependent lexicon as the basis of their analysis. In t...

  8. Feasibility analysis of base compaction specification : [project brief].

    Science.gov (United States)

    2013-02-01

    Appropriate design and construction of the aggregate base layer has significant influence on structural stability and performance of pavements. Controlling the construction quality of the granular base layer is important to achieve long-lasting p...

  9. CMS Configuration Editor: GUI based application for user analysis job

    International Nuclear Information System (INIS)

    Cosa, A de

    2011-01-01

    We present the user interface and the software architecture of the Configuration Editor for the CMS experiment. The analysis workflow is organized in a modular way and integrated within the CMS framework, which organizes user analysis code in a flexible way. The Python scripting language is adopted to define the job configuration that drives the analysis workflow. Developing analysis jobs and managing the configuration of the many required modules can be a challenging task for users, especially for newcomers. For this reason, a graphical tool has been conceived to edit and inspect configuration files. A set of common analysis tools defined in the CMS Physics Analysis Toolkit (PAT) can be steered and configured using the Config Editor. A user-defined analysis workflow can be produced starting from a standard configuration file, applying and configuring PAT tools according to specific user requirements. CMS users can adopt this tool, the Config Editor, to create their analysis while visualizing the effects of their actions in real time. They can visualize the structure of their configuration, look at the modules included in the workflow, inspect the dependencies existing among the modules, and check the data flow. They can see the values at which parameters are set and change them according to what their analysis task requires. Integrating common tools into the GUI required adopting an object-oriented structure in the Python definition of the PAT tools and defining a layer of abstraction from which all PAT tools inherit.

  10. Security cost analysis in electricity markets based on voltage security criteria and Web-based implementation

    International Nuclear Information System (INIS)

    Chen, H.

    2003-01-01

    This paper presents an efficient and transparent method for electricity market operators to analyze transaction security costs and to quantify the correlation between market operation and power system operation. Rescheduling and take-risk strategies were proposed and discussed with reference to transaction impact computations, thermal and voltage limits, and voltage stability criteria. The rescheduling method is associated with an iterative generation dispatch or load curtailment approach to minimize the amount of rescheduling. The take-risk method considered operating risks to facilitate transactions. The SATC concept was also proposed to accurately evaluate transmission congestion. The impact of a transaction was calculated using a new sensitivity formula to find the most effective rescheduling direction and the most effective cost distribution. A new pricing method called Nodal Congestion Price was also proposed to determine proper price signals. The paper also presents an Artificial Neural Network (ANN) based short-term load forecasting method that considers the effect of price on the load. A web-based prototype was implemented to allow all market participants access to the proposed analysis and pricing techniques. Several case studies have validated the effectiveness of the proposed method, which would help independent system operators determine congestion prices, coordinate transactions, and make profitable market decisions

  11. Activity-based cost analysis in catheter-based angiography and interventional radiology

    International Nuclear Information System (INIS)

    Rautio, R.; Keski-Nisula, L.; Paakkala, T.

    2003-01-01

    The aim of this study was to analyse the costs of the interventional radiology unit and to identify the cost factors in the different activities of catheter-based angiographies and interventional radiology. In 1999, the number of procedures in the interventional radiology unit at Tampere University Hospital was 2968; 1601 of these were diagnostic angiographies, 526 endovascular and 841 nonvascular interventions. The costs were analysed by using Activity Based Cost (ABC) analysis. The budget of the interventional unit was approximately 1.8 million Euro. Material costs accounted for 67%, personnel costs for 17%, equipment costs for 14% and premises costs for 2% of this. The most expensive products were endografting of aortic aneurysms, with a mean price of 5291 Euro, and embolizations of cerebral aneurysms (4472 Euro). Endografts formed 87.3% of the total costs in endografting, and Guglielmi detachable coils accounted for 63.3% of the total costs in embolizations. The material costs formed the majority of the costs, especially in the newest and most complicated endovascular treatments. Despite the high cost of angiography equipment, its share of the costs is minor. In our experience, the ABC system is suitable for analysing costs in interventional radiology. (orig.)

  12. Ring test assessment of the mKir2.1 growth based assay in Saccharomyces cerevisiae using parametric models and model-free fits

    Czech Academy of Sciences Publication Activity Database

    Hasenbrink, G.; Koláčná, Lucie; Ludwig, J.; Sychrová, Hana; Kschischo, M.; Lichtenberg-Fraté, H.

    2007-01-01

    Roč. 73, č. 5 (2007), s. 1212-1221 ISSN 0175-7598 Grant - others:EC(XE) QLK3-CT2001-00401 Institutional research plan: CEZ:AV0Z50110509 Keywords : mKir2.1 channel * heterologous expression * S. cerevisiae Subject RIV: FR - Pharmacology ; Medicinal Chemistry Impact factor: 2.475, year: 2007

  13. Space vector-based analysis of overmodulation in triangle ...

    Indian Academy of Sciences (India)

    The equivalence of triangle-comparison-based pulse width modulation (TCPWM) and space vector based PWM (SVPWM) during linear modulation is well-known. This paper analyses triangle-comparison based PWM techniques (TCPWM) such as sine-triangle PWM (SPWM) and common-mode voltage injection PWM ...

  14. Base pairing in RNA structures: A computational analysis of ...

    Indian Academy of Sciences (India)


    PDB files using RASMOL software. Hydrogen atoms were added to these base pairs using MOLDEN software. The sugar portions attached to base pairs in RNA structures were removed, and C1′ atoms were respectively replaced by hydrogen atoms during model building. The change in base pair geometry on re-.

  15. Sound transmission analysis of MR fluid based-circular sandwich panels: Experimental and finite element analysis

    Science.gov (United States)

    Hemmatian, Masoud; Sedaghati, Ramin

    2017-11-01

    Magnetorheological (MR) fluids have recently been utilized in sandwich panels to provide variable stiffness and damping to effectively control vibrations. In this study, the sound transmission behavior of MR fluid-based sandwich panels is investigated through the development of an efficient finite element model. A clamped circular sandwich panel with elastic face sheets and an MR fluid core layer has been considered. A finite element model utilizing circular and annular elements has been developed to derive the governing equations of motion in finite element form. The transverse velocity is then calculated and utilized to obtain the sound radiated from the panel and subsequently the sound transmission loss (STL). In order to validate the simulated results, a test setup including two anechoic spaces and an electromagnet has been designed and fabricated. The magnetic flux density generated inside the electromagnet is simulated using magneto-static finite element analysis and validated against the magnetic flux density measured with a Gaussmeter. The results from the magneto-static analysis are used to derive an approximate polynomial function that evaluates the magnetic flux density as a function of the plate's radius and the applied current. The STL and the first axisymmetric natural frequency of MR sandwich panels with aluminum face sheets are simulated and compared with those obtained experimentally. Finally, a parametric study on the effects of the applied magnetic field, the thickness of the core layer, and the thickness of the face sheets on the STL and natural frequency of the adaptive sandwich panel is presented.

  16. Facial Image Analysis Based on Local Binary Patterns: A Survey

    NARCIS (Netherlands)

    Huang, D.; Shan, C.; Ardebilian, M.; Chen, L.

    2011-01-01

    Facial image analysis, including face detection, face recognition, facial expression analysis, facial demographic classification, and so on, is an important and interesting research topic in the computer vision and image processing area, which has many important applications such as human-computer
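
    A minimal numpy sketch of the basic 3x3 LBP operator that the surveyed methods build on; the input "face" image is a random stand-in.

      # Minimal sketch: basic 8-neighbour local binary pattern (LBP) codes and histogram.
      import numpy as np

      def lbp_basic(img):
          """LBP codes for the interior pixels of a 2-D grayscale image."""
          c = img[1:-1, 1:-1]
          # neighbours in a fixed clockwise order, each contributing one bit
          offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                     (1, 1), (1, 0), (1, -1), (0, -1)]
          code = np.zeros_like(c, dtype=np.int32)
          for bit, (dy, dx) in enumerate(offsets):
              neigh = img[1 + dy: img.shape[0] - 1 + dy, 1 + dx: img.shape[1] - 1 + dx]
              code += (neigh >= c).astype(np.int32) << bit
          return code

      rng = np.random.default_rng(9)
      face = rng.integers(0, 256, (64, 64))
      codes = lbp_basic(face)
      hist, _ = np.histogram(codes, bins=256, range=(0, 256))   # LBP histogram feature
      print(codes.shape, hist.sum())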

  17. Fusion of optical flow based motion pattern analysis and silhouette classification for person tracking and detection

    NARCIS (Netherlands)

    Tangelder, J.W.H.; Lebert, E.; Burghouts, G.J.; Zon, K. van; Den Uyl, M.J.

    2014-01-01

    This paper presents a novel approach to detecting persons in video by combining optical flow-based motion analysis and silhouette-based recognition. A new fast optical flow computation method is described, and its application in a motion-based analysis framework unifying human tracking and detection is
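
    As a simplified stand-in for the optical-flow component (not the paper's fast method), the sketch below computes dense Farneback flow between two synthetic frames and a coarse motion mask.

      # Minimal sketch: dense Farneback optical flow and a motion-magnitude mask.
      import cv2
      import numpy as np

      prev = np.zeros((240, 320), np.uint8)
      curr = np.zeros((240, 320), np.uint8)
      cv2.rectangle(prev, (100, 80), (140, 200), 255, -1)     # a "person"-sized blob
      cv2.rectangle(curr, (108, 80), (148, 200), 255, -1)     # shifted 8 px to the right

      flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                          0.5, 3, 15, 3, 5, 1.2, 0)
      magnitude = np.linalg.norm(flow, axis=2)
      moving = magnitude > 1.0                                # coarse motion mask

      print("moving pixels:", int(moving.sum()),
            "mean |flow| in frame:", round(float(magnitude.mean()), 2))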

  18. Machine Learning-Based Sentimental Analysis for Twitter Accounts

    Directory of Open Access Journals (Sweden)

    Ali Hasan

    2018-02-01

    Full Text Available The area of opinion mining and sentiment analysis has grown rapidly; it aims to explore the opinions or text present on different platforms of social media through machine-learning techniques, with sentiment, subjectivity analysis or polarity calculations. Despite the use of various machine-learning techniques and tools for sentiment analysis during elections, there is a dire need for a state-of-the-art approach. To deal with these challenges, the contribution of this paper is the adoption of a hybrid approach that combines a sentiment analyzer with machine learning. Moreover, this paper also provides a comparison of sentiment-analysis techniques in the analysis of political views by applying supervised machine-learning algorithms such as Naïve Bayes and support vector machines (SVM).
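
    A minimal sketch of the supervised comparison mentioned above, TF-IDF features with Naive Bayes versus a linear SVM, on a handful of made-up tweets; it is not the paper's hybrid analyzer.

      # Minimal sketch: Naive Bayes vs. linear SVM sentiment classifiers on toy tweets.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.naive_bayes import MultinomialNB
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import LinearSVC

      tweets = ["great rally today, loved the speech", "worst policy ever, so disappointed",
                "proud to support this candidate", "taxes up again, terrible decision",
                "amazing turnout and positive energy", "broken promises and empty words"]
      labels = ["pos", "neg", "pos", "neg", "pos", "neg"]

      for name, clf in [("Naive Bayes", MultinomialNB()), ("Linear SVM", LinearSVC())]:
          model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
          model.fit(tweets, labels)
          print(name, model.predict(["what a great and positive campaign",
                                     "such a terrible and disappointing result"]))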

  19. Acoustic cardiac signals analysis: a Kalman filter–based approach

    Directory of Open Access Journals (Sweden)

    Salleh SH

    2012-06-01

    Full Text Available Sheik Hussain Salleh,1 Hadrina Sheik Hussain,2 Tan Tian Swee,2 Chee-Ming Ting,2 Alias Mohd Noor,2 Surasak Pipatsart,3 Jalil Ali,4 Preecha P Yupapin3 1Department of Biomedical Instrumentation and Signal Processing, Universiti Teknologi Malaysia, Skudai, Malaysia; 2Centre for Biomedical Engineering Transportation Research Alliance, Universiti Teknologi Malaysia, Johor Bahru, Malaysia; 3Nanoscale Science and Engineering Research Alliance, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand; 4Institute of Advanced Photonics Science, Universiti Teknologi Malaysia, Johor Bahru, Malaysia. Abstract: Auscultation of the heart is accompanied by both electrical activity and sound. Heart auscultation provides clues to diagnose many cardiac abnormalities. Unfortunately, detection of relevant symptoms and diagnosis based on heart sound through a stethoscope is difficult. The reason GPs find this difficult is that the heart sounds are of short duration and separated from one another by less than 30 ms. In addition, the cost of false positives constitutes wasted time and emotional anxiety for both patient and GP. Many heart diseases cause changes in heart sound, waveform, and additional murmurs before other signs and symptoms appear. Heart-sound auscultation is the primary test conducted by GPs. These sounds are generated primarily by turbulent flow of blood in the heart. Analysis of heart sounds requires a quiet environment with minimum ambient noise. In order to address such issues, a technique for denoising and estimating the biomedical heart signal is proposed in this investigation. The performance of the filter naturally depends on prior information about the statistical properties of the signal and the background noise. This paper proposes Kalman filtering for denoising the heart sound signal. The cycles of heart sounds are modeled as a first-order Gauss–Markov process. These cycles are observed with additional noise
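
    A minimal sketch of a scalar Kalman filter for a signal modelled as a first-order Gauss-Markov process observed in additive noise, as in the denoising idea described above; the synthetic signal and all noise parameters are illustrative.

      # Minimal sketch: scalar Kalman filter denoising a first-order Gauss-Markov signal.
      import numpy as np

      rng = np.random.default_rng(10)
      n = 500
      a, q, r = 0.95, 0.01, 0.25          # state transition, process and measurement noise

      x = np.zeros(n)                      # true (clean) signal
      for k in range(1, n):
          x[k] = a * x[k - 1] + rng.normal(0, np.sqrt(q))
      z = x + rng.normal(0, np.sqrt(r), n) # noisy observations

      x_hat, p = 0.0, 1.0
      estimates = np.zeros(n)
      for k in range(n):
          # predict
          x_hat, p = a * x_hat, a * a * p + q
          # update
          gain = p / (p + r)
          x_hat = x_hat + gain * (z[k] - x_hat)
          p = (1.0 - gain) * p
          estimates[k] = x_hat

      print("noisy MSE:", round(float(np.mean((z - x) ** 2)), 3),
            "filtered MSE:", round(float(np.mean((estimates - x) ** 2)), 3))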

  20. Ontology-Based Vaccine Adverse Event Representation and Analysis.

    Science.gov (United States)

    Xie, Jiangan; He, Yongqun

    2017-01-01

    Vaccines are one of the greatest inventions of modern medicine and have contributed enormously to the relief of human misery and the remarkable increase in life expectancy. In 1796, an English country physician, Edward Jenner, discovered that inoculating people with cowpox can protect them from smallpox (Riedel S, Edward Jenner and the history of smallpox and vaccination. Proceedings (Baylor University. Medical Center) 18(1):21, 2005). Thanks to worldwide vaccination, smallpox was finally eradicated in 1977 (Henderson, Vaccine 29:D7-D9, 2011). Other disabling and lethal diseases, like poliomyelitis and measles, are targeted for eradication (Bonanni, Vaccine 17:S120-S125, 1999). Although vaccine development and administration are tremendously successful and cost-effective practices for human health, no vaccine is 100% safe for everyone because each person reacts to vaccinations differently, given different genetic backgrounds and health conditions. Although all licensed vaccines are generally safe for the majority of people, vaccinees may still suffer adverse events (AEs) in reaction to various vaccines, some of which can be serious or even fatal (Haber et al., Drug Saf 32(4):309-323, 2009). Hence, the double-edged sword of vaccination remains a concern. To support integrative AE data collection and analysis, it is critical to adopt an AE normalization strategy. In the past decades, different controlled terminologies, including the Medical Dictionary for Regulatory Activities (MedDRA) (Brown EG, Wood L, Wood S, et al., Drug Saf 20(2):109-117, 1999), the Common Terminology Criteria for Adverse Events (CTCAE) (NCI, The Common Terminology Criteria for Adverse Events (CTCAE). Available from: http://evs.nci.nih.gov/ftp1/CTCAE/About.html . Access on 7 Oct 2015), and the World Health Organization (WHO) Adverse Reactions Terminology (WHO-ART) (WHO, The WHO Adverse Reaction Terminology - WHO-ART. Available from: https://www.umc-products.com/graphics/28010.pdf